
NVIDIA GeForce RTX 3070 Founders Edition

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,964 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Nice Review! But how was the experience with coil whine, if any?
I checked again for you. If I run over 200 FPS and put my ear right next to the card I can hear a little bit of coil whine, but it's minimal, similar to other cards, not worth mentioning.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
I really don't get Nvidia's timing this time around. Why show the 3070 a day before the RDNA2 reveal? AMD will surely have an answer to the 3070 in the RDNA2 lineup tomorrow. All it has to do is price it a bit cheaper and pop more VRAM on top to make the 3070 look silly. What am I missing?
The key question is whether this is a real release or, à la 3080, a ploy to trick AMD into pricing its own cards lower while NV shrugs off "oh, no availability, sorry" until they have an actual answer to Big Navi.
 
Joined
Apr 10, 2020
Messages
504 (0.29/day)
Quite obviously to steal some thunder. AMD can't 'pop more VRAM' on the card a day before launch; AMD's card is what it is on that front. Surely clocks are being tweaked, but yeah.
I mean the 3070 is such an easy target for AMD to beat. Adding 4 GB of extra GDDR6 costs 20 bucks at most, and as we've seen from reputable leaks, RDNA2 can hit nearly 2.5 GHz now. All AMD has to offer is the Xbox Series X's 52 CU GPU (less than 300 mm² die size) clocked at 2.2 GHz (the PS5's gaming clock speed) to marginally beat the 3070 in standard rasterization performance. I'd expect Nvidia to wait for the 6700 XT announcement and then tweak and release the 3070 accordingly. I think Nvidia will lose the xx70 battle this time around if it can't release a competitive 3070 Ti soon after the 6700 XT launches.
 
Last edited:
Joined
Dec 31, 2009
Messages
19,371 (3.54/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I mean the 3070 is such an easy target for AMD to beat. Adding 4 GB of extra GDDR6 costs 20 bucks at most, and as we've seen from reputable leaks, RDNA2 can hit nearly 2.5 GHz now. All AMD has to offer is the Xbox Series X's 52 CU GPU (less than 300 mm² die size) clocked at 2.2 GHz (the PS5's gaming clock speed) to marginally beat the 3070 in standard rasterization performance. I'd expect Nvidia to wait for the 6700 XT announcement and then tweak and release the 3070 accordingly. I think Nvidia will lose the xx70 battle this time around if it can't release a competitive 3070 Ti soon after the 6700 XT launches.
Not the goal I shot at... but ok.
 
Joined
Apr 23, 2015
Messages
82 (0.02/day)
From Guru3d:

Much like the 3080, the GeForce RTX 3070 does exhibit coil squeal. Is it annoying? It's at a level you can hear it. In a closed chassis, that noise would fade away in the background. However, with an open chassis, you can hear coil whine/squeal. Graphics cards all make this in some form, especially at high framerates; this can be perceived.

Feels like a lot of words to say nothing; plus it's false that coil whine fades away into background noise, which has lower frequencies. "It's more or less what you'd expect from a high-end card"? Reading between the lines, as a reviewer myself, I think it's noticeable, and that's not good.

I checked again for you. If I run over 200 FPS and put my ear right next to the card I can hear a little bit of coil whine, but it's minimal, similar to other cards, not worth mentioning.
Guru3d's mention made it sound like a worse problem. Thanks for your second check, I'll trust you.
 
Joined
Mar 18, 2008
Messages
5,717 (0.93/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Thank you for the detailed review, nice GPU. The 3070 is truly a beast.
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Artem S. Tashkinov
I really don't get Nvidia's timing this time around. Why show the 3070 a day before the RDNA2 reveal? AMD will surely have an answer to the 3070 in the RDNA2 lineup tomorrow. All it has to do is price it a bit cheaper and pop more VRAM on top to make the 3070 look silly. What am I missing?

1. RTRT performance. AMD will be a lot slower.
2. DLSS (read: free performance, a tier higher than your card is actually capable of). AMD is just missing it.
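
(For a sense of where that performance comes from: DLSS renders internally at a lower resolution and upscales. Assuming the usual Quality-mode factor of about 0.67 per axis, a 4K output is shaded at roughly 2560x1440, i.e. 3840*2160 / (2560*1440) ≈ 2.25x fewer pixels per frame.)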
 
Joined
Dec 6, 2016
Messages
748 (0.25/day)
@W1zzard : why don't you mention the very high multi-monitor power draw in the cons of your Ampere reviews? Compared to Turing or Pascal it's significantly worse. AMD gets constantly bashed for this metric, but nVidia gets a free pass?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,964 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
@W1zzard : why don't you mention the very high multi-monitor power draw in the cons of your Ampere reviews? Compared to Turing or Pascal it's significantly worse. AMD gets constantly bashed for this metric, but nVidia gets a free pass?
It's half of AMD's?

Guru3d's mention made it sound like a worse problem. Thanks for your second check, I'll trust you.
Could be variance between samples
 
Joined
Mar 2, 2019
Messages
181 (0.09/day)
System Name My PC
Processor AMD 2700 x @ 4.1Ghz
Motherboard Gigabyte X470 Aorus Gaming
Cooling Zalman CNPS20X
Memory Corsair Vengeance LPX Black 32GB DDR4
Video Card(s) Sapphire Radeon RX 570 PULSE
Storage Adata Ultimate SU800
Case Phanteks Eclipse P500A
Audio Device(s) Logitech G51
Power Supply Seasonic Focus GX, 80+ Gold, 550W
Keyboard Roccat Vulcan 121
Since Nvidia stopped shipping the 3080 and 3090 Founders Edition, the 3070 FE will probably go only to IT influencers.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
2,039 (0.35/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case Cooler Master QUBE 500 Flatpack Macaron
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Keychron K2 HE Wireless / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Meta Quest 3 512GB
Software Windows 11 Pro 64-bit 24H2 Build 26100.2605
@W1zzard : why don't you mention the very high multi-monitor power draw in the cons of your Ampere reviews? Compared to Turing or Pascal it's significantly worse. AMD gets constantly bashed for this metric, but nVidia gets a free pass?

At least with the NVIDIA cards it's not running the VRAM at max clocks. For some reason, running multi-monitor (and even some single monitors at 144 Hz) causes the VRAM to stay maxed on the AMD cards.
 
Joined
Apr 10, 2020
Messages
504 (0.29/day)
1. RTRT performance. AMD will be a lot slower.
2. DLSS (read: free performance, a tier higher than your card is actually capable of). AMD is just missing it.
1. RT really doesn't matter for now, and the PS5/Xbox Series X will have RDNA chips in them, so expect AMD's implementation of ray tracing to rule the gaming market, with the only exception being Nvidia-sponsored titles.
2. Yeah, AI upscaling could become a valid advantage for Ampere IF DLSS can be implemented at the GPU driver level rather than in game code, as not many devs will walk the extra mile to add it without financial compensation, I'm afraid :(
 
Joined
Apr 9, 2020
Messages
309 (0.18/day)
The Ampere SM has 128 CUDA cores; however, only 64 of them can execute FP32 operations concurrently with 64 integer operations in a clock cycle. That means in the best case the Ampere SM has twice the shading power of a Turing SM (128 FP32 per cycle), and in the worst case it's about as fast (64 FP32).

In other words, the SM has some pretty bad constraints making it less efficient per cycle in terms of FP32.
This is exactly what I was looking for, thank you. So indeed, it seems most games will never benefit from the new SM array, as games seldom issue pure FP32 work without integer operations, correct?
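
(To put rough numbers on that best/worst case, a back-of-the-envelope sketch using the 3070's published specs of 46 SMs x 128 FP32 lanes = 5888 CUDA cores at a ~1.73 GHz boost clock: best case, pure FP32, is 5888 x 2 FLOPs per FMA x 1.73 GHz ≈ 20.4 TFLOPS; worst case, 64 FP32 + 64 INT32 per SM, is 2944 x 2 x 1.73 GHz ≈ 10.2 TFLOPS, essentially Turing-level throughput per clock. Real shaders land somewhere between those two bounds.)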
 
Joined
Jan 8, 2017
Messages
9,504 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
This is exactly what I was looking for, thank you. So indeed, it seems most games will never benefit from the new SM array, as games seldom issue pure FP32 work without integer operations, correct?

Graphics workloads are predominantly a mix of FP32 and INT32. When integer instructions are issued, the SM can no longer process 128 FP32 operations in that clock cycle. That happens pretty often, since you constantly need to calculate addresses, which requires INT32, so I suspect the SM hardly ever achieves its maximum FP32 throughput.
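
A minimal CUDA sketch of that instruction mix (a hypothetical kernel written just to make the point concrete, not anything from the review): even a trivial gather-and-scale shader has to issue INT32 index arithmetic before any FP32 math can execute.

```cuda
// Hypothetical kernel illustrating the FP32/INT32 mix described above.
// The index arithmetic is INT32 work that competes with the FP32 math
// for the shared half of the Ampere SM's datapath.
__global__ void gather_scale(const float *src, float *dst,
                             const int *indices, int n, float scale)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x; // INT32: thread index math
    if (i >= n)                                    // INT32: bounds compare
        return;
    int j = indices[i];                            // INT32: per-element address
    dst[i] = src[j] * scale + 1.0f;                // FP32: fused multiply-add
}
```

Every warp here issues several integer instructions per FP32 FMA, so on Ampere those integer issues occupy cycles in which the second 64-lane FP32 path could otherwise have retired float work.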
 
Joined
Nov 4, 2019
Messages
234 (0.12/day)
W1zzard has mentioned this and this has been dispelled a thousand times already:

No, he has not. Next-gen games haven't even been released yet, and we are on the edge. He said it was good enough for now, which isn't what we should be looking at; what matters is the next year, as PS5-level games get ported to PC. There is also a difference between "enough" and providing the hardware so that game makers can offer better texture packs, and nVidia is holding PCs back here. I was able to use more than 4 GB the day the 1070 released, because game makers put out new options and new texture packs.

4 years from the 770 to the 1070 got us 4x more VRAM. 4 years from the 1070 to the 3070 got us nothing at all. That has never happened before, and I think we shouldn't be so glib about it.
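
For concreteness, using the standard memory configurations of those cards: GTX 770 (2 GB) to GTX 1070 (8 GB) is a 4x jump in four years, while GTX 1070 (8 GB) to RTX 3070 (8 GB) is 1x, zero growth over the same span.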
 
Last edited:
Joined
Dec 6, 2016
Messages
748 (0.25/day)
Joined
Oct 18, 2019
Messages
415 (0.22/day)
Location
NYC, NY
There is only ONE benchmark I'm interested in:

MICROSOFT FLIGHT SIMULATOR 2020 at 4K (since no one bothers benchmarking DCS World).

The 3070 is just below the 2080 Ti, at 30 FPS (±5 FPS).

For $500 vs. $1,200, that's pretty good.

But I have the 3090 and I can't even see 60 FPS in that game.
 
Joined
Dec 31, 2009
Messages
19,371 (3.54/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
No, he has not. Next-gen games haven't even been released yet, and we are on the edge. He said it was good enough for now, which isn't what we should be looking at; what matters is the next year, as PS5-level games get ported to PC. There is also a difference between "enough" and providing the hardware so that game makers can offer better texture packs, and nVidia is holding PCs back here. I was able to use more than 4 GB the day the 1070 released, because game makers put out new options and new texture packs.

4 years from the 770 to the 1070 got us 4x more VRAM. 4 years from the 1070 to the 3070 got us nothing at all. That has never happened before, and I think we shouldn't be so glib about it.
It will be OK to hold your breath... seriously.

People need to get over this RAM thing already... so much crying over spilled milk.
 
Joined
Jan 12, 2012
Messages
92 (0.02/day)
A wonderful review, a great card, only I cannot quite accept the fact that it's still priced quite high:

GTX 770 - $399
GTX 970 - $329
GTX 1070 - $449

Lastly, it's the most power-efficient Ampere GPU, but I expected much higher overall efficiency from this generation; it's only a few percent more efficient than the GTX 1660 Ti, which is built on a much less advanced node. I expect availability will totally suck for at least the next three months - Ampere cards are so good that several generations of AMD/NVIDIA owners are in line to get them. The RTX 3080 is still nowhere to be seen: https://www.nowinstock.net/computers/videocards/nvidia/rtx3080/

As a consumer, I also want cheaper products.

But keep in mind that there is not only inflation but also many other factors in the world that influence these prices (like corona). The 770 was released in 2013, seven years ago. How much more performance do we get out of a 3070 than we did from a 770? No other market can keep up with such major iterations. CPUs, cars, planes, internet speeds... nothing keeps up with the major gains we get every year from GPUs. So at least we can be happy about that :)
 
Joined
Apr 18, 2013
Messages
1,260 (0.30/day)
Artem S. Tashkinov
There is only ONE benchmark I'm interested in:

MICROSOFT FLIGHT SIMULATOR 2020 at 4K (since no one bothers benchmarking DCS World).

The 3070 is just below the 2080 Ti, at 30 FPS (±5 FPS).

For $500 vs. $1,200, that's pretty good.

But I have the 3090 and I can't even see 60 FPS in that game.

The game is currently extremely poorly optimized and not yet ported to DirectX 12 (which is doubly strange since it's published by Xbox Game Studios) - this is why Wizz refuses to review it.
 