
NVIDIA GeForce RTX 3070 Founders Edition

Joined
Apr 6, 2011
Messages
703 (0.14/day)
Location
Pensacola, FL, USA, Earth
My current GTX 1070 card is worried it will be getting replaced with a 3070 card by Jan/Feb, if stock holds up then. :p
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.96/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz @ 0.750V (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
At least the power consumption is reasonable on this card; the 3070 isn't a space heater

Seeing how low the power consumption of my 1080 is compared to these new cards is a shock; they've really regressed on that.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,006 (2.51/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
At least the power consumption is reasonable on this card; the 3070 isn't a space heater

Seeing how low the power consumption of my 1080 is compared to these new cards is a shock; they've really regressed on that.

Thank you Samsung...
 

ixi

Joined
Aug 19, 2014
Messages
1,451 (0.39/day)
Another fraudulent release:
Not available on the market.
If it were available, it would be at a much higher price than the announced one.
If Nvidia does what it did with the 3080, it should be sued (unfortunately only people in the US can do that).


Except the actual price will be much, much higher than the announced $500.
In my country the cheapest is 620, then 650, and then it goes over 720 euros. Nice, right? Not to mention that it still isn't even released...
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Last time, they dropped the price by $50 the day before release... which can only be seen as raising the performance white flag. You don't lower prices when you have a better product, you raise them

AMD announced pricing on the 5700 series.
NV countered with Super (bigger and more expensive to produce) versions of its cards.
AMD adjusted pricing on its GPUs, with their "whopping" 250mm² chips.
The "white flag" is fanboi imagination.

Thank you Samsung...
Thank Jensen for OC'ing the card to the max.
 
Joined
Sep 17, 2014
Messages
22,358 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Holding your breath and crying? None of that is happening here. Grow up and talk like an adult.

Of course games will work with 8GB, that isn't the point. I like how you just missed that from the 770 to the 1070 we got 4x more VRAM, yet from the 1070 to the 3070 we didn't get any more at all. And I'm the unreasonable one? We've never gone four years without more VRAM, and this during the launch year of new consoles that will drive up VRAM requirements very soon. More importantly: build it and they will come. Give all your cards more VRAM and game makers will release content for them. The month the 1070 launched, I suddenly had games with new video options and texture packs that used more than 4GB, and the same people were "crying," like yourself, that we didn't need 8GB then; yet voila, we got better settings than the console versions and used the VRAM up quickly. Doom Eternal already needs more than 8GB for top settings, and more games soon will.

NO man, you have to understand: you buy 4K-capable cards to game at 1440p with High settings. And that's fine for 700-800 dollars' worth of GPU nowadays. Can't be nitpicking too much about 10GB, right? Look at the god damn EF PEE ES for the money! This card is fine.

And texture packs?! Consoles don't get those either, stop whining.

Practical experience with higher VRAM usage? No, you're lying ;) It's not commonplace at all. I don't see it myself every day either. Allocated is not 'in use'! It's fine! Stutter doesn't exist!

/s
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
This review, like 95%+ of reviews, was done on a card the manufacturer sent in for testing. Has anyone ever tested how close to "typical" these review samples are?

In the CPU world we have the "silicon lottery"; don't we have the same in the GPU world?

Imagine you are manufacturing graphics cards. Wouldn't you take a bunch of them, benchmark them, and send the best ones in for review?


The only case I can imagine where this wouldn't be a problem is if the fluctuations are really small, like 1-3%.
 
Joined
Apr 12, 2013
Messages
7,483 (1.77/day)
Thank you Samsung
We don't really know if Samsung is at fault, or if Nvidia pushed the cards way past reasonable limits (not unlike AMD), or if it's the Ampere uarch itself. The only way we'd know for sure is if Nvidia launched the exact same cards on TSMC 7nm or some other "better" node.
 
Joined
Apr 30, 2011
Messages
2,700 (0.55/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
We don't really know if Samsung is at fault, or if Nvidia pushed the cards way past reasonable limits (not unlike AMD), or if it's the Ampere uarch itself. The only way we'd know for sure is if Nvidia launched the exact same cards on TSMC 7nm or some other "better" node.
Samsung's 8nm node isn't made for big dies, so when they're pushed to higher clocks than their optimum they draw too much power. And this time nVidia had to push for high clocks to keep the GPU crown (today we should learn for sure, methinks). It is also obvious that the Ampere arch is compute-focused and doesn't like high clocks. So Ampere on Samsung was a bad combo. Pascal might have worked well on this node, but Ampere and Turing, with their big dies and tensor cores, not so much.
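To put rough numbers on the clocks-vs-power point: dynamic power scales roughly as P ~ C·V²·f, and voltage has to rise with frequency, so power climbs near-cubically once you push past a node's sweet spot. A minimal sketch; the V-f curve below is made up for illustration, not measured Ampere data:

```python
# Illustrative only: why clocks past a node's sweet spot cost disproportionate
# power. Dynamic power scales roughly as P ~ C * V^2 * f, and V must rise with
# f, so power grows much faster than frequency near the top of the curve.
# The V-f numbers below are assumptions, not measured Ampere data.

def rel_power(f_ghz, f_base=1.7, v_base=0.75, v_slope=0.25):
    """Power relative to a hypothetical 1.7 GHz / 0.75 V sweet spot."""
    v = v_base + v_slope * (f_ghz - f_base)  # assumed linear V-f curve
    return (v / v_base) ** 2 * (f_ghz / f_base)

for f in (1.5, 1.7, 1.9, 2.1):
    print(f"{f:.1f} GHz -> {rel_power(f):.2f}x power")
```

With those assumed numbers, a ~24% clock bump costs roughly 60% more power, which is the shape of the problem regardless of the exact node.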
 
Joined
Jan 8, 2017
Messages
9,407 (3.29/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Seeing how low the power consumption of my 1080 is compared to these new cards is a shock; they've really regressed on that.

So it is a space heater then.
 
Joined
Oct 10, 2018
Messages
943 (0.42/day)
You can undervolt a 3080 to 200W levels and lose only about 5% performance. So, as stated before, these cards are great up to a certain point and then lose efficiency.
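Rough math on that claim; the 320W baseline is the 3080 FE's rated board power, while the 200W / -5% figures are the claim above taken at face value:

```python
# Rough efficiency math for the undervolt claim. 320 W is the 3080 FE's rated
# board power; the 200 W / 95% figures are the post's claim, not a measurement.
stock_w, stock_perf = 320, 1.00
uv_w, uv_perf = 200, 0.95

print(f"Power saved: {1 - uv_w / stock_w:.0%}")  # ~38% less power
print(f"Perf per watt: {(uv_perf / uv_w) / (stock_perf / stock_w):.2f}x stock")  # ~1.52x
```

If the claim holds, that is roughly 1.5x the performance per watt, which is why the "great up to a certain point" framing fits.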
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
You can undervolt a 3080 to 200W levels and lose only about 5% performance. So, as stated before, these cards are great up to a certain point and then lose efficiency.
If you get a good sample you might get there, but ~35% less power with that little performance loss sounds rare. Still, GPU undervolting is extremely valuable with these power-hungry cards.

This review, like 95%+ of reviews, was done on a card the manufacturer sent in for testing. Has anyone ever tested how close to "typical" these review samples are?

In the CPU world we have the "silicon lottery"; don't we have the same in the GPU world?

Imagine you are manufacturing graphics cards. Wouldn't you take a bunch of them, benchmark them, and send the best ones in for review?


The only case I can imagine where this wouldn't be a problem is if the fluctuations are really small, like 1-3%.
Some reviewers have done things like that, and some reviewers purposely buy products off the shelf for review to control for this - for example, GamersNexus often buys GPUs for testing rather than getting them from the OEM. That obviously isn't possible for pre-release reviews, but in general the variance is essentially zero. Given the relatively loose voltages and aggressive boosting algorithms on GPUs, you won't see much of a difference unless you're undervolting - and no OEM would be dumb enough to send out a pre-undervolted review sample, even if it would technically be possible.
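For what it's worth, you can put a rough bound on how much cherry-picking could even matter. A quick Monte Carlo sketch, assuming a hypothetical ~1% standard deviation between cards (the numbers are illustrative, not measured):

```python
# Quick Monte Carlo: if card-to-card performance varies with a ~1% standard
# deviation (assumed), how much does sending the best of N samples buy?
import random

def best_of(n, sigma=0.01, trials=10_000):
    """Average advantage of the best card out of n over the true mean."""
    total = sum(max(random.gauss(1.0, sigma) for _ in range(n))
                for _ in range(trials))
    return total / trials - 1.0

for n in (1, 5, 20):
    print(f"best of {n:2d}: {best_of(n):+.1%} vs. a typical card")
```

Even picking the best card out of 20 only buys about two percentage points with that spread, so with GPU variance that small the specific sample barely moves a review's numbers.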
 
Joined
May 13, 2016
Messages
88 (0.03/day)
Well, I'm thoroughly surprised by the performance. I expected it to only deliver "2080 Ti performance" with RTX on and DLSS 2.0 in a few select games. But damn, this is not bad; a very intriguing GPU.
It's a shame it's not going to be available for a while. I haven't seen a single 3080 in any Hungarian store since release, and I have a feeling the 3070 is going to be even worse, as I expect the demand to be higher.
Maybe in 3-4 months the 30-series will actually be "released" and available.
 
Joined
May 29, 2017
Messages
342 (0.13/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733Mhz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case XPG Starker AIR
Audio Device(s) Airpulse SM200 + Volt 1
Power Supply Cougar GEX 750
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 10 Pro
Well, I'm thoroughly surprised by the performance. I expected it to only deliver "2080 Ti performance" with RTX on and DLSS 2.0 in a few select games. But damn, this is not bad; a very intriguing GPU.
It's a shame it's not going to be available for a while. I haven't seen a single 3080 in any Hungarian store since release, and I have a feeling the 3070 is going to be even worse, as I expect the demand to be higher.
Maybe in 3-4 months the 30-series will actually be "released" and available.
The RTX 3070 is good overall, but the GTX 970 and GTX 1070 were better when they launched. The hype increases every year, and for some reason people cannot think clearly.

That's probably because of Nvidia's excellent marketing.
 
Joined
Apr 9, 2020
Messages
308 (0.18/day)
Thank you Samsung...
Why Samsung? In this day and age of chip design, you design a chip around the intended node and its limitations, not the other way around.
These chips are pushed to the brink of stability with ultra-high 2GHz boost clocks.
 

Tebbs.

New Member
Joined
Oct 28, 2020
Messages
2 (0.00/day)
Wondering why Fortnite is not used in testing cards?
It's one of the most-played games, and I for one would base my purchase on its results.
 
Joined
Dec 31, 2009
Messages
19,371 (3.57/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
Wondering why Fortnite is not used in testing cards?
It's one of the most-played games, and I for one would base my purchase on its results.
Same reason most MP games aren't used... because consistent testing is nearly impossible in MP games. Besides, Fortnite runs on a potato in the first place.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,742 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Same reason most MP games aren't used... because consistent testing is nearly impossible in MP games. Besides, Fortnite runs on a potato in the first place.
That, and patches can occur at random times, so I can't guarantee a consistent game version between test runs
 
Joined
Mar 24, 2012
Messages
533 (0.12/day)
Samsung's 8nm node isn't made for big dies, so when they're pushed to higher clocks than their optimum they draw too much power. And this time nVidia had to push for high clocks to keep the GPU crown (today we should learn for sure, methinks). It is also obvious that the Ampere arch is compute-focused and doesn't like high clocks. So Ampere on Samsung was a bad combo. Pascal might have worked well on this node, but Ampere and Turing, with their big dies and tensor cores, not so much.

I don't think Nvidia pushed for high clocks with their Ampere design; if they had, we would probably already be seeing something beyond 2.2GHz. Starting with Turing, Nvidia has been looking for ways other than raising clock speeds to increase performance. Also, while Samsung's node isn't made for big dies, one way or another Nvidia had to "force" Samsung to make their big GPUs. How else is Samsung going to get the experience if they never make one?
 
Joined
May 31, 2016
Messages
4,437 (1.44/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500GB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Not bad, but look at the difference in performance between the 3080 and the 3070. I can bet there will be a 3070 Ti, and that would be a very good card if the price is OK.
If I were going for an NV card, I'd wait for the 3070 Ti.
 
Joined
Jan 31, 2011
Messages
2,210 (0.44/day)
System Name Ultima
Processor AMD Ryzen 7 5800X
Motherboard MSI Mag B550M Mortar
Cooling Arctic Liquid Freezer II 240 rev4 w/ Ryzen offset mount
Memory G.SKill Ripjaws V 2x16GB DDR4 3600
Video Card(s) Palit GeForce RTX 4070 12GB Dual
Storage WD Black SN850X 2TB Gen4, Samsung 970 Evo Plus 500GB , 1TB Crucial MX500 SSD sata,
Display(s) ASUS TUF VG249Q3A 24" 1080p 165-180Hz VRR
Case DarkFlash DLM21 Mesh
Audio Device(s) Onboard Realtek ALC1200 Audio/Nvidia HD Audio
Power Supply Corsair RM650
Mouse Rog Strix Impact 3 Wireless | Wacom Intuos CTH-480
Keyboard A4Tech B314 Keyboard
Software Windows 10 Pro
Not bad, but look at the difference in performance between the 3080 and the 3070. I can bet there will be a 3070 Ti, and that would be a very good card if the price is OK.
If I were going for an NV card, I'd wait for the 3070 Ti.
I can see it consuming almost the same power as the RTX 3080 even with some shader cores disabled, since it would use the same die
 
Joined
May 2, 2017
Messages
7,762 (2.82/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Not bad, but look at the difference in performance between the 3080 and the 3070. I can bet there will be a 3070 Ti, and that would be a very good card if the price is OK.
If I were going for an NV card, I'd wait for the 3070 Ti.
$200 price gap, 100W power consumption gap, 23% performance gap at 1440p? Yeah, that does open the door for a $600 3070 Ti, as long as there are enough sufficiently defective GA102 dice to make that viable as a product. Though it would also make for a rather crowded product stack, with AIB 3070s easily exceeding the price of a base 3070 Ti, making developing such cards a lot less attractive for partners. Going by recent history we might instead see a 3070 Super in a while fulfilling the same role while replacing the 3070 outright or pushing its price down a bit.
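The value math behind that, using the $499/$699 MSRPs and the review's ~23% 1440p gap; the $600, ~12%-faster 3070 Ti is purely hypothetical:

```python
# Value math: MSRPs plus the review's ~23% 1440p gap; the $600, ~12%-faster
# 3070 Ti is a hypothetical product, not an announced one.
cards = {
    "RTX 3070":           (499, 1.00),
    "RTX 3070 Ti (hypo)": (600, 1.12),  # assumed roughly mid-point performance
    "RTX 3080":           (699, 1.23),  # ~23% faster at 1440p per the review
}
for name, (price, perf) in cards.items():
    print(f"{name:<20} ${price}: {perf / price * 1000:.2f} perf per $1000")
```

A $600 part would land between the two on performance per dollar, so it only makes sense if enough cut-down GA102 silicon exists to build it in volume.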
 
Joined
May 31, 2016
Messages
4,437 (1.44/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500GB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
I can see it consuming almost the same power as the RTX 3080 even with some shader cores disabled, since it would use the same die
Well, I was focusing more on the performance difference between the two than on the power consumption.
If you compare the 3080 and 3090, the difference is not that significant, and yet there are rumors about a 3080 Ti being slotted between the two. It would have made more sense to release a 3070 Ti to cover that performance range.

we might instead see a 3070 Super in a while fulfilling the same role while replacing the 3070 outright or pushing its price down a bit.
Actually, I don't care about the name of the card, Ti or Super or whatever. The point is that the performance gap between the two is huge. It would have made more sense to make a 3070 Ti/Super than a 3080 Ti sitting in the 12% (or 15%, whatever it is) performance gap between the 3080 and 3090.
I don't think replacement is a great option here; the gap is huge. I'm also not convinced a replacement would attract more buyers, rather the contrary. People want the cards now, not once they've already bought whatever 3000-series card they could get.
 
Joined
Sep 8, 2020
Messages
214 (0.14/day)
System Name Home
Processor 5950x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 6700 10gb
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply seasonic prime 1000W
Mouse Razer Viper
Keyboard Logitech
Software Windows 10
In my country you can't find one, and even if you get lucky at some point, it costs around $840.
If AMD doesn't put out something good, we are looking at a future where a decent GPU will cost a serious sum of money.
 