
NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I know Nvidia recommends a 750W PSU for the 3080 but I'm hoping my 650W Gold rated PSU will suffice, reviews/time will tell.
Depends on the rest of your hardware and how noisy your PSU is, but 650W with a 320W GPU should be perfectly fine unless you're running very high-end or overclocked hardware. If Ampere is power limited to TDP like Turing, you have 320W for the GPU, plus:

- 20-30W for the motherboard and RAM
- ~5W per SSD
- 15W per 3.5" HDD
- ~5W per couple of fans
- 10W per AIO pump
- however much power your CPU needs: 88W for a 65W AMD chip, 144W for a 95W AMD chip, or 150-250W for a 10th-gen Intel depending on the SKU

That's at stock clocks within the boost window (which might be infinite depending on your motherboard). I would add a 20% margin on top of that for safety, and at least another 20% if you're overclocking - likely more. Of course it's highly unlikely for all components in the system to draw maximum power at once, and CPUs pretty much never run at 100% while gaming, so there's some extra margin built in there too. 650W would as such be rather slim for a system with something like a 10700K or 10900K (my formula ends up at 675W minimum assuming a couple of SSDs and a few fans), but should work fine with a less power-hungry CPU, or if you undervolt and/or tune the power limits of one of those power hogs.
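For anyone who wants to run the numbers themselves, the tally above can be sketched as a quick script. All wattages are the same ballpark estimates from this post (not measured values), and the helper name and its defaults are my own:

```python
# Rough PSU sizing sketch based on the per-component estimates above.
# All figures are ballpark values, not measurements.
def estimate_psu_watts(gpu_tdp, cpu_peak, ssds=2, hdds=0, fans=4,
                       aio_pumps=0, overclocked=False):
    base = (gpu_tdp
            + cpu_peak
            + 25              # motherboard + RAM (20-30W)
            + 5 * ssds        # ~5W per SSD
            + 15 * hdds       # 15W per 3.5" HDD
            + 2.5 * fans      # ~5W per couple of fans
            + 10 * aio_pumps) # 10W per AIO pump
    # +20% safety margin, and at least another 20% on top when overclocking
    margin = 1.4 if overclocked else 1.2
    return base * margin

# Example: RTX 3080 (320W) with a CPU boosting to ~200W,
# two SSDs and a few fans
print(round(estimate_psu_watts(320, 200)))  # ≈ 678, in line with the ~675W minimum above
```
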
 
Joined
Jul 18, 2017
Messages
575 (0.21/day)
Their test system was with an i9 10900k. Nvidia seems to be confident that PCIE 3.0 is not a bottleneck. Jensen, what happened? No love for your niece?
 
Joined
Nov 11, 2016
Messages
3,412 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Their test system was with an i9 10900k. Nvidia seems to be confident that PCIE 3.0 is not a bottleneck. Jensen, what happened? No love for your niece?

At 4K there really is no difference between PCIe 3.0 and 4.0.
And Nvidia tested them with a 9900K - it still doesn't make any difference versus a 10900K.
 
Joined
Jan 11, 2005
Messages
1,491 (0.21/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
Surprisingly good pricing from them; I'd compare this to a "preemptive strike" in anticipation of AMD's release...

And the best part: we end users - not fanboys - will get good prices from both of them.
 
Joined
Apr 9, 2018
Messages
781 (0.32/day)
The RTX 3070 is going to sell like hot tamales.
If Nvidia can satisfy demand at that $499 price point, then I'll take back every time I've ever complained about their Sheriff of Nottingham business strategy.
 
Joined
Mar 28, 2020
Messages
1,755 (1.03/day)
The power usage is crazy. Both the 90 and the 80 are above 300W, and even the 70 is approaching territory that used to be reserved for the 80 Ti cards. But then, if the 3070 is actually faster than the 2080 Ti by a good margin, that means power efficiency has improved with Ampere. BTW, where is that 12-pin PCIe aux power connector?

I agree the power requirements have gone through the roof. Waiting for official reviews to see how much performance improvement we are getting with this generation. Also I am not convinced the CUDA core count is correct, i.e. the actual physical number of cores may be half of what is advertised.
 
Joined
Feb 21, 2014
Messages
1,390 (0.35/day)
Location
Alabama, USA
Processor 5900x
Motherboard MSI MEG UNIFY
Cooling Arctic Liquid Freezer 2 360mm
Memory 4x8GB 3600c16 Ballistix
Video Card(s) EVGA 3080 FTW3 Ultra
Storage 1TB SX8200 Pro, 2TB SanDisk Ultra 3D, 6TB WD Red Pro
Display(s) Acer XV272U
Case Fractal Design Meshify 2
Power Supply Corsair RM850x
Mouse Logitech G502 Hero
Keyboard Ducky One 2
I wonder if this generation will be majorly power limited, where unlocking TDP will actually have a measurable effect.
 
Joined
Feb 18, 2012
Messages
2,715 (0.58/day)
System Name MSI GP76
Processor intel i7 11800h
Cooling 2 laptop fans
Memory 32gb of 3000mhz DDR4
Video Card(s) Nvidia 3070
Storage x2 PNY 8tb cs2130 m.2 SSD--16tb of space
Display(s) 17.3" IPS 1920x1080 240Hz
Power Supply 280w laptop power supply
Mouse Logitech m705
Keyboard laptop keyboard
Software lots of movies and Windows 10 with win 7 shell
Benchmark Scores Good enough for me
Huh :wtf: not this again :shadedshu:
In some places around the world there is plently of love for the niece, along with the sister and 1st cousin.
 
Joined
Sep 2, 2014
Messages
259 (0.07/day)
Location
Emperor's retreat/Naboo Moenia
System Name Order66
Processor Ryzen 7 3700X
Motherboard Asus TUF GAMING B550-PLUS
Cooling AMD Wraith Prism (BOX-cooler)
Memory 16GB DDR4 Corsair Desktop RAM Vengeance LPX 3200MHz Red
Video Card(s) GeForce RTX 3060Ti
Storage Seagate FireCuda 510 1TB SSD
Display(s) Asus VE228HR
Case Thermaltake Versa C21 RGB
Audio Device(s) onboard Realtek
Power Supply Corsair RM850x
Software Windows10 64bit
Once again, nVidia's magic at work!
Ultra-hyped by what Jensen announced :love:.
 
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
This is definitely going to make it tough to give RDNA2 serious consideration from the looks of things, though we still don't know how it will compare. It's probably still a bit premature to call this a grand slam by Nvidia, but that pricing is aggressive this time around and exactly what's needed to propel RTRT forward. I am keen to see just how competitively AMD's cards stack up, and at what price points. I could see this putting a big damper on Intel's GPU ambitions too.
 
Joined
Jul 19, 2016
Messages
482 (0.16/day)
They've left AMD an open goal because they're using Samsung's clearly inferior 8nm process node. TSMC 7nm-enhanced RDNA2 with more memory and lower power draw will beat out the 3080, but will lose in RT quite handily.

I wonder what relative performance means this time. I have a gut feeling that this incredible speed-up (roughly ~1.7x compared to the 2080S, looking at the graph) is all about ray tracing and not so much rasterization, but I would love to be wrong here.

It's not rasterization - people are being bamboozled by Nvidia marketing, as expected. The rasterization perf is exactly as the leaks rumoured:

3080: 25% faster than the 2080 Ti.
3090: 45% faster than the 2080 Ti.
 
Joined
Sep 17, 2014
Messages
22,447 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The RTX 3070 is going to sell like hot tamales.
If Nvidia can satisfy demand at that $499 price point, then I'll take back every time I've ever complained about their Sheriff of Nottingham business strategy.

I've always been a little puzzled by the supposed 'price gouging' Nvidia is doing. Yes, they're leading and command a bit of a premium. But there's almost always something on offer for that premium, and there's always a bunch of GPUs below it that do get some sort of advancement in perf/dollar and absolute performance.

I mean... the 970 was super competitive on price too. The 660 Ti was the same back during Kepler, and the 670 was seen as the 'poor man's 680' but performed virtually the same. The 1070 dropped the 980 Ti price point down by a few hundred... and it's happening again with the x70 today. The price of an x70 has risen, but so have the featureset and the performance gap to the bottom end.

Even with the mining craze the midrange was populated and the price, while inflated, was not quite as volatile as others.

They've left AMD an open goal because they're using Samsung's clearly inferior 8nm process node. TSMC 7nm enhanced RDNA2 with more memory and lower power draw will beat out the 3080 but will lose in RT quite handedly.



It's not rasterization, people being bamboozled by Nvidia marketing as expected. The rasterization perf is exactly as the leaks rumoured:

'The' leaks? The 12-pin was the only truly accurate one, man (alright, and the pictures). Nvidia played this well - you can rest assured that everything we got was carefully orchestrated, and that includes the teasing of a 12-pin. Marketing gets a head start with these leaks; we also heard figures of $1400-2000 worth of GPU, which obviously makes the announcement of the actual pricing look even stronger.
 
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I thought 220 watts for the 3070 wasn't a bad TDP for the performance they're advertising.
I think it's a bit much for a 70-class card, but the real problem is the TDP of 3080 and 3090. I think >300W is too much to cool at a reasonable noise level.
 
Joined
Sep 17, 2014
Messages
22,447 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I think it's a bit much for a 70-class card, but the real problem is the TDP of 3080 and 3090. I think >300W is too much to cool at a reasonable noise level.

It is a bit much - basically it's a full 104 die's overclocked power consumption. And it is a full 104 as well, isn't it? This means the actual SKUs are still doing what they did, and Nvidia is maintaining its stack in a broad sense. It's just that the 3080 and 3090 have an odd gap for being off the same die - clearly a yield-based decision. It's clear GA102 isn't exactly a fantastic place to be; if you ask me, there might be some distinct binning differences there.
 
Joined
Oct 6, 2019
Messages
23 (0.01/day)
OK, let's just get real here and come back to earth, everyone, and do the numbers from Nvidia's optimistic 4K benchmarks (RTX off) in the presentation:

2070S to 3070 +40%
2080S to 3080 +64%

and from TechPowerUp's 4K benchmarks:

1070 to 2070S +66%
1080 to 2080S +60%

The 1000 to 2000 super series gave us a bigger increase than these new cards!!!
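For anyone wanting to sanity-check these figures, the uplift percentages are just relative performance ratios. A quick sketch - the index values here are illustrative placeholders (read "the new card indexes at X% of the old card in the 4K charts"), not data from the charts themselves:

```python
# Sketch of how a generational uplift percentage is derived from
# relative 4K performance index values (illustrative numbers only).
def uplift(old_perf, new_perf):
    """Percent increase going from the old card to the new one."""
    return (new_perf / old_perf - 1) * 100

# e.g. if a 2070S indexes at 166% of a 1070, the uplift is +66%
print(f"{uplift(100, 166):+.0f}%")  # prints +66%
# and a claimed 140% index for 3070 vs 2070S gives +40%
print(f"{uplift(100, 140):+.0f}%")  # prints +40%
```
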
 
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
It is a bit much - basically it's a full 104 die's overclocked power consumption. And it is a full 104 as well, isn't it? This means the actual SKUs are still doing what they did, and Nvidia is maintaining its stack in a broad sense. It's just that the 3080 and 3090 have an odd gap for being off the same die - clearly a yield-based decision. It's clear GA102 isn't exactly a fantastic place to be; if you ask me, there might be some distinct binning differences there.
I don't care about which chip they use in which tier, that has changed in pretty much each generation, what matters is how it performs and how much energy it consumes.

I think 220W is a bit much but tolerable for the RTX 3070, but there is a substantial jump up to 320W for the RTX 3080, which I think is too hot.

The performance and price gap between RTX 3080 and RTX 3090 probably indicates the production volume of RTX 3090. GTX 1080 Ti and RTX 2080 Ti have been big sellers, even outselling some of AMD's mid-range cards. Time will tell if RTX 3090 will be scarce.
 
Joined
Sep 17, 2014
Messages
22,447 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
OK, let's just get real here and come back to earth, everyone, and do the numbers from Nvidia's optimistic 4K benchmarks (RTX off) in the presentation:

2070S to 3070 +40%
2080S to 3080 +64%

and from TechPowerUp's 4K benchmarks (https://www.techpowerup.com/review/asus-radeon-rx-5700-xt-tuf-evo/28.html):

1070 to 2070S +66%
1080 to 2080S +60%

The 1000 to 2000 super series gave us a bigger increase than these new cards!!!

That's why I skipped Turing. SUPER was too late to the party.
 
Joined
Oct 6, 2019
Messages
23 (0.01/day)
That's why I skipped Turing. SUPER was too late to the party.

Yes, but we are comparing old generations to new ones - unless the Super cards were a generational change?

OK, let's just get real here and come back to earth, everyone, and do the numbers from Nvidia's optimistic 4K benchmarks (RTX off) in the presentation:

2070S to 3070 +40%
2080S to 3080 +64%

and from TechPowerUp's 4K benchmarks:

1070 to 2070S +66%
1080 to 2080S +60%

The 1000 to 2000 super series gave us a bigger increase than these new cards!!!

and for AMD benchmarks at 4K:

RX 580 to RX 5700 XT +100%

and to the future Big Navi (RDNA 2) at 4K:

RX 5700 XT to RX 6900 XT +120%???
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Ah yes, the fabled "Big Navi", it does appear to be the second coming
Soundly beating the 3070 doesn't sound like the second coming to me... at all?
It could explain NV suddenly being so modest about pricing - such a great contrast to Turing prices, with the 2080 Ti never being offered at the claimed $999 MSRP.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I've always been a little bit puzzled by the supposed 'price gouging' Nvidia is doing. Yes, they're leading and command a bit of premium. But there's almost always something on offer for that premium. And then there's always a bunch of GPUs below it that do get some sort of advancement in perf/dollar and absolute performance.

I mean... the 970 was super competitive also on price. The 660ti was the same back during Kepler and the 670 was seen as the 'poor man's 680', but performed virtually the same. The 1070 was dropping the 980ti price point down by a few hundred... and its happening again with x70 today. The price of an x70 has risen... but so has the featureset and the performance gap to the bottom end.

Even with the mining craze the midrange was populated and the price, while inflated, was not quite as volatile as others.
There's no doubt that per-tier pricing has made some major jumps in recent years. Have we gotten more performance at that tier? Sure! But perf/$ has barely been moving at all, at least until RDNA showed up and Nvidia launched the Supers. Turing was essentially "pay the same for the same level of rasterization performance, but with RT added, at a lower product tier" compared to Pascal - with the top end of course moving upwards in both price and performance.

This, on the other hand, looks like an excellent value play, and finally a significant improvement in perf/$ from day 1, even compared with Pascal. Of course there are reasons for this - more expensive process nodes, more expensive memory, more complex PCBs and coolers - but overall GPU pricing per market segment has seen a significant increase in recent years. Mainstream GPUs used to sit around (and often below) $200, while the most heavily marketed GPUs are now typically above $300, with anything lower treated as a sort of second-class citizen. The selection below $200 is downright awful, even with the slightly better value 1650S on the market. I'm hoping for some downward price creep this generation - with these efficiency improvements there should be plenty of room for small, cheap chips with good performance.
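To make the perf/$ point concrete, here's a trivial sketch. The performance index values and prices are hypothetical placeholders (a new card matching an old flagship at half the price), not figures from any review:

```python
# Minimal perf-per-dollar comparison sketch; index values and prices
# are illustrative placeholders, not measured data.
def perf_per_dollar(perf_index, price_usd):
    return perf_index / price_usd

# Hypothetical: a new $499 card matching a $999 previous-gen flagship
old = perf_per_dollar(100, 999)
new = perf_per_dollar(100, 499)
print(f"perf/$ improvement: {new / old:.2f}x")  # prints perf/$ improvement: 2.00x
```

The same two-line comparison shows why Turing felt like stagnation: equal rasterization performance at an equal price is a 1.00x perf/$ ratio, regardless of which tier label is on the box.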
 
Joined
Apr 12, 2013
Messages
7,529 (1.77/day)
Best sell now
Would have been apt about a week or two back. After today's shellacking, the only people buying a used 2080 Ti at anything above $300-400 are either just waking up from a coma/hibernation, or living in a complete bubble from the outside world and somehow having the urge to spend big bucks on what is now an obsolete card :nutkick:
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
even the 3070 has more than the 2080ti,
Much more, yet it roughly matches it in performance - curious, isn't it? In past gens, perf-per-CUDA-core figures were rising.

As if someone just decided to double the claimed figure for marketing lulz.

It's not rasterization - people are being bamboozled by Nvidia marketing, as expected. The rasterization perf is exactly as the leaks rumoured:

3080: 25% faster than the 2080 Ti.
3090: 45% faster than the 2080 Ti.
What is the source for those claims?

even outselling some of AMD's mid-range cards.
You trust (and misread) the Steam hardware survey too much.
For actual sales, check reports from actual shops, e.g. Mindfactory.
 
Joined
Aug 11, 2020
Messages
245 (0.16/day)
Location
2nd Earth
Processor Ryzen 5700X
Motherboard Gigabyte AX-370 Gaming 5, BIOS F51h
Cooling MSI Core Frozr L
Memory 32GB 3200MHz CL16
Video Card(s) MSI GTX 1080 Ti Trio
Storage Crucial MX300 525GB + Samsung 970 Evo 1TB + 3TB 7.2k + 4TB 5.4k
Display(s) LG 34UC99 3440x1440 75Hz + LG 24MP88HM
Case Phanteks Enthoo Evolv ATX TG Galaxy Silver
Audio Device(s) Edifier XM6PF 2.1
Power Supply EVGA Supernova 750 G3
Mouse Steelseries Rival 3
Keyboard Razer Blackwidow Lite Stormtrooper Edition
I think I'll be happy with the 3070: significantly less power than the 3080, but still a good performer for 3440x1440 75Hz.
I'll wait for RDNA2. I hope AMD can deliver at least 3070 performance for less power and a lower price.
 