
NVIDIA Announces GeForce Ampere RTX 3000 Series Graphics Cards: Over 10000 CUDA Cores

I know Nvidia recommends a 750W PSU for the 3080, but I'm hoping my 650W Gold-rated PSU will suffice; reviews/time will tell.
Depends on the rest of your hardware and how noisy your PSU is, but 650W with a 320W GPU should be perfectly fine unless you're running very high-end hardware or overclocking. If Ampere is power limited to TDP like Turing, you have 320W for the GPU, plus 20-30W for the motherboard and RAM, ~5W per SSD, ~15W per 3.5" HDD, ~5W per couple of fans, ~10W per AIO pump, and however much power your CPU needs. For a 65W AMD chip that is 88W, for a 95W AMD it's 144W, and for a 10th gen Intel it's 150-250W depending on the SKU. That's at stock clocks within the boost window (which might be infinite depending on your motherboard). I would add a 20% margin on top of that for safety, and at least another 20% if you're overclocking - likely more. Of course it's highly unlikely for all components in the system to draw maximum power at once, and CPUs pretty much never run at 100% while gaming, so there's some extra margin in there too. 650W would thus be rather slim for a system with something like a 10700K or 10900K (my formula ends up at 675W minimum assuming a couple of SSDs and a few fans), but it should work fine with a less power-hungry CPU, or if you undervolt and/or tune the power limits of one of those power hogs.
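For anyone who wants to plug in their own parts, here's a minimal Python sketch of that rule of thumb; the function name and default figures are my own placeholders based on the estimates above, not an official formula:

```python
# Rough PSU sizing sketch based on the rules of thumb in the post above.
# All figures are ballpark estimates, not measurements - adjust for your own parts.

def estimate_psu_wattage(gpu_tdp, cpu_package_power, ssds=2, hdds=0,
                         fans=4, aio_pumps=0, overclocked=False):
    """Return a rough minimum PSU wattage at stock clocks within the boost window."""
    board_and_ram = 25                          # ~20-30W for motherboard and RAM
    drives = ssds * 5 + hdds * 15               # ~5W per SSD, ~15W per 3.5" HDD
    cooling = (fans / 2) * 5 + aio_pumps * 10   # ~5W per couple of fans, ~10W per AIO pump
    total = gpu_tdp + cpu_package_power + board_and_ram + drives + cooling
    margin = 1.2                                # +20% safety margin
    if overclocked:
        margin += 0.2                           # at least another 20% when overclocking
    return round(total * margin)

# Example: RTX 3080 (320W) with a 10th gen Intel CPU drawing ~200W package power
print(estimate_psu_wattage(320, 200, ssds=2, fans=4))  # ~678W, in line with the ~675W minimum above
```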
 
Their test system used an i9-10900K. Nvidia seems to be confident that PCIe 3.0 is not a bottleneck. Jensen, what happened? No love for your niece?
 
Their test system used an i9-10900K. Nvidia seems to be confident that PCIe 3.0 is not a bottleneck. Jensen, what happened? No love for your niece?

At 4K there really is no difference between PCIe 3.0 and 4.0.
And Nvidia tested them with a 9900K; it still doesn't make any difference against a 10900K.
 
Surprisingly good pricing from them; I'd compare it to a "preemptive strike" in anticipation of AMD's release...

The best part is that we end users, not fanboys, will get good prices from both of them.
 
The RTX 3070 is going to sell like hot tamales.
If Nvidia can satisfy demand at that $499 price point, then I'll take back every complaint I've ever made about their Sheriff of Nottingham business strategy.
 
The power usage is crazy. Both the 90 and the 80 are above 300W, and even the 70 is approaching territory that used to be reserved for the 80 Ti cards. But then, if the 3070 is actually faster than the 2080 Ti by a good margin, that means power efficiency has improved with Ampere. BTW, where is that 12-pin PCIe aux power connector?

I agree the power requirements have gone through the roof. Waiting for official reviews to see how much performance improvement we are getting with this generation. Also I am not convinced the CUDA core count is correct, i.e. the actual physical number of cores may be half of what is advertised.
 
I wonder if this generation will be majorly power limited, where unlocking TDP will actually have a measurable effect.
 
Huh :wtf: not this again :shadedshu:
In some places around the world there is plenty of love for the niece, along with the sister and 1st cousin.
 
Once again, Nvidia's magic is in the works!!!
Ultra-hyped from what Jensen announced :love:.
 
This is definitely going to make it tough to give RDNA2 serious consideration from the looks of things, though we still don't know how it will compare. It's probably still a bit premature to call this a grand slam by Nvidia, but the pricing is aggressive this time around and exactly what's needed to propel RTRT forward. I am keen to see just how competitively AMD's cards stack up and at what price points. I could see this putting a big damper on Intel's GPU ambitions too.
 
They've left AMD an open goal because they're using Samsung's clearly inferior 8nm process node. TSMC 7nm enhanced RDNA2 with more memory and lower power draw will beat the 3080, but will lose in RT quite handily.

I wonder what relative performance means this time. I have a gut feeling that this incredible speed-up (roughly ~1.7x compared to the 2080S, looking at the graph) is all about ray tracing and not so much about rasterization, but I would love to be wrong here.

It's not rasterization; people are being bamboozled by Nvidia marketing as expected. The rasterization perf is exactly as the leaks rumoured:

3080: 25% faster than the 2080 Ti.
3090: 45% faster than the 2080 Ti.
 
The RTX 3070 is going to sell like hot tamales.
If Nvidia can satisfy demand at that $499 price point, then I'll take back every complaint I've ever made about their Sheriff of Nottingham business strategy.

I've always been a little bit puzzled by the supposed 'price gouging' Nvidia is doing. Yes, they're leading and command a bit of a premium. But there's almost always something on offer for that premium. And then there's always a bunch of GPUs below it that do get some sort of advancement in perf/dollar and absolute performance.

I mean... the 970 was super competitive on price too. The 660 Ti was the same back during Kepler, and the 670 was seen as the 'poor man's 680' but performed virtually the same. The 1070 dropped the 980 Ti's price point by a few hundred... and it's happening again with the x70 today. The price of an x70 has risen... but so has the feature set and the performance gap to the bottom end.

Even with the mining craze the midrange was populated, and the price, while inflated, was not quite as volatile as other tiers.

They've left AMD an open goal because they're using Samsung's clearly inferior 8nm process node. TSMC 7nm enhanced RDNA2 with more memory and lower power draw will beat the 3080, but will lose in RT quite handily.



It's not rasterization; people are being bamboozled by Nvidia marketing as expected. The rasterization perf is exactly as the leaks rumoured:

'The' leaks? The 12-pin was the only truly accurate one, man (alright, and the pictures too). Nvidia played this well; you can rest assured everything we got was carefully orchestrated, and that includes the teasing of the 12-pin. Marketing gets a head start with these leaks. We also heard $1,400-2,000 worth of GPU, which obviously makes the announcement of the actual pricing look even stronger.
 
I thought the 220 watts for the 3070 wasn't a bad TDP for the performance they're advertising.
I think it's a bit much for a 70-class card, but the real problem is the TDP of 3080 and 3090. I think >300W is too much to cool at a reasonable noise level.
 
I think it's a bit much for a 70-class card, but the real problem is the TDP of 3080 and 3090. I think >300W is too much to cool at a reasonable noise level.

It is a bit much; basically it's a full 104 die's overclocked power consumption. And it is a full 104 as well, isn't it? This means the actual SKUs are still doing what they did, and Nvidia is maintaining its stack in a broad sense. It's just that the 3080 and 3090 have an odd gap for being off the same die, clearly a yield-based decision... It's clear GA102 isn't a fantastic place to be, if you ask me; there might be some distinct binning differences there.
 
OK, let's get real here and come back to earth, everyone, and do the numbers. From Nvidia's optimistic 4K benchmarks (RTX off) in the presentation:

2070S to 3070: +40%
2080S to 3080: +64%

and from TechPowerUp's 4K benchmarks:

1070 to 2070S: +66%
1080 to 2080S: +60%

The 1000 to 2000 SUPER series gave us a bigger increase than these new cards!!!
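For what it's worth, these percentages are just ratios of relative-performance numbers; here's a quick, illustrative Python sketch that redoes the arithmetic using the figures quoted in this post (not independently verified measurements):

```python
# Generational uplift = (new / old - 1) * 100, from relative-performance indices.
# The index pairs below simply restate the figures quoted in this post
# (Nvidia's presentation and TPU's 4K charts); treat them as illustrative.

def uplift_percent(old_index, new_index):
    """Percentage performance increase going from old_index to new_index."""
    return (new_index / old_index - 1) * 100

claims = {
    "2070S -> 3070": (100, 140),   # +40% per Nvidia's presentation
    "2080S -> 3080": (100, 164),   # +64% per Nvidia's presentation
    "1070 -> 2070S": (100, 166),   # +66% per TPU's 4K chart
    "1080 -> 2080S": (100, 160),   # +60% per TPU's 4K chart
}

for label, (old, new) in claims.items():
    print(f"{label}: +{uplift_percent(old, new):.0f}%")
```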
 
It is a bit much; basically it's a full 104 die's overclocked power consumption. And it is a full 104 as well, isn't it? This means the actual SKUs are still doing what they did, and Nvidia is maintaining its stack in a broad sense. It's just that the 3080 and 3090 have an odd gap for being off the same die, clearly a yield-based decision... It's clear GA102 isn't a fantastic place to be, if you ask me; there might be some distinct binning differences there.
I don't care which chip they use in which tier; that has changed in pretty much every generation. What matters is how it performs and how much energy it consumes.

I think 220W is a bit much but tolerable for the RTX 3070, but there is a substantial jump up to 320W for the RTX 3080, which I think is too hot.

The performance and price gap between RTX 3080 and RTX 3090 probably indicates the production volume of RTX 3090. GTX 1080 Ti and RTX 2080 Ti have been big sellers, even outselling some of AMD's mid-range cards. Time will tell if RTX 3090 will be scarce.
 
OK, let's get real here and come back to earth, everyone, and do the numbers. From Nvidia's optimistic 4K benchmarks (RTX off) in the presentation:

2070S to 3070: +40%
2080S to 3080: +64%

and from TechPowerUp's 4K benchmarks https://www.techpowerup.com/review/asus-radeon-rx-5700-xt-tuf-evo/28.html:

1070 to 2070S: +66%
1080 to 2080S: +60%

The 1000 to 2000 SUPER series gave us a bigger increase than these new cards!!!

That's why I skipped Turing. SUPER was too late to the party.
 
That's why I skipped Turing. SUPER was too late to the party.

Yes, but we are comparing the old generation to the new one, unless the SUPER cards were a generational change?

OK, let's get real here and come back to earth, everyone, and do the numbers. From Nvidia's optimistic 4K benchmarks (RTX off) in the presentation:

2070S to 3070: +40%
2080S to 3080: +64%

and from TechPowerUp's 4K benchmarks:

1070 to 2070S: +66%
1080 to 2080S: +60%

The 1000 to 2000 SUPER series gave us a bigger increase than these new cards!!!

and for AMD benchmarks at 4K:

RX 580 to RX 5700 XT: +100%

and for the future Big Navi (RDNA 2) at 4K:

RX 5700 XT to RX 6900 XT: +120%???
 
Ah yes, the fabled "Big Navi"; it does appear to be the second coming.
Soundly beating the 3070 doesn't sound like the second coming to me... at all?
It could explain Nvidia suddenly being so modest about pricing; it's such a great contrast to Turing prices, with the 2080 Ti never being offered at the claimed $999 MSRP.
 
I've always been a little bit puzzled by the supposed 'price gouging' Nvidia is doing. Yes, they're leading and command a bit of a premium. But there's almost always something on offer for that premium. And then there's always a bunch of GPUs below it that do get some sort of advancement in perf/dollar and absolute performance.

I mean... the 970 was super competitive on price too. The 660 Ti was the same back during Kepler, and the 670 was seen as the 'poor man's 680' but performed virtually the same. The 1070 dropped the 980 Ti's price point by a few hundred... and it's happening again with the x70 today. The price of an x70 has risen... but so has the feature set and the performance gap to the bottom end.

Even with the mining craze the midrange was populated, and the price, while inflated, was not quite as volatile as other tiers.
There's no doubt that per-tier pricing has made some major jumps in recent years. Have we gotten more performance at that tier? Sure! But perf/$ has barely been moving at all, at least until RDNA showed up and Nvidia launched the SUPERs. Turing was essentially "pay the same for the same level of rasterization performance, but with RT added, at a lower product tier" when compared to Pascal - of course with the top end moving upwards in both price and performance. This, on the other hand, looks like an excellent value play, and finally a significant improvement in perf/$ from day 1, even compared with Pascal. Of course there are reasons for this, such as more expensive process nodes, more expensive memory, more complex PCBs and more complex coolers, but overall GPU pricing per market segment has seen a significant increase in recent years. Mainstream GPUs used to be around (and often below) $200, while the most heavily marketed GPUs are now typically above $300, with anything lower treated as a sort of second-class citizen. The selection below $200 is downright awful, even with the slightly better value 1650S on the market. I'm hoping for some downward price creep this time around - with these efficiency improvements there should be plenty of room for small, cheap chips with good performance.
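As a rough way to sanity-check the perf/$ argument, something like the sketch below works; the relative-performance indices and MSRPs here are placeholders pulled from claims made in this thread, not benchmark results:

```python
# Perf-per-dollar sketch: relative performance index divided by launch MSRP.
# The entries are illustrative placeholders based on claims made in this thread.

def perf_per_dollar(relative_perf, msrp_usd):
    """Performance index points per dollar of MSRP."""
    return relative_perf / msrp_usd

cards = {
    "RTX 2080 Ti ($999)": (100, 999),   # baseline index of 100
    "RTX 3070 ($499)":    (100, 499),   # Nvidia's "2080 Ti class" claim
    "RTX 3080 ($699)":    (125, 699),   # the thread's ~+25% over 2080 Ti estimate
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf_per_dollar(perf, price):.3f} index points per dollar")
```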
 
Best sell now
Would have been apt about a week or two back. After today's shellacking, the only people buying a used 2080 Ti at anything above $300-400 are either just waking up from a coma/hibernation, or living in a complete bubble from the outside world and somehow having the urge to spend big bucks on what is now an obsolete card :nutkick:
 
even the 3070 has more than the 2080ti,
Much more, yet it is only roughly matching it in performance; curious, isn't it?
Whereas in past gens, perf/CU figures were rising.

As if someone just decided to double the claimed figure for marketing lulz.

It's not rasterization; people are being bamboozled by Nvidia marketing as expected. The rasterization perf is exactly as the leaks rumoured:

3080: 25% faster than the 2080 Ti.
3090: 45% faster than the 2080 Ti.
What is the source for those claims?

even outselling some of AMD's mid-range cards.
You trust (and misread) the Steam hardware survey too much.
For actual sales, check reports from actual shops, e.g. Mindfactory.
 
I think I will be happy with the 3070: significantly less power than the 3080, but still a good performer for 3440x1440 at 75 Hz.
I'll wait for RDNA2. I hope AMD can deliver at least 3070 performance for less power and a lower price.
 