Monday, July 15th 2024

NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

Ahead of NVIDIA's upcoming GeForce RTX 50 Series of GPUs, codenamed "Blackwell," one power supply manufacturer has accidentally leaked the power configurations of all SKUs. Seasonic operates an online power supply wattage calculator that lets users configure their systems and get PSU recommendations, which means its database is often filled with upcoming CPU/GPU SKUs to accommodate the massive variety of components. This time it lists the upcoming GeForce RTX 50 series, from the RTX 5050 all the way up to the flagship RTX 5090. Starting with the GeForce RTX 5050, this SKU is expected to carry a 100 W TDP. Its bigger brother, the RTX 5060, bumps the TDP to 170 W, 55 W higher than the previous-generation "Ada Lovelace" RTX 4060.

The GeForce RTX 5070 sits in the middle of the stack with a 220 W TDP, a 20 W increase over its Ada predecessor. For the higher-end SKUs, NVIDIA has prepared the GeForce RTX 5080 and RTX 5090 at 350 W and 500 W TDP, respectively; that is an increase of 30 W for the RTX 5080 and 50 W for the RTX 5090 over the Ada generation. Interestingly, NVIDIA this time wants to unify the power connection system of the entire family with the 16-pin 12V-2x6 connector, per the updated PCIe 6.0 CEM specification. The across-the-board increase in power requirements for the "Blackwell" generation is interesting, and we are eager to see whether the performance gains are enough to balance the efficiency equation.
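As context for the Seasonic calculator mentioned above: such tools typically sum component power draws and add headroom for transient spikes. A simplified, illustrative sketch in Python follows; the spike factor, peripheral budget, and 80% load target are assumptions for illustration, not Seasonic's actual formula:

```python
def recommended_psu_watts(gpu_tdp: int, cpu_tdp: int,
                          other: int = 150,
                          spike_factor: float = 1.5,
                          load_target: float = 0.8) -> int:
    """Estimate a PSU rating from component TDPs (illustrative only).

    other        -- rough budget for motherboard, drives, fans (assumption)
    spike_factor -- margin for GPU transient power spikes (assumption)
    load_target  -- keep worst-case load under ~80% of the rating (assumption)
    """
    worst_case = gpu_tdp * spike_factor + cpu_tdp + other
    return int(round(worst_case / load_target, -1))

# e.g. the rumored 500 W RTX 5090 paired with a 250 W CPU:
print(recommended_psu_watts(gpu_tdp=500, cpu_tdp=250))  # -> 1440
```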
Sources: @Orlak29_ on X, via VideoCardz

168 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

#101
JustBenching
oxrufiioxo: It's a 10%-ish increase, likely due to them wanting to push it a bit since the node isn't significantly better.

Ada can be really power-tuned with almost no loss in performance; I'm guessing it'll be the same with this.

My 4090 at 350 W loses almost no performance, definitely not enough to be noticed without using an overlay; at 320 W it's less than 5%.

I'm guessing the 5090 is going to be some monstrosity that is borderline HPC, with a price to match.
I'm running mine at 320 W with some OC'd memory; it's 2-3% faster than stock, lol :D

I have no idea why people care about stock power draw, you can just change it in 5 seconds. Seriously, who really cares?
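For reference, the "change it in 5 seconds" workflow is real: NVIDIA's driver exposes the board power limit through nvidia-smi. A minimal sketch, assuming a supported driver and administrator rights; the 350 W value is simply the figure discussed in this thread:

```python
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Set the GPU board power limit via `nvidia-smi -pl` (needs admin/root).

    The driver only accepts values between the board's minimum and
    maximum enforced power limits; out-of-range values are rejected.
    """
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

set_power_limit(350)  # the 350 W figure discussed above
```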
#102
Neo_Morpheus
Launcestonian: 500 W TDP for the RTX 5090! Yikes... and I thought my RX 7900 XTX with its 350 W TDP was overkill. When gaming at 3440x1440, ultra settings in Starfield, it does spike to 510 W, but averages 350 W during a session.
ARF: Only the hardcore NVIDIA fans will bear this utter engineering junk. It will have 800-watt spikes and will require 1200-watt PSUs, especially when coupled with a 300-watt Intel oven. :kookoo:
The funny thing is, everyone calls the whole Radeon 7000 series trash because of power consumption, but magically that is not an issue with Ngreedia in this case. Granted, these are rumors, but still.

Personally, I hated AMD's pricing and naming, though. Each card should've been named a tier lower and priced accordingly.

For example, the 7900 XTX should've been the 7800 XT and priced at $799.

Then again, in many raster games (you know, the ones that comprise like 90% of the whole Steam catalog) the 7900 XTX is on average 20-30% slower than the 4090 but priced up to 60% less, depending on vendors, rebates, and bundled games.
#103
oxrufiioxo
fevgatos: I'm running mine at 320 W with some OC'd memory; it's 2-3% faster than stock, lol :D

I have no idea why people care about stock power draw, you can just change it in 5 seconds. Seriously, who really cares?
Not every sample will do that, though, but I'd bet almost every 4090 can be dropped to 350 W with almost no loss in performance.

Mine boosts to almost 2,900 MHz stock and stays there, which is my guess as to why there's a slight performance drop at 350 W, where it sticks around 2,600 MHz.
#104
phints
Since the 4070 is 200 W TDP and the 4070 Super is 220 W TDP, keeping the 5070 at 220 W sounds good. It'll be a solid performance-per-watt increase from the new architecture and lithography bump.
#105
ratirt
TumbleGeorge: I need a much bigger (real) increase to be staggered. Please, no more fake teraflops!
I doubt there will be anything of that kind.
More likely a performance increase at the expense of power consumption. Really, I'm not holding my breath.
#106
TumbleGeorge
phints: ...the 4070 Super is 220 W TDP, keeping the 5070 at 220 W sounds good. It'll be a solid performance/watt increase...
How, with a smaller number of cores and less L3 cache?
#107
JustBenching
Neo_Morpheus: The funny thing is, everyone calls the whole Radeon 7000 series trash because of power consumption,
No, not because of power consumption, but because of efficiency. These two are different things. Put a 7900 XTX and a 4080 both at 250 W and see which one is faster; that's efficiency. Power draw is irrelevant. A card can draw 2 kilowatts and it's fine; if you can limit it to 300 W and have it still be fast, no issue.
oxrufiioxo: Not every sample will do that, though, but I'd bet almost every 4090 can be dropped to 350 W with almost no loss in performance.

Mine boosts to almost 2,900 MHz stock and stays there, which is my guess as to why there's a slight performance drop at 350 W, where it sticks around 2,600 MHz.
The card is memory-limited, so yeah, the core dropping clocks doesn't impact performance. Just clock up your memory a bit and it should be faster than stock. I'm running +1400 on mine.
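To make the power draw vs. efficiency distinction concrete: efficiency is just performance normalized by power. A toy calculation with made-up numbers, purely for illustration:

```python
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Efficiency expressed as frames per second per watt."""
    return avg_fps / board_power_w

# Hypothetical cards, both capped at the same 250 W board power:
card_a = perf_per_watt(avg_fps=100, board_power_w=250)  # 0.40 FPS/W
card_b = perf_per_watt(avg_fps=110, board_power_w=250)  # 0.44 FPS/W, more efficient
```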
#108
oxrufiioxo
fevgatos: The card is memory-limited, so yeah, the core dropping clocks doesn't impact performance. Just clock up your memory a bit and it should be faster than stock. I'm running +1400 on mine.
Mine is OC'd to +1200, which is measurable in benchmarks like Time Spy, but in actual games it makes about a 1-2% difference when logging over an hour. For me, clock speed matters more; I see the most gains at 3 GHz, but I'm not running my card locked to 600 W, lmao.

It does flip back and forth depending on the game played, though, and whether I'm running path tracing or pure rasterization.
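The hour-long clock and power logging described above can be reproduced with nvidia-smi's query mode; a rough sketch (the one-second interval and one-hour window are arbitrary choices, and the query fields are standard nvidia-smi properties):

```python
import csv
import subprocess
import time

# Poll clocks, board power, and temperature once per second for an
# hour and append each sample to a CSV file for later analysis.
FIELDS = "timestamp,clocks.gr,clocks.mem,power.draw,temperature.gpu"

with open("gpu_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(FIELDS.split(","))
    end = time.time() + 3600  # one hour
    while time.time() < end:
        sample = subprocess.check_output(
            ["nvidia-smi", "-i", "0", f"--query-gpu={FIELDS}",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        writer.writerow([v.strip() for v in sample.strip().split(",")])
        time.sleep(1)
```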
#109
TheDeeGee
MxPhenom 216: He doesn't know either. There's nothing really keeping a user of an ATX 3.0 PSU from using these new cards. The connector, even though it's not the newer revision adopted with ATX 3.1, will still work.
Thought as much, as the female connector hasn't changed and is compatible with both revisions.
#110
AusWolf
fevgatos: Try Horizon Zero Dawn. Yeah, the old one. Medium just destroys the graphics; it removes all shadows, etc. I used FSR high instead of native medium on my laptop, and it looked way better.
It's from 2017. I highly doubt you need medium graphics for it to run properly with a 7900 XT.
oxrufiioxo: It's a 10%-ish increase, likely due to them wanting to push it a bit since the node isn't significantly better.

Ada can be really power-tuned with almost no loss in performance; I'm guessing it'll be the same with this.

My 4090 at 350 W loses almost no performance, definitely not enough to be noticed without using an overlay; at 320 W it's less than 5%.

I'm guessing the 5090 is going to be some monstrosity that is borderline HPC, with a price to match.
It's 10% now, but let's not forget that power requirements have been steadily increasing ever since Turing. The GTX 1060 had a TDP of 120 W. That's x50 card territory now.
fevgatos: I'm running mine at 320 W with some OC'd memory; it's 2-3% faster than stock, lol :D

I have no idea why people care about stock power draw, you can just change it in 5 seconds. Seriously, who really cares?
Because that's what you get before you boot into Windows, install your tools, and apply your own power limit, maybe?
#111
ARF
Neo_Morpheus: The funny thing is, everyone calls the whole Radeon 7000 series trash because of power consumption, but magically that is not an issue with Ngreedia in this case. Granted, these are rumors, but still.

Personally, I hated AMD's pricing and naming, though. Each card should've been named a tier lower and priced accordingly.

For example, the 7900 XTX should've been the 7800 XT and priced at $799.

Then again, in many raster games (you know, the ones that comprise like 90% of the whole Steam catalog) the 7900 XTX is on average 20-30% slower than the 4090 but priced up to 60% less, depending on vendors, rebates, and bundled games.
I have been saying this about the names for years. AMD never listens to us.
But it's definitely not only their fault; it's the whole market against them, when we desperately need their healthy competition.
#112
Neo_Morpheus
ARF: ...when we desperately need their healthy competition.
But that's the thing: I do think the 7000 series is competitive up to the 4080.

The "problems," at least for gamers, are the gimmicks, like FSR, DLSS, and RT.

FSR and DLSS plus fake frames are there to cheat relative to native rendering and hide perhaps limited performance at a given resolution.

And the RT hype/push is simply the influencers (formerly known as reviewers) earning their free 4090s. We have fewer than five games that properly use RT, yet we're made to believe otherwise.
#113
ARF
Neo_Morpheus: But that's the thing: I do think the 7000 series is competitive up to the 4080.
I don't think the RX 6400, RX 6500 XT, and RX 7600 are exactly competitive. The first two are junk left over from hybrid laptop designs, which need a CPU + dGPU in order to use the CPU's media engine as the default video accelerator, while the latter is too weak; it's a 6 nm chip with 0% performance improvement over the older RX 6600/6650 XT.

AMD doesn't innovate; it simply follows NVIDIA's leadership. This is in fact a duopoly, with all the legal consequences that arise from it.
Neo_Morpheus: The "problems," at least for gamers, are the gimmicks, like FSR, DLSS, and RT.

FSR and DLSS plus fake frames are there to cheat relative to native rendering and hide perhaps limited performance at a given resolution.

And RT is simply the influencers (formerly known as reviewers) earning their free 4090s. We have fewer than five games that properly use RT, yet we're made to believe otherwise.
I don't listen to their marketing BS. I use the classic approach: use low and medium settings where needed to achieve high enough FPS.
#114
oxrufiioxo
AusWolf: It's from 2017. I highly doubt you need medium graphics for it to run properly with a 7900 XT.

It's 10% now, but let's not forget that power requirements have been steadily increasing ever since Turing. The GTX 1060 had a TDP of 120 W. That's x50 card territory now.

Because that's what you get before you boot into Windows, install your tools, and apply your own power limit, maybe?
As the gains from the process node diminish, that's just the reality; the days of getting both a massive performance increase and lower power draw are gone. For people who actually want progress when it comes to performance, this is just how it is. I'm sure whatever the 5060/5070 end up being, anyone who cares will be able to slim down the power target quite a bit.

The 5090 will likely be 80-100% faster than the 3090 Ti at the same or lower power, though, so anyone who wants to pay for it can still get massive performance-per-watt improvements.
#115
Neo_Morpheus
ARF: I don't think the RX 6400, RX 6500 XT, and RX 7600 are exactly competitive. The first two are junk left over from hybrid laptop designs, which need a CPU + dGPU in order to use the CPU's media engine as the default video accelerator, while the latter is too weak; it's a 6 nm chip with 0% performance improvement over the older RX 6600/6650 XT.
My bad, I do agree with that. I should have said the 7800 XT and the infinite variations of the 7900. :)
ARF: AMD doesn't innovate; it simply follows NVIDIA's leadership. This is in fact a duopoly, with all the legal consequences that arise from it.
Well, it depends on what you call innovation, but at the same time, I'm still willing to give them a bit more time to get the Radeon group up to the same level as the CPU group.

People forget that less than 10 years ago, the company was almost dead (thanks in huge part to Intel's dirty and illegal actions), so they bet everything on Ryzen and are now building up Radeon and other areas.

You might say CUDA, but remember that AMD bet on OpenCL and everyone bailed; Ngreedia sabotaged it in favor of CUDA, etc.

So it does take time, and they simply need a bit more.
ARF: I don't listen to their marketing BS. I use the classic approach: use low and medium settings where needed to achieve high enough FPS.
Sadly, you and I are a minority with that train of thought.
#116
AusWolf
ARF: I don't think the RX 6400, RX 6500 XT, and RX 7600 are exactly competitive. The first two are junk left over from hybrid laptop designs, which need a CPU + dGPU in order to use the CPU's media engine as the default video accelerator, while the latter is too weak; it's a 6 nm chip with 0% performance improvement over the older RX 6600/6650 XT.
At least it's a fair bit cheaper than the equally bad 4060. I'm not saying it's great, I'm just trying to find some positives, whatever few there are.
#117
Vayra86
oxrufiioxo: As the gains from the process node diminish, that's just the reality; the days of getting both a massive performance increase and lower power draw are gone. For people who actually want progress when it comes to performance, this is just how it is. I'm sure whatever the 5060/5070 end up being, anyone who cares will be able to slim down the power target quite a bit.

The 5090 will likely be 80-100% faster than the 3090 Ti at the same or lower power, though, so anyone who wants to pay for it can still get massive performance-per-watt improvements.
I wouldn't be too quick to say that. AMD isn't really trying, so we only have NVIDIA, ergo Huang's blue eyes, to believe on this. NVIDIA is simply in a position to say it right now. And look how it works for their margins.

Meanwhile, on CPUs, where a real competitor exists and both are REALLY trying to make the best CPUs, we see lots of progress irrespective of the node: chiplets, interconnects, big.LITTLE, X3D... and we see that power really doesn't have to keep going up. Only when a design isn't quite a fix, such as Intel's E-cores, do we see how far the power budget needs to go to remain competitive.

And hey, look at GPUs. Even upscaling could be considered such a new tech, and it does change the playing field. Too bad NVIDIA enforces VRAM/bandwidth limitations and segmentation, plus an RT push, to make you believe otherwise.
#118
oxrufiioxo
Vayra86: I wouldn't be too quick to say that. AMD isn't really trying, so we only have NVIDIA, ergo Huang's blue eyes, to believe on this. NVIDIA is simply in a position to say it right now. And look how it works for their margins.

Meanwhile, on CPUs, where a real competitor exists and both are REALLY trying to make the best CPUs, we see lots of progress irrespective of the node: chiplets, interconnects, big.LITTLE, X3D... and we see that power really doesn't have to keep going up.
Yeah, but gaming specifically is slowing down quite a bit. A lot of that has to do with developers, for sure, but we're seeing 5-10%-ish improvements per year while getting 50%+ at the top for GPUs every two-year generation.

AMD has also stagnated when it comes to cores, or at the very least cores per tier; the 9900X really should be the R7, with the 9700X being the R5.

MT performance in general has been nice, and 3D V-Cache keeps AMD competitive with Intel in gaming, but Intel hasn't shipped a new desktop arch since 2022 and AMD is barely faster depending on the gaming suite benchmarked; not really impressive to me.

To me it feels like decent pricing (after launch, of course) has been the only real winner with CPUs; at launch they've all been overpriced as well.

It's really just the $500-and-under market that has gone to shite with graphics cards and pricing in general, not the actual improvements at the top, generation after generation.
#119
Launcestonian
Neo_Morpheus: The funny thing is, everyone calls the whole Radeon 7000 series trash because of power consumption, but magically that is not an issue with Ngreedia in this case. Granted, these are rumors, but still.

Personally, I hated AMD's pricing and naming, though. Each card should've been named a tier lower and priced accordingly.

For example, the 7900 XTX should've been the 7800 XT and priced at $799.

Then again, in many raster games (you know, the ones that comprise like 90% of the whole Steam catalog) the 7900 XTX is on average 20-30% slower than the 4090 but priced up to 60% less, depending on vendors, rebates, and bundled games.
Then there's the fact that a lot of RX 7900 XTX cards are factory overclocked as well, and even then, in the majority of cases more OC headroom can be found with little if any extra power draw, making them even better value for money.
#120
R0H1T
oxrufiioxo: AMD has also stagnated when it comes to cores, or at the very least cores per tier; the 9900X really should be the R7, with the 9700X being the R5.
Not really, no. For desktops, I'd still say the biggest issue is bandwidth, and till that is addressed, either with a 4-channel memory controller or something else, the practical limits at the top end will remain roughly the same. Zen is still a pretty lean core, so they can go even wider, but then they'll need to make it work with even higher memory speeds or a wider memory interface at the low end, which includes desktops.
chipsandcheese.com/2024/07/09/qualcomms-oryon-core-a-long-time-in-the-making/
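For a sense of the ceiling being described here: peak DRAM bandwidth is just channels x bytes per transfer x transfer rate. A quick illustrative calculation, assuming 64-bit channels and DDR5-6000 as a typical desktop configuration:

```python
def peak_bandwidth_gb_s(channels: int, bus_width_bits: int, mt_s: int) -> float:
    """Theoretical peak DRAM bandwidth in GB/s (mt_s in megatransfers/s)."""
    return channels * (bus_width_bits / 8) * mt_s / 1000

dual = peak_bandwidth_gb_s(channels=2, bus_width_bits=64, mt_s=6000)  # 96 GB/s
quad = peak_bandwidth_gb_s(channels=4, bus_width_bits=64, mt_s=6000)  # 192 GB/s
print(f"dual channel: {dual:.0f} GB/s, quad channel: {quad:.0f} GB/s")
```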
#121
oxrufiioxo
R0H1T: Not really, no. For desktops, I'd still say the biggest issue is bandwidth, and till that is addressed, either with a 4-channel memory controller or something else, the practical limits at the top end will remain roughly the same. Zen is still a pretty lean core, so they can go even wider, but then they'll need to make it work with even higher memory speeds or a wider memory interface at the low end, which includes desktops.
chipsandcheese.com/2024/07/09/qualcomms-oryon-core-a-long-time-in-the-making/
While I'm not going to sit here and say 16 cores isn't enough for a mainstream desktop platform (it is), the R7 and R5 have really stagnated in both core counts and MT performance. I was shocked at how bad the 7800X3D was at MT; it honestly didn't feel all that different from, and in some cases was worse than, my 5800X outside of gaming.

Honestly, it was so bad outside of gaming that the headache of the 7950X3D became very appealing.

Hopefully the 9000 or 11000 series fixes that, but I guess until Arrow Lake is shown it's hard to know; it looks like it will also not see a very impressive MT boost. But if the 9700X loses to a 14700K in MT, that's pretty embarrassing considering how old the Raptor Lake core is, if they still want to price it like an i7. Now, if AMD shifts pricing down to, say, $329, sure, then it's fine.
#122
AusWolf
oxrufiioxo: Yeah, but gaming specifically is slowing down quite a bit. A lot of that has to do with developers, for sure, but we're seeing 5-10%-ish improvements per year while getting 50%+ at the top for GPUs every two-year generation.

AMD has also stagnated when it comes to cores, or at the very least cores per tier; the 9900X really should be the R7, with the 9700X being the R5.

MT performance in general has been nice, and 3D V-Cache keeps AMD competitive with Intel in gaming, but Intel hasn't shipped a new desktop arch since 2022 and AMD is barely faster depending on the gaming suite benchmarked; not really impressive to me.

To me it feels like decent pricing (after launch, of course) has been the only real winner with CPUs; at launch they've all been overpriced as well.

It's really just the $500-and-under market that has gone to shite with graphics cards and pricing in general, not the actual improvements at the top, generation after generation.
I guess there's also the fact that one doesn't need more than 8 cores in a mainstream desktop, just like one doesn't need more than a 6700 XT for 1440p and below with sensible graphics settings and FPS expectations. The enthusiast range has expanded downwards to the x80, or even x70, level, while game hardware requirements haven't increased rapidly like they did in the '90s and early 2000s.
#123
oxrufiioxo
AusWolf: I guess there's also the fact that one doesn't need more than 8 cores in a mainstream desktop, just like one doesn't need more than a 6700 XT for 1440p and below with sensible graphics settings and FPS expectations. The enthusiast range has expanded downwards to the x80, or even x70, level, while game hardware requirements haven't increased rapidly like they did in the '90s and early 2000s.
Same with what NVIDIA has done with VRAM on the lower tiers; I hate seeing stagnation in any form. The low end matters just as much as, if not more than, the high end, and without year-on-year improvements we'll see stagnation, kinda like the half-decade-plus of mainstream quad cores. But beyond something like Hellblade 2, we might be hitting the limit of what rasterization can do, and only true path tracing, or something that hasn't been invented yet, will lead to meaningful improvements going forward.

At the very least, I think both you and I can agree they really need to come up with something hardware-agnostic that's much better than TAA at the same performance hit, lol.
#124
Visible Noise
Legacy-ZA: More interested to see if they've stopped sniffing glue; meaning, whether this generation will be affordable and have more than enough VRAM.
Have you seen their revenue growth? I'll take some of that glue, please!
Chaitanya: With the GPU alone sucking 500 W, this time around I won't be surprised to find a 1 kW PSU being the bare minimum for high-end workstation builds (with a single GPU).
Meh. I'm already using 1300 W server supplies. Don't cheap out on power if you want a stable system.
ARF: I wouldn't be so rude as to point the finger at normal people who value the better product.
If slow, hot, and lacking features is how you define the better product, maybe it's best to have a finger pointed at you. Unless you think Intel makes better CPUs than AMD?
#125
wolf
Better Than Native
Visible Noise: If slow, hot, and lacking features is how you define the better product, maybe it's best to have a finger pointed at you.
I have no idea how they came to the conclusion that the 7900 GRE is the better product, but I know many vocal users cite reasons beyond the actual objective specs and facets of the products themselves; such is tribalism and a false sense of moral high ground.