Monday, July 15th 2024

NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

In the run-up to NVIDIA's upcoming GeForce RTX 50 Series of GPUs, codenamed "Blackwell," one power supply manufacturer has accidentally leaked the power configurations of all SKUs. Seasonic operates a power supply wattage calculator that lets users configure their systems online and receive PSU recommendations, so its database is routinely populated with CPU and GPU SKUs to cover the massive variety of components, sometimes including unreleased ones. This time, entries have appeared for the upcoming GeForce RTX 50 series, from the RTX 5050 all the way up to the flagship RTX 5090. Starting with the GeForce RTX 5050, this SKU is expected to carry a 100 W TDP. Its bigger brother, the RTX 5060, bumps the TDP to 170 W, 55 W higher than the previous-generation "Ada Lovelace" RTX 4060.

The GeForce RTX 5070 sits in the middle of the stack with a 220 W TDP, a 20 W increase over its Ada counterpart. For the higher-end SKUs, NVIDIA has prepared the GeForce RTX 5080 and RTX 5090 with 350 W and 500 W TDPs, respectively, which likewise represents a jump over the Ada generation of 30 W for the RTX 5080 and 50 W for the RTX 5090. Interestingly, NVIDIA this time wants to unify power delivery across the entire family with the 16-pin 12V-2x6 connector, paired with an updated PCIe 6.0 CEM specification. The across-the-board increase in power requirements for the "Blackwell" generation is notable, and we are eager to see whether the performance gains are large enough to keep efficiency moving in the right direction.
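For reference, here is a minimal sketch of the generation-over-generation deltas implied by the leak, using only the figures quoted above (the Ada TDPs are back-calculated from the stated increases, and the RTX 5050 has no directly quoted Ada counterpart):

```python
# Minimal sketch: leaked "Blackwell" TDPs vs. the "Ada" TDPs implied by the
# increases quoted in the article above.
leaked_blackwell_tdp_w = {"RTX 5090": 500, "RTX 5080": 350, "RTX 5070": 220,
                          "RTX 5060": 170, "RTX 5050": 100}
stated_increase_w = {"RTX 5090": 50, "RTX 5080": 30, "RTX 5070": 20, "RTX 5060": 55}

for sku, tdp in leaked_blackwell_tdp_w.items():
    if sku in stated_increase_w:
        ada_tdp = tdp - stated_increase_w[sku]
        pct = stated_increase_w[sku] / ada_tdp
        print(f"{sku}: {tdp} W (+{stated_increase_w[sku]} W / +{pct:.0%} over Ada's {ada_tdp} W)")
    else:
        print(f"{sku}: {tdp} W (no Ada comparison quoted)")
```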
Sources: @Orlak29_ on X, via VideoCardz

168 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

#1
qwerty_lesh
just reiterating the sentiment from the earlier discussion on the connector - RIP to all of the ATX 3.0 buyers out there.
Posted on Reply
#2
LabRat 891
500W + Transient Peaks, over the 12V-2x6? :roll:

Looks like Intel is setting a precedent:
It's okay to engineer products that will fail inside warranty.
Posted on Reply
#3
Hyderz
Any ballpark on performance compared to the 40 series?

5090 - top
5080 - up to 10% faster than 4090
5070 - 5% slower than 4080/S?
5060 - in between 4070 and 4070 Super?
5050 - 1-3% faster than 4060?
Posted on Reply
#4
ratirt
Something tells me this NV card gen will not have a staggering performance increase over the 4000 series.
We will see when the cards are released, but that is my guess.
Posted on Reply
#5
Legacy-ZA
More interested to see if they stopped sniffing glue; meaning, if this generation will be affordable and have more than enough VRAM.
Posted on Reply
#6
Broken Processor
Legacy-ZA: More interested to see if they stopped sniffing glue; meaning, if this generation will be affordable and have more than enough VRAM.
I don't see them reducing pricing while AI demand is so strong, and that will depend on ASICs being convincing enough to justify moving away from Nvidia's software, which is currently the best and most widely used for AI.
Posted on Reply
#7
64K
The 5060 power requirement increase over the 4060 makes sense, because the 4060 wasn't really an xx60-class GPU to begin with in the normal generation stack. The 4060 was really an xx50-class GPU, labeled a 4060 to be able to overcharge customers. IMO the entire Ada stack was overpriced, except maybe the 4090, because it was the gaming flagship and you always pay a premium for that.

As always, I look forward to the next generation of GPUs, and I hope the pricing for Blackwell makes more sense than Ada's did, but we'll see.
Posted on Reply
#8
AusWolf
ratirt: Something tells me this NV card gen will not have a staggering performance increase over the 4000 series.
We will see when the cards are released, but that is my guess.
Performance increase maybe, but performance-per-watt increase definitely not. Otherwise, they wouldn't have to bump the wattage on each tier by this much.
Posted on Reply
#9
R0H1T
It will be better (perf/w) because of GDDR7, though the biggest difference would come from better "IPC" if any!
Posted on Reply
#10
Legacy-ZA
Broken Processor: I don't see them reducing pricing while AI demand is so strong, and that will depend on ASICs being convincing enough to justify moving away from Nvidia's software, which is currently the best and most widely used for AI.
They have shipped far fewer GPUs this generation than any other in over a decade.

The excuse is AI demand, but in reality, people are voting with their wallets. They are artificially keeping prices inflated, and it's going to blow up in their faces.

Gamers also understood that the whole 4000-series product stack, except for the 4090, was bumped up a tier, masquerading as something it wasn't, with buyers paying more while frame generation had to do the heavy lifting.

Gamers in general aren't as stupid as most think. nVidia will have to change strategies. I like their cards more than AMD's, but only because older games are less of a hassle to get working.

AMD is starting to look mighty fine, especially if they can pull off something like the 7900 GRE at next launch at the price it's at now.
Posted on Reply
#11
Chaitanya
With the GPU alone sucking 500 W, I won't be surprised this time around to find a 1 kW PSU being the bare minimum for high-end workstation builds (with a single GPU).
Posted on Reply
#12
AusWolf
Chaitanya: With the GPU alone sucking 500 W, I won't be surprised this time around to find a 1 kW PSU being the bare minimum for high-end workstation builds (with a single GPU).
Ever since Ampere, x90 cards have been the new SLI.
Posted on Reply
#13
ratirt
AusWolf: Performance increase maybe, but performance-per-watt increase definitely not. Otherwise, they wouldn't have to bump the wattage on each tier by this much.
I'm talking about a staggering perf increase, not a performance increase at the expense of power usage.
I'm speaking in general, to be honest, but I agree. If there is a performance increase, it will be at the expense of power, but the increase still isn't going to be substantial. The increase in price? Oh, that is a different story, and that will be staggering. It's just my guess; it is what I think will happen.
Posted on Reply
#14
64K
Chaitanya: With the GPU alone sucking 500 W, I won't be surprised this time around to find a 1 kW PSU being the bare minimum for high-end workstation builds (with a single GPU).
Not just that, but also what a card can spike to. The MSI 4090 Gaming X reviewed here on TPU drew 430 watts while gaming but spiked to 660 watts. Insane.
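To put those two numbers in perspective, here is a rough back-of-the-envelope sketch; the 430 W and 660 W figures are the TPU review numbers quoted above, while the rest-of-system draw is a purely hypothetical assumption for illustration:

```python
# Rough sketch: how far the quoted MSI RTX 4090 Gaming X transient sits above
# its average gaming draw, and what that implies for worst-case PSU load.
average_draw_w = 430   # average gaming draw from the TPU review quoted above
spike_draw_w = 660     # measured transient spike from the same review

overshoot = spike_draw_w / average_draw_w - 1
print(f"Transient overshoot: {overshoot:.0%} above average")  # roughly 53%

# Hypothetical example: assume the rest of the system draws ~200 W at the same moment.
rest_of_system_w = 200
print(f"Worst-case instantaneous load: {spike_draw_w + rest_of_system_w} W")  # 860 W
```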
Posted on Reply
#15
AusWolf
Speaking of power, why does a 100 W card need a 16-pin connector? :kookoo: :shadedshu:
Posted on Reply
#16
ratirt
AusWolf: Speaking of power, why does a 100 W card need a 16-pin connector? :kookoo: :shadedshu:
There is one standard, the 16-pin connector. The card does not have to use it fully, though. I don't think that is a problem; it will just draw what it needs to function properly.
Posted on Reply
#17
AusWolf
ratirt: There is one standard, the 16-pin connector. The card does not have to use it fully, though. I don't think that is a problem; it will just draw what it needs to function properly.
Every PSU has at least an old-school 6-pin, which is more than enough. A 16-pin requires a messy adaptor or a new PSU, and it's therefore pointless and unwanted on a low-power card.
Posted on Reply
#18
londiste
64K: Not just that, but also what a card can spike to. The MSI 4090 Gaming X reviewed here on TPU drew 430 watts while gaming but spiked to 660 watts. Insane.
That is just MSI doing a bad VRM design.
Power-delivery spikes are normal; the short 10-20 ms spikes that sites including TPU measure these days normally go a good 30% over average for a decent design. And that is OK.
AusWolf: Every PSU has at least an old-school 6-pin, which is more than enough. A 16-pin requires a messy adaptor or a new PSU, and it's therefore pointless and unwanted on a low-power card.
It is simply about consolidation of standards. A 6-pin is more than enough for a low-power card, but it is not enough for midrange, where you would need an 8-pin. And the higher end needs 2-3 of those...
Yes, the 16-pin has all the sense-pin stuff, and there are - or should be - limits based on what the PSU can provide, but it is still a more elegant solution.
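For context, here is a minimal sketch of the nominal auxiliary power budgets being discussed; the 6-pin and 8-pin numbers are the long-standing PCIe CEM ratings, while the 16-pin tier list reflects the commonly cited 12VHPWR/12V-2x6 sense-pin levels and should be treated as an assumption rather than a spec quote:

```python
# Minimal sketch: nominal auxiliary connector power budgets (watts), excluding
# the ~75 W the PCIe slot itself can supply.
aux_connector_budget_w = {
    "6-pin": 75,
    "8-pin": 150,
    "16-pin (12V-2x6) sense-pin tiers": [150, 300, 450, 600],
}

# Rough example: a 500 W card would need several classic 8-pin plugs,
# or a single 16-pin connector signalled to its top tier.
card_tdp_w = 500
eight_pins_needed = -(-card_tdp_w // aux_connector_budget_w["8-pin"])  # ceiling division
print(f"A {card_tdp_w} W card needs ~{eight_pins_needed}x 8-pin, or one 16-pin at 600 W")
```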
Posted on Reply
#19
Bwaze
ratirt: Something tells me this NV card gen will not have a staggering performance increase over the 4000 series.
We will see when the cards are released, but that is my guess.
Something tells me it won't matter. Due to the explosion in AI demand and revenue, I believe we will see a release mostly focused on how these cards can be used for machine learning, home neural acceleration, etc.; we will see a focus on all the applications AI can and could some day perform. Nvidia will even rename the cards from gaming to something that encompasses gaming plus neural acceleration, and the "Gaming" sector won't be called gaming any more. They will show that even gaming itself needs AI acceleration now, for everything from smart anti-aliasing and resizing to accelerating NPCs, speech recognition and generation, etc.

And they will provide proof: all the extra "Gaming" revenue that is pushing Gaming to record heights is coming from orders for AI-related acceleration now.

So the tiers, pricing, everything is open to total change now that the market's changed.
Posted on Reply
#20
AusWolf
Bwaze: Something tells me it won't matter. Due to the explosion in AI demand and revenue, I believe we will see a release mostly focused on how these cards can be used for machine learning, home neural acceleration, etc.; we will see a focus on all the applications AI can and could some day perform. Nvidia will even rename the cards from gaming to something that encompasses gaming plus neural acceleration, and the "Gaming" sector won't be called gaming any more. They will show that even gaming itself needs AI acceleration now, for everything from smart anti-aliasing and resizing to accelerating NPCs, speech recognition and generation, etc.

And they will provide proof: all the extra "Gaming" revenue that is pushing Gaming to record heights is coming from orders for AI-related acceleration now.

So the tiers, pricing, everything is open to total change now that the market's changed.
I put a like on this thought because there's an element of truth in it - and not because I actually like it.

Personally, I'd much rather see gaming and AI take separate paths, but I don't think it's gonna happen.
londiste: It is simply about consolidation of standards. A 6-pin is more than enough for a low-power card, but it is not enough for midrange, where you would need an 8-pin. And the higher end needs 2-3 of those...
Yes, the 16-pin has all the sense-pin stuff, and there are - or should be - limits based on what the PSU can provide, but it is still a more elegant solution.
It seems more like a diversification of standards to me. If you want AMD, your old PSU with 8-pin power is fine, but if you want Nvidia, even a low-power model, you have to get a new PSU or use an ugly adapter. Why? What's elegant about this?
Posted on Reply
#21
wolf
Better Than Native
How do they perform and what will they cost? These power figures don't faze me at all, nor does the choice of power connector - I find that to be an enormous nothing burger.
Posted on Reply
#22
oxrufiioxo
AusWolf: Every PSU has at least an old-school 6-pin, which is more than enough. A 16-pin requires a messy adaptor or a new PSU, and it's therefore pointless and unwanted on a low-power card.
You can easily get a 2x8-pin to 16-pin adapter from almost every major PSU maker. None of my power supplies have a native 12VHPWR socket, yet all of them, down to a 750 W Seasonic GX, work fine with my 4090, and all use a 2x8 to 1x16 adapter.
Posted on Reply
#23
ARF
ratirt: There is one standard, the 16-pin connector.
Since when? Have you asked AMD? And exactly how many Radeons and Intel Arcs use this new low-quality power connector?
londiste: It is simply about consolidation of standards.
Wrong.
londiste: A 6-pin is more than enough for a low-power card, but it is not enough for midrange, where you would need an 8-pin. And the higher end needs 2-3 of those...
6-pin and 8-pin will remain the one standard, while stupid Nvidia pushes the low-quality, cost-saving, melting and hazardous "16-pin", which can never safely transfer more than 200 watts.
Posted on Reply
#24
oxrufiioxo
wolf: How do they perform and what will they cost? These power figures don't faze me at all, nor does the choice of power connector - I find that to be an enormous nothing burger.
The process node isn't a huge leap, so the majority of any gains will have to come from larger die size, higher IPC, and clock/RAM speed.

My guess is the 5090 will be 50 to 60% faster than the 4090 but even more expensive, with RT gains being a bit higher. The 5080 will match the 4090 or slightly exceed it by 10% ish for 1200 USD. Does anything lower really matter? Not to me, lol.

Oh, and maybe a 700-800 USD, 4080-matching 5070 with 12 GB of VRAM, lmao, 'cause Nvidia gonna be Nvidia, with their fanbois defending it, drunk on that green Kool-Aid.
Posted on Reply
#25
AusWolf
oxrufiioxo: You can easily get a 2x8-pin to 16-pin adapter from almost every major PSU maker. None of my power supplies have a native 12VHPWR or ATX 3.0 socket, yet all of them, down to a 750 W Seasonic GX, work fine with my 4090, and all use a 2x8 to 1x16 adapter.
But why should I use a bulky 2x8-pin adaptor when a single 6-pin cable would do the job just fine?
Posted on Reply