Monday, July 15th 2024

NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

In the run-up to NVIDIA's upcoming GeForce RTX 50 Series of GPUs, codenamed "Blackwell," a power supply manufacturer appears to have leaked the power configurations of all SKUs. Seasonic operates an online power supply wattage calculator that lets users configure their systems and get PSU recommendations, which means its database is regularly populated with CPU and GPU SKUs, including unreleased ones, to cover the massive variety of components. This time it lists the upcoming GeForce RTX 50 series, from the RTX 5050 all the way up to the top RTX 5090 GPU. Starting with the GeForce RTX 5050, this SKU is expected to carry a 100 W TDP. Its bigger brother, the RTX 5060, bumps the TDP to 170 W, 55 W higher than the previous-generation "Ada Lovelace" RTX 4060.

The GeForce RTX 5070 sits in the middle of the stack with a 220 W TDP, a 20 W increase over its Ada counterpart. For the higher-end SKUs, NVIDIA has prepared the GeForce RTX 5080 and RTX 5090 with 350 W and 500 W TDPs, respectively. This also represents a jump over the Ada generation, with an increase of 30 W for the RTX 5080 and 50 W for the RTX 5090. Interestingly, this time NVIDIA wants to unify power delivery across the entire family with the 16-pin 12V-2x6 connector, per the updated PCIe 6.0 CEM specification. The across-the-board increase in power requirements for the "Blackwell" generation is notable, and we are eager to see whether the performance gains are large enough to keep efficiency from regressing.
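For a quick overview, the leaked figures and the previous-generation TDPs implied by the deltas above can be tabulated with a short sketch (the numbers are unconfirmed leak data, not official specifications):

# Leaked "Blackwell" TDPs vs. the "Ada Lovelace" TDPs implied by the deltas above (in watts).
# These figures come from the leak described in this article and are not official specs.
blackwell = {"RTX 5050": 100, "RTX 5060": 170, "RTX 5070": 220, "RTX 5080": 350, "RTX 5090": 500}
ada_counterpart = {"RTX 5060": 115, "RTX 5070": 200, "RTX 5080": 320, "RTX 5090": 450}

for sku, tdp in blackwell.items():
    prev = ada_counterpart.get(sku)
    note = f"+{tdp - prev} W vs. Ada" if prev is not None else "no direct Ada counterpart listed"
    print(f"{sku}: {tdp} W ({note})")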
Sources: @Orlak29_ on X, via VideoCardz

168 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

#151
Neo_Morpheus
fevgatos: Have no idea why people care about stock power draw
Personally I don't care much about it, but as I mentioned in another thread, people don't care about it until they can use it against AMD.
fevgatos: No, not because of power consumption, because of efficiency. These two are different. Put a 7900 XTX and a 4080 both at 250 W and see which one is faster. That's efficiency. Power draw is irrelevant; a card can draw 2 kilowatts and it's fine, if you can limit it to 300 W and have it still be fast, no issue.
There is a point there, but conveniently, you left out why there's a difference in this particular case: it's down to the foundry node used.
londiste: Slightly more than that:
Hence why I said "Properly".
oxrufiioxo: I think what he meant is there are probably 5 games where RT actually meaningfully improves the visuals....
Correct, thank you.
fevgatos: Having 5 games with meaningful RT, that's a LOT.
Perhaps that will grow, but for now, my point stands, thanks.
oxrufiioxo: I do expect more and more games over time to get amazing RT
Same here, but so far 99% of the current ones rely on puddles or mirrors, and I personally don't find that it adds much to the gameplay, certainly not enough to justify the insane performance hit.
oxrufiioxo: anyone who thinks otherwise is delusional.
I am not entirely convinced that we will ever get to that point, but I could be wrong. We are now 5 years or so into the RT hype, and the results are few and still "dependent" on GPUs that go for over US$1,500. By the way, those first couple of generations of RTX GPUs already can't do much in the RT department.
oxrufiioxo: ton of games that the performance hit doesn't justify the visual upgrade.
That's my main problem with this. I love The Ascent, and enabling RT drops the fps from 250+ to 50 or so on my XTX, and the only place I can see it is on puddles, which honestly don't add anything to the gameplay.
I have other games in my backlog that might have better implementations, but as I said, they won't add much, if anything, to the gameplay.
AusWolf: IMO, what we need is better surfaces on humanoids and other living things, especially in the rain, and better animations. Map detail and lighting are awesome, but humans still look and act like porcelain dolls.
Agreed, and I think good gameplay is worth adding to that list, but that is not the GPU's problem. :)
Posted on Reply
#152
JustBenching
Neo_Morpheus: Personally I don't care much about it, but as I mentioned in another thread, people don't care about it until they can use it against AMD.
I've explained it to you twice now, but you still don't get it. AMD's problem isn't power draw, it's efficiency. You can limit the power draw on an AMD card just like you can on an NVIDIA card, but it will still draw more power for the same performance.
Neo_Morpheus: There is a point there, but conveniently, you left out why there's a difference in this particular case: it's down to the foundry node used.
Who cares about the why? The end result is what matters.
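To put rough numbers on the distinction (the fps figures below are invented purely for illustration, not benchmark results): cap two cards at the same power limit and compare the performance each delivers per watt.

# Efficiency vs. power draw, with hypothetical numbers (not measured data).
# Efficiency here means performance delivered per watt at a matched power limit.
def efficiency(avg_fps: float, power_limit_w: float) -> float:
    return avg_fps / power_limit_w

POWER_CAP_W = 250  # both cards limited to the same 250 W for the comparison
cards = {"Card A": 92.0, "Card B": 81.0}  # average fps at that cap (made-up values)

for name, fps in cards.items():
    print(f"{name}: {fps} fps at {POWER_CAP_W} W -> {efficiency(fps, POWER_CAP_W):.2f} fps/W")

The card with the higher fps/W at the same cap is the more efficient one, regardless of what its stock power limit happens to be.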
Posted on Reply
#153
Visible Noise
ARF: Maybe you both are trolls, no?

Which product is slow?

There must be a global professional investigation against nvidia for cheating, using dlss as the default setting, making slow GPUs appear good on the charts.

Let's compare the slow RX 7600 with the even slower RTX 4060, which is a junk, leftover byproduct, rebadge of something GT **30 class, or *50 LE class.

RTX 4060 vs RX 7600:

Theoretical specs are meaningless. Modern raytraced games run like dogshit on AMD and you know it.
oxrufiioxo: Only uses software Lumen on PC/console, I believe. There is no way the reflections on water would look as bad as they do if it was hardware Lumen. Otherwise the game looks fantastic, and other than resolution it even looks great on console.

Honestly it looks much better than a lot of games with RT...
Ah, thanks for the correction!
Vayra86: D4?! Lmao. Hellblade doesn't need it either... or Ghostwire.

I see we have journeyed into green vs. red and RT fantasy land... I guess that's all this thread's gonna give now.
I’ll just assume you haven’t played them with RT. Because you know developers have all this spare time to go back to older releases to add features that provide no benefit, right?
Posted on Reply
#154
Neo_Morpheus
fevgatos: I've explained it to you twice now but you still don't get it
Same for you. I am clearly talking about the usage. You're playing word games at this point.
fevgatos: AMD's problem isn't power draw, it's efficiency. You can limit the power draw on an AMD card just like you can on an NVIDIA card, but it will still draw more power for the same performance.
See? That's what I'm talking about: it doesn't matter until it does.
fevgatos: Who cares about the why? The end result is what matters.
See above.
Looking at my XTX during gaming, I see it reporting around 350 W without undervolting.
Per reviews, the 4080 is at around 305 W for less performance. So I don't see an issue there, nor do I care enough to hound someone on a forum about it.
But as I said, when convenient, power consumption/efficiency is indeed an issue. :peace:
Posted on Reply
#155
AusWolf
fevgatos: Personal preference. If you wanna buy a card and keep it long, AMD is better, because 2-3 years down the line RT won't be playable on your old card even if it's an NVIDIA one. If you upgrade every gen, NVIDIA is clearly the better option though.
Only if you value RT in current games enough to pay the extra.

Upgrading with every generation is becoming increasingly foolish anyway, but that's another matter.
Posted on Reply
#156
JustBenching
Neo_Morpheus: Same for you. I am clearly talking about the usage. You're playing word games at this point.

See? That's what I'm talking about: it doesn't matter until it does.

See above.
Looking at my XTX during gaming, I see it reporting around 350 W without undervolting.
Per reviews, the 4080 is at around 305 W for less performance. So I don't see an issue there, nor do I care enough to hound someone on a forum about it.
But as I said, when convenient, power consumption/efficiency is indeed an issue. :peace:
You can't compare power draw from reviews with your PC. If you had a 4080 in your system, maybe it would draw 250 W with your use case. Also, it's not really slower, but that's irrelevant.
Posted on Reply
#157
stimpy88
Chomiq: Luckily RT performance is still not needed as much as pure raster performance.
Give it another year or two.
Posted on Reply
#158
TheDeeGee
oxrufiioxo: I think what he meant is there are probably 5 games where RT actually meaningfully improves the visuals....

For me it's

Witcher 3 NG
Cyberpunk 2077
Alan Wake 2
Ratchet and Clank
Control
Metro Exodus EE
Spiderman
Quake 2 RTX
Portal RTX
Minecraft (haven't played this but it does look way better with RT)

Most games do it pretty terribly though. RE 4 Remake is absolutely trash when it comes to RT, as are those F1 games and countless others. I'd say probably 1 in 10 games that has RT does it well, and that might be generous....
Serious Sam 1st & 2nd Encounter
DOOM 1 & 2
Quake 1
Half-Life 1
Posted on Reply
#159
TheinsanegamerN
LabRat 891: 500 W + transient peaks, over the 12V-2x6? :roll:

Looks like Intel is setting precedent:
It's okay to engineer products that will fail inside warranty.
Meh. The 3090 Ti peaked at 625 watts, and those didn't fail constantly.

Just pulling power isn't an issue.
fevgatos: The important question is, for how many games do you upgrade your GPU? 'Cause 5 is a lot. I mean, I've played 10-15 games with my 4090, but half of them worked fine with my old card; I upgraded my GPU just for a couple of them. Having 5 games with meaningful RT, that's a LOT.
I usually wait to upgrade until there is a game I can't play. Last time it was Halo Infinite multiplayer.
Posted on Reply
#160
Vayra86
Visible Noise: Theoretical specs are meaningless. Modern raytraced games run like dogshit on AMD and you know it.


Ah, thanks for the correction!


I’ll just assume you haven’t played them with RT. Because you know developers have all this spare time to go back to older releases to add features that provide no benefit, right?
I did. It's unimpressive.
Posted on Reply
#161
AusWolf
Visible Noise: I'll just assume you haven't played them with RT. Because you know developers have all this spare time to go back to older releases to add features that provide no benefit, right?
Do you think everything companies do these days has meaning beyond completing a pointless tick-box exercise? You'd be surprised.
Posted on Reply
#162
THU31
If these are correct, it could mean the 5060 will be based on the GB205 like the 5070, and both cards will have 12 GB of VRAM.

That would be great news for the 5060, but not for the 5070. 12 GB on a card that will probably cost $600+ is not acceptable in 2025.
Posted on Reply
#163
oxrufiioxo
THU31: If these are correct, it could mean the 5060 will be based on the GB205 like the 5070, and both cards will have 12 GB of VRAM.

That would be great news for the 5060, but not for the 5070. 12 GB on a card that will probably cost $600+ is not acceptable in 2025.
GDDR7 does apparently come in weird configurations compared to GDDR6X, so who knows what SKUs we will get. I won't be surprised if the 5060 gets cut down to 10 GB so NVIDIA can save a couple of bucks on the BOM.
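As a rough illustration of why chip density matters (the bus widths here are assumptions for the example, not leaked specs): each 32-bit memory channel takes one chip, so capacity is simply channel count times per-chip density, and GDDR7's planned 3 GB (24 Gbit) modules open up capacities that 2 GB GDDR6/GDDR6X chips can't hit.

# VRAM capacity = number of 32-bit channels * per-chip density.
# Bus widths and chip densities below are illustrative assumptions, not leaked RTX 50 specs.
def vram_gb(bus_width_bits: int, chip_gb: int) -> int:
    return (bus_width_bits // 32) * chip_gb

for bus in (128, 192, 256):
    for chip in (2, 3):  # 2 GB (16 Gbit) vs. 3 GB (24 Gbit) modules
        print(f"{bus}-bit bus with {chip} GB chips -> {vram_gb(bus, chip)} GB")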
Posted on Reply
#164
CyberCT
I'm very happy with my 3080ti and 4090 ... installed in 2 different PCs obviously. I won't be upgrading for a long time.
Posted on Reply
#165
Godrilla
Still waiting on the Seasonic gen 2 12V-2x6 cable.
Posted on Reply
#166
Ruru
S.T.A.R.S.
ARF: Since when? Have you asked AMD? And how many Radeons and Intel Arcs exactly use this new low-quality power connector?
Hasn't ASRock been the only AIB from camp AMD to use this stupid connector so far?
Posted on Reply
#167
LabRat 891
Ruru: Hasn't ASRock been the only AIB from camp AMD to use this stupid connector so far?
AFAIK, yes.

Considering that NVIDIA wants 500+ W pulled through it, I'd imagine the 'blower style' 7900 (power-limited within the constraints of its cooling) is *not* going to be a problem.

Regardless, I'm not happy about the connector 'gaining any traction' in the market.
Posted on Reply
#168
Hankieroseman
Goto mighty, no, count da money. I got mine! $$$$. Me waiting for a new NV GPU that can match the 57" Neo G9. :pimp: It's good to be the King!
Posted on Reply