Monday, July 15th 2024

NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

In the run-up to NVIDIA's upcoming GeForce RTX 50 Series of GPUs, codenamed "Blackwell," a power supply manufacturer has seemingly leaked the power configurations of all SKUs. Seasonic operates an online power supply wattage calculator that lets users configure their systems and receive PSU recommendations, so its database is routinely populated with CPU and GPU SKUs to cover the massive variety of components. This time it includes the upcoming GeForce RTX 50 series, from the RTX 5050 all the way up to the flagship RTX 5090. Starting at the bottom, the GeForce RTX 5050 is expected to carry a 100 W TDP. Its bigger brother, the RTX 5060, bumps the TDP to 170 W, 55 W higher than the previous-generation "Ada Lovelace" RTX 4060.

The GeForce RTX 5070, with a 220 W TDP, sits in the middle of the stack, a 20 W increase over its Ada counterpart. For the higher-end SKUs, NVIDIA has prepared the GeForce RTX 5080 and RTX 5090 with 350 W and 500 W TDPs, respectively, also up from the Ada generation: 30 W more for the RTX 5080 and 50 W more for the RTX 5090. Interestingly, NVIDIA this time wants to unify the power connectors of the entire family on the 16-pin 12V-2x6 connector, per the updated PCIe 6.0 CEM specification. The across-the-board increase in power requirements for the "Blackwell" generation is notable, and we are eager to see whether the performance gains are enough to offset the higher power draw.
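To put the deltas in one place, here is a quick comparison against NVIDIA's official Ada Lovelace board-power figures (115 W for the RTX 4060, 200 W for the RTX 4070, 320 W for the RTX 4080, and 450 W for the RTX 4090), sketched in Python:

```python
# Leaked "Blackwell" TDPs vs. NVIDIA's official Ada Lovelace board power (watts).
blackwell = {"RTX 5060": 170, "RTX 5070": 220, "RTX 5080": 350, "RTX 5090": 500}
ada_equiv = {"RTX 5060": 115, "RTX 5070": 200, "RTX 5080": 320, "RTX 5090": 450}

for sku, tdp in blackwell.items():
    delta = tdp - ada_equiv[sku]
    print(f"{sku}: {tdp} W ({delta:+d} W, {delta / ada_equiv[sku]:+.0%} vs. Ada)")
```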
Sources: @Orlak29_ on X, via VideoCardz

125 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector

#51
AusWolf
oxrufiioxo: Both cards are going to be too slow before the VRAM difference matters. The 4070 is already struggling in some games, and it has nothing to do with VRAM... 8 GB at $400 or above is stupid, but the 7800 XT is honestly too weak for me to care that it has 16 GB. The 4070 Ti 12 GB, on the other hand, was probably the GPU I liked the least, but others have it and are fine, so it is what it is.
Texture quality has a huge impact on visual fidelity, but a negligible impact on performance, as long as you have enough VRAM. Even if I have to run X future game on low graphics, at least I can ramp up texture quality to high with a card with more VRAM.

HW Unboxed did a test with a 4 GB and 8 GB 6500 XT recently. The 4 GB card did 45 FPS in Hogwarts Legacy at 1080p low with texture pop-ins, while the 8 GB one did 80 FPS with high textures and no pop-in. There are also games that lower your image quality to maintain good performance when they run out of VRAM.
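The underlying point is that texture quality mostly costs memory rather than shader time; a rough back-of-the-envelope sketch of the VRAM side (the ~33% mipmap overhead and BC7's 1 byte per texel are standard figures, and the 4096x4096 size is just an example):

```python
def texture_mib(width, height, bytes_per_texel=4.0, mip_overhead=4/3):
    """Approximate VRAM footprint of one mipmapped 2D texture, in MiB."""
    return width * height * bytes_per_texel * mip_overhead / 2**20

# One 4096x4096 texture: uncompressed RGBA8 vs. BC7 (1 byte per texel).
print(round(texture_mib(4096, 4096)))       # ~85 MiB uncompressed
print(round(texture_mib(4096, 4096, 1.0)))  # ~21 MiB block-compressed
```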
oxrufiioxo: Because I enjoy having the FPS of a 4090 at 1440p, which a 7900 XT would need FSR to even approach. Some games already only hit 60 FPS on a 4090 at 1440p UW as it is...
FSR isn't the only option. Lowering some image quality settings can also give you the performance you need without turning the picture into a blurry mess.
#52
oxrufiioxo
AusWolf: FSR isn't the only option. Lowering some image quality settings can also give you the performance you need without turning the picture into a blurry mess.
Honestly, dropping below high is worse than using DLSS, and TAA is worse than DLSS to my eyes in most games, and turning TAA off is even worse than that. I hate jaggies and flickering more than anything in games, and nowadays, short of something like 8K downsampled to a 4K display, stuff looks bad to me without either DLSS or TAA.

I prefer DLAA, but not even a 4090 can do that in every game at 1440p... I use DLDSR at 2.25x on older games; it looks fantastic. I couldn't go back to playing games any other way...
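For anyone unfamiliar, DLDSR's 2.25x factor is a pixel-count multiplier, i.e. 1.5x per axis, so a 2560x1440 panel renders internally at 4K before being scaled back down; a quick sanity check:

```python
def dldsr_render_res(w, h, factor=2.25):
    """DLDSR factors are pixel-count multipliers: 2.25x pixels = 1.5x per axis."""
    scale = factor ** 0.5
    return round(w * scale), round(h * scale)

print(dldsr_render_res(2560, 1440))  # (3840, 2160): a 4K render on a 1440p panel
```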

DLSS 3.7 is pretty damn good in most games, though, and it takes all of a minute to upgrade the DLL...
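The manual DLL swap really is about that quick; a minimal sketch of the process, assuming you have downloaded a newer nvngx_dlss.dll (the file name DLSS games ship with), where both paths below are placeholders for your own:

```python
import shutil
from pathlib import Path

# Placeholder paths: point these at your own download and game folder.
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # e.g. the DLSS 3.7 DLL
game_dir = Path(r"C:\Games\SomeGame")            # the game's install folder

old_dll = next(game_dir.rglob("nvngx_dlss.dll"))  # locate the shipped DLL
shutil.copy2(old_dll, old_dll.with_name(old_dll.name + ".bak"))  # back it up
shutil.copy2(new_dll, old_dll)                    # drop in the newer version
```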
#53
Bwaze
oxrufiioxo: The process node isn't a huge leap, so the majority of any gains will have to come from larger die size, higher IPC, and clock/memory speed.

My guess is the 5090 will be 50 to 60% faster than the 4090, but even more expensive, with the RT gains being a bit higher. The 5080 matching the 4090, or slightly exceeding it by 10% ish, for $1,200... Does anything lower really matter? Not to me, lol.

Oh, and maybe a $700-800 RTX 5070 matching the 4080 with 12 GB of VRAM, lmao, cuz NVIDIA gonna be NVIDIA, with their fanbois defending it, drunk on that green Kool-Aid.
Right now the difference between the RTX 4080 and 4090 is (in the latest reviews, average FPS):

1080p: 16.2%
1440p: 23.4%
4K: 29.8%

Even in 4K ray tracing the difference isn't magically larger. So an RTX 5080 with performance merely matching the RTX 4090 would be the smallest generational increase since the RTX 2080, and one of the smallest ever.
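For perspective, a 5080 that merely matched the 4090 would have a gen-on-gen uplift equal to exactly those gaps, and even the "slightly exceeding it by 10%" scenario stays modest; a quick check on the numbers above:

```python
# TPU review averages quoted above: the 4090's lead over the 4080.
lead_4090 = {"1080p": 0.162, "1440p": 0.234, "4K": 0.298}

for res, gap in lead_4090.items():
    exceed_10 = (1 + gap) * 1.10 - 1  # a 5080 beating the 4090 by 10%
    print(f"{res}: matching the 4090 = {gap:+.1%} over the 4080; "
          f"+10% past it = {exceed_10:+.1%}")
```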

And I bet even without the AI craze we wouldn't see the new generation at old MSRPs - this was said by the Leather Jacket himself well before AI:

"A 12-inch wafer is a lot more expensive today than it was yesterday, and it's not a little bit more expensive, it is a ton more expensive," "Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past."

With all the AI push, we could be facing pricing similar to the peak of the crypto madness...
#54
oxrufiioxo
Bwaze: Even in 4K ray tracing the difference isn't magically larger. So an RTX 5080 with performance merely matching the RTX 4090 would be the smallest generational increase since the RTX 2080, and one of the smallest ever.
Don't get me wrong, I really hope I am wrong and it's 25-30% faster. I just doubt it.
#55
AusWolf
oxrufiioxo: Honestly, dropping below high is worse than using DLSS, and TAA is worse than DLSS to my eyes in most games, and turning TAA off is even worse than that. I hate jaggies and flickering more than anything in games, and nowadays, short of something like 8K downsampled to a 4K display, stuff looks bad to me without either DLSS or TAA.

I prefer DLAA, but not even a 4090 can do that in every game at 1440p... I use DLDSR at 2.25x on older games; it looks fantastic. I couldn't go back to playing games any other way...
Let's agree to disagree here. I'd rather go down to medium graphics with high textures than turn my picture into a blurry mess with any form of upscaling. DLSS and FSR are great as a last resort when there's no other option. But only then.

There's barely any noticeable difference among different graphical settings in most modern games anyway. The only thing that differs greatly is the performance.
#56
Assimilator
AusWolf: HW Unboxed did a test with a 4 GB and 8 GB 6500 XT recently. The 4 GB card did 45 FPS in Hogwarts Legacy at 1080p low with texture pop-ins, while the 8 GB one did 80 FPS with high textures and no pop-in. There are also games that lower your image quality to maintain good performance when they run out of VRAM.
Okay, and? NVIDIA isn't launching a completely gimped 4 GB card with a 64-bit memory bus and a PCIe x4 link.
#57
ARF
oxrufiioxo: It was a bad product at $900; it's fine now at $650.
Because an RTX 4090 for 2 grand is a better buy? :fear:
AusWolf: That's the textbook definition of the word "ripoff".
Scalping is still alive and well, a legacy of the mining craze and the COVID horror times.
#58
oxrufiioxo
AusWolf: Let's agree to disagree here. I'd rather go down to medium graphics with high textures than turn my picture into a blurry mess with any form of upscaling. DLSS and FSR are great as a last resort when there's no other option. But only then.

There's barely any noticeable difference among different graphical settings in most modern games anyway. The only thing that differs greatly is the performance.
Visuals are always going to be subjective, and even though I disagree with some of the stuff you are saying, I fully believe that to your eyes what you are saying is true... We can both be right in our own use cases; what is visually appealing to someone isn't always visually appealing to someone else.
#59
AusWolf
Assimilator: Okay, and? NVIDIA isn't launching a completely gimped 4 GB card with a 64-bit memory bus and a PCIe x4 link.
The point is that future games might benefit from the extra VRAM when running low quality settings with high textures.
oxrufiioxo: Visuals are always going to be subjective, and even though I disagree with some of the stuff you are saying, I fully believe that to your eyes what you are saying is true... We can both be right in our own use cases; what is visually appealing to someone isn't always visually appealing to someone else.
I find higher quality settings visually appealing like any one of us, but a native-res, anti-aliased image at low settings looks better than an upscaled one at high settings 99.9% of the time, imo. I do agree with the notion of subjectivity, though. :)
#60
R0H1T
Assimilator: NVIDIA isn't launching a completely gimped 4 GB card with a 64-bit memory bus and a PCIe x4 link.
That's because they wouldn't want to "dilute" the brand. For all intents & purposes, they are the (cr)Apple of the PC industry!
#61
oxrufiioxo
ARF: Because an RTX 4090 for 2 grand is a better buy? :fear:
It's a better buy if your hobby is worth 2 grand to you, which mine is... not sure for how much longer, though; my 18-month-old is pretty damn expensive, lol... But if you cannot afford it, or choose not to, good for you; what someone else buys means jack crap to me... Every GPU can be good for someone... except the 4060 Ti 8 GB. No way that thing is good for anyone. Eh, maybe esports gamers, but I don't care what anyone says: a $400 GPU with almost no generational improvement is something I will never get behind, regardless of how good or bad the competition is at or around that price...
AusWolf: I find higher quality settings visually appealing like any one of us, but a native-res, anti-aliased image at low settings looks better than an upscaled one at high settings 99.9% of the time, imo. I do agree with the notion of subjectivity, though. :)
I get that, but to me a native 4K display running a game with DLSS Quality almost always looks better than a native 1440p display running TAA, and they are about the same FPS, so native is almost never the best option. I would run every game supersampled if I could, but unfortunately I cannot, lol...

Again, DLSS 3.7 is pretty damn good, especially at 4K, and its Quality mode performs better than my 1440p ultrawide at native in the majority of games I have tested... I'm really getting used to ultrawide, though, or else 4K is what I would be running... Standard 1440p honestly looks pretty meh to me these days... Maybe I gamed at 4K for too long, idk.

A lot of people have pretty crappy monitors, though, so they obviously don't care much about picture quality. A good screen can make a bigger difference than a 4090 can; I would take an OLED and a 4070 over a 4090 and an IPS any day.
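For what it's worth, the FPS parity between 4K DLSS Quality and native 1440p follows directly from the render-resolution math: Quality mode renders at roughly 2/3 scale per axis, which at a 4K output works out to about 2560x1440 internally:

```python
def dlss_internal_res(w, h, scale=2/3):
    """DLSS Quality renders at ~2/3 scale per axis before upscaling."""
    return round(w * scale), round(h * scale)

print(dlss_internal_res(3840, 2160))  # (2560, 1440): same pixel load as native 1440p
```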
#62
Bwaze
AusWolf: Let's agree to disagree here. I'd rather go down to medium graphics with high textures than turn my picture into a blurry mess with any form of upscaling. DLSS and FSR are great as a last resort when there's no other option. But only then.

There's barely any noticeable difference among different graphical settings in most modern games anyway. The only thing that differs greatly is the performance.
For me this is entirely game-dependent. Is the game originally designed for consoles? Then it's usually designed with couch viewing distances in mind: you focus on the bigger stuff on the screen, and blurriness and inconsistencies from anti-aliasing / upscaling are mostly irrelevant.

Are you playing something designed for PC, like racing or flight sims, where you do focus on small details in the distance? That can show all the problems of these upscaling technologies and drive you crazy...
#63
Launcestonian
A 500 W TDP for the RTX 5090! Yikes... and I thought my RX 7900 XTX with its 350 W TDP was overkill. When gaming at 3440x1440, ultra settings in Starfield, it does spike to 510 W, but averages 265 W during a session.
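Spikes like that are why PSU sizing keys off transients rather than averages. A rough sketch using those numbers, where the CPU and rest-of-system draws are assumptions, and applying the 7900 XTX's spike ratio to the rumored 500 W card is purely a what-if:

```python
def psu_floor_w(gpu_tdp, spike_ratio=510/350, cpu_w=300, rest_w=100):
    """Rough PSU floor: GPU transient (spike ratio taken from the 7900 XTX's
    510 W spike on a 350 W TDP) plus assumed CPU and rest-of-system draw."""
    return gpu_tdp * spike_ratio + cpu_w + rest_w

print(round(psu_floor_w(350)))  # ~910 W around a 7900 XTX
print(round(psu_floor_w(500)))  # ~1129 W if a 500 W "Blackwell" spiked similarly
```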
#64
AusWolf
oxrufiioxo: I get that, but to me a native 4K display running a game with DLSS Quality almost always looks better than a native 1440p display running TAA, and they are about the same FPS, so native is almost never the best option. I would run every game supersampled if I could, but unfortunately I cannot, lol...

Again, DLSS 3.7 is pretty damn good, especially at 4K, and its Quality mode performs better than my 1440p ultrawide at native in the majority of games I have tested... I'm really getting used to ultrawide, though, or else 4K is what I would be running... Standard 1440p honestly looks pretty meh to me these days... Maybe I gamed at 4K for too long, idk.

A lot of people have pretty crappy monitors, though, so they obviously don't care much about picture quality. A good screen can make a bigger difference than a 4090 can; I would take an OLED and a 4070 over a 4090 and an IPS any day.
Ah, 4K is a different kettle of fish altogether! I can watch native 4K, upscaled, or even 1080p content on my 4K TV, and I will never spot any difference. There, DLSS and FSR are great, as I can play on it even with a low-end GPU.

When playing on my 1440p ultrawide monitor, though, I'll always choose lowering a few settings before enabling FSR.
#65
ARF
Launcestonian: A 500 W TDP for the RTX 5090! Yikes... and I thought my RX 7900 XTX with its 350 W TDP was overkill. When gaming at 3440x1440, ultra settings in Starfield, it does spike to 510 W, but averages 265 W during a session.
Only the hardcore NVIDIA fans will bear this utter engineering junk. It will have 800-watt spikes and will require 1200-watt PSUs, especially when coupled with a 300-watt Intel oven. :kookoo:
#66
oxrufiioxo
AusWolf: When playing on my 1440p ultrawide monitor, though, I'll always choose lowering a few settings before enabling FSR.
I haven't played a single game where FSR was OK at 1440p. I honestly thought it was broken on NVIDIA GPUs; I was like, man, it can't be this bad... So I picked up a 6700 XT just to test it out locally on an all-AMD system, but I couldn't tell it apart from either of my NVIDIA systems.

The disocclusion artifacts and flickering on fine detail are too obvious to me in the majority of what I play to use it.
#67
AusWolf
oxrufiioxo: I haven't played a single game where FSR was OK at 1440p. I honestly thought it was broken on NVIDIA GPUs; I was like, man, it can't be this bad... So I picked up a 6700 XT just to test it out locally on an all-AMD system, but I couldn't tell it apart from either of my NVIDIA systems.
Exactly. It's a blurry mess. On the other hand, if I lower let's say, shadow quality to high or medium, I probably won't notice the difference, especially in a live game where I have better things to do than to pixel-peep, but the performance increase is usually just as tremendous.
#68
oxrufiioxo
AusWolf: Exactly. It's a blurry mess. On the other hand, if I lower let's say, shadow quality to high or medium, I probably won't notice the difference, especially in a live game where I have better things to do than to pixel-peep, but the performance increase is usually just as tremendous.
Honestly, my favorite way to use DLSS is in combination with DLDSR, but usually by the end of a GPU generation that isn't possible in newer games. In Alan Wake 2, Hellblade 2, and The First Descendant that ain't happening; luckily DLSS is decent in all three.
#69
Chomiq
Just what we needed, more power.
#70
ARF
Chomiq: Just what we needed, more power.
The best option would be to cancel it, then forward-port and shrink it to TSMC 2 nm when the node is ready for it, while keeping the power at or below 275 W.
#71
nguyen
Well, I hope the 5090 is about 60% faster than the 4090; the 4090 feels kinda slow these days, especially with UE5 games.

Let's see if new features like DLSS 4 pop up as well.
#72
oxrufiioxo
nguyen: Well, I hope the 5090 is about 60% faster than the 4090; the 4090 feels kinda slow these days, especially with UE5 games.

Let's see if new features like DLSS 4 pop up as well.
It's pretty good at 1440p UW, though, but you also need a really fast CPU in most newer games these days... UE5 can still be pretty CPU-intensive even below 100 FPS...
#73
stimpy88
So 5080s will join the burning 4090 club! Cool! :D
nguyen: Well, I hope the 5090 is about 60% faster than the 4090; the 4090 feels kinda slow these days, especially with UE5 games.

Let's see if new features like DLSS 4 pop up as well.
The 4090 is what the 4080 should have been. UE5 games are just too much for the 40x0 series at 4K without DLSS cheating.

But I'm down for upgrading my ancient 2070, as I decided that the 4080 was simply too slow and just terrible value. I just hope nGreedia doesn't crank up the prices again, but a 5080 is in my sights. I bet they do, though; at least another $100 per SKU, and probably another $300 for the 5090.

But with the DLSS DLLs getting to version 3.7+, I'd say you are right about the idea of DLSS 4.0 in half a year's time, but whether it requires a real or imaginary feature of the 50x0 series will be interesting, as we all know the lies nGreedia has made up to lock features to certain hardware in the past.
#74
fevgatos
AusWolf: Let's agree to disagree here. I'd rather go down to medium graphics with high textures than turn my picture into a blurry mess with any form of upscaling. DLSS and FSR are great as a last resort when there's no other option. But only then.

There's barely any noticeable difference among different graphical settings in most modern games anyway. The only thing that differs greatly is the performance.
Try Horizon Zero Dawn. Yeah, the old one. Medium just destroys the graphics; it removes all shadows, etc. I used FSR with high settings instead of native at medium on my laptop, and it looked way better.
#75
nguyen
stimpy88: So 5080s will join the burning 4090 club! Cool! :D

The 4090 is what the 4080 should have been. UE5 games are just too much for the 40x0 series at 4K without DLSS cheating.

But I'm down for upgrading my ancient 2070, as I decided that the 4080 was simply too slow and just terrible value. I just hope nGreedia doesn't crank up the prices again, but a 5080 is in my sights. I bet they do, though; at least another $100 per SKU, and probably another $300 for the 5090.

But with the DLSS DLLs getting to version 3.7+, I'd say you are right about the idea of DLSS 4.0 in half a year's time, but whether it requires a real or imaginary feature of the 50x0 series will be interesting, as we all know the lies nGreedia has made up to lock features to certain hardware in the past.
I can assure you that the 5090 will be just barely fast enough once devs push for higher fidelity in next-gen games. It has been a constant cycle of hardware and software one-upping each other since forever. If you like high FPS or better efficiency, better turn on that DLSS cheat ;)