Monday, July 15th 2024
NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector
In the run-up to NVIDIA's upcoming GeForce RTX 50 Series of GPUs, codenamed "Blackwell," one power supply manufacturer accidentally leaked the power configurations of all SKUs. Seasonic operates an online power supply wattage calculator that lets users configure their systems and receive PSU recommendations, which means its component database is regularly populated with CPU and GPU SKUs to accommodate the massive variety of hardware. This time it listed the upcoming GeForce RTX 50 series, from the RTX 5050 all the way up to the top RTX 5090 GPU. Starting with the GeForce RTX 5050, this SKU is expected to carry a 100 W TDP. Its bigger brother, the RTX 5060, bumps the TDP to 170 W, 55 W higher than the previous-generation "Ada Lovelace" RTX 4060.
The GeForce RTX 5070, with a 220 W TDP, sits in the middle of the stack, featuring a 20 W increase over its Ada counterpart. For the higher-end SKUs, NVIDIA has prepared the GeForce RTX 5080 and RTX 5090, with 350 W and 500 W TDPs, respectively. This also represents a jump over the Ada generation, with an increase of 30 W for the RTX 5080 and 50 W for the RTX 5090. Interestingly, this time NVIDIA wants to unify the power connection system across the entire family with a 16-pin 12V-2x6 connector, but one following the updated PCIe 6.0 CEM specification. The increase in power requirements across the "Blackwell" SKUs is notable, and we are eager to see whether the performance gains are large enough to keep efficiency moving in the right direction.
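To put the quoted generation-over-generation deltas in one place, here is a minimal Python sketch that cross-checks the leaked Blackwell figures against the official Ada Lovelace desktop TDPs of 115 W, 200 W, 320 W and 450 W; the RTX 5050 is left out because Ada has no direct desktop xx50 counterpart to compare against.

# Leaked "Blackwell" TDPs vs. official "Ada Lovelace" desktop TDPs, in watts.
ada_tdp = {"60": 115, "70": 200, "80": 320, "90": 450}
blackwell_tdp = {"60": 170, "70": 220, "80": 350, "90": 500}

for tier, new_watts in blackwell_tdp.items():
    delta = new_watts - ada_tdp[tier]
    print(f"RTX 50{tier}: {new_watts} W ({delta:+d} W vs. RTX 40{tier})")
# Prints +55 W, +20 W, +30 W and +50 W, matching the increases quoted above.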
Sources:
@Orlak29_ on X, via VideoCardz
168 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector
Have no idea why people care about stock power draw, just change it in 5 seconds. Seriously, who really cares?
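For anyone who hasn't done it before, "change it in 5 seconds" usually means lowering the software power limit with NVIDIA's own tooling. A minimal Python sketch, assuming nvidia-smi is on the PATH and the script runs with admin/root rights; the 300 W target is an arbitrary example, not a recommendation, and nvidia-smi rejects anything outside the board's supported min/max range.

import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Print the current software power limit, then apply a new one via nvidia-smi."""
    current = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=power.limit", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(f"GPU {gpu_index} current power limit: {current}")
    # -pl sets the software power limit in watts (needs elevated privileges).
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

set_power_limit(300)  # example target; pick a value inside your card's supported range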
Personally, I hated AMD pricing and naming though. Each one should’ve been named a tier less and priced accordingly.
For example, the 7900 XTX should've been the 7800 XT and priced at $799.
Then again, in many raster games (you know, the ones that comprise like 90% of the whole Steam catalog) the 7900 XTX is on average 20-30% slower than the 4090 but priced up to 60% less, depending on vendors, rebates and packed-in games.
Mine boosts to almost 2900 MHz stock and stays there; my guess is that's why there's some slight performance drop at 350 W, where it sticks around 2600 MHz.
Rather, a perf increase at the expense of power consumption. Really, I'm not holding my breath.
It does flip back and forth depending on the game played, though, and whether I'm running path tracing or pure rasterization.
But it's definitely not only their fault. It's the whole market against them, at a time when we desperately need their healthy competition.
The “problems,” at least for gamers, are the gimmicks like FSR, DLSS and RT.
FSR and DLSS plus fake frames are there to cheat over native rendering and hide their perhaps limited performance at the given resolutions.
And the RT hype/push is simply the influencers (formerly known as reviewers) earning their free 4090s. We have fewer than five games that properly use RT, yet we are made to believe otherwise.
AMD doesn't innovate; it simply follows NVIDIA's lead. This is in fact a duopoly, with all the legal consequences that arise from that. I don't listen to their marketing BS. I use the classic approach: drop to low and medium settings where needed to achieve high enough FPS.
The 5090 will likely be 80-100% faster than the 3090 Ti at the same or lower power though, so anyone who wants to pay for it can still get massive performance-per-watt improvements.
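That perf/W claim is easy to sanity-check with back-of-the-envelope arithmetic; a rough Python sketch assuming the RTX 3090 Ti's 450 W board power, the leaked 500 W figure for the 5090, and the commenter's guessed 1.8-2.0x uplift (not measured numbers):

def perf_per_watt_gain(uplift: float, old_watts: float = 450.0, new_watts: float = 500.0) -> float:
    """Relative perf/W of the new card vs. the old one (1.0 = no change)."""
    return (uplift / new_watts) / (1.0 / old_watts)

for uplift in (1.8, 2.0):
    print(f"{uplift:.1f}x perf at 500 W -> {perf_per_watt_gain(uplift):.2f}x the perf/W of a 450 W 3090 Ti")
# Power-limiting the card back down to 450 W would raise the ratio further,
# which is the commenter's point: efficiency can still improve even as TDP climbs.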
People forget that less than 10 years ago the company was almost dead (thanks in huge part to Intel's dirty and illegal actions), so they bet everything on Ryzen and are now building up Radeon and other stuff.
You might say CUDA, but remember that AMD bet on OpenCL and everyone bailed, ngreedia sabotaged it in favor of CUDA, etc.
So it does take time, and they simply need a bit more. Sadly, you and I are a minority in that train of thought.
Meanwhile, on CPUs, where a real competitor exists and both are REALLY trying to make the best products, we see lots of progress irrespective of the node: chiplets, interconnects, big.LITTLE, X3D... and we see that power really doesn't have to keep going up. Only when a design isn't quite a fit, such as Intel's E-cores, do we see how far the power budget needs to go to remain competitive.
And hey, look at GPUs. Even upscaling could be considered such a new tech, and it does change the playing field. Too bad Nvidia enforces a VRAM/bandwidth product limitation and segmentation, plus an RT push, to make you believe otherwise.
AMD has also stagnated when it comes to cores, or at the very least cores per tier; the 9900X really should be the R7, with the 9700X being the R5.....
MT performance in general has been nice, and 3D V-Cache keeps AMD competitive with Intel in gaming, but Intel hasn't shipped a new desktop arch since 2022 and AMD is barely faster depending on the gaming suite benchmarked. Not really impressive to me.
To me it feels like decent pricing after launch has, of course, been the only real winner with CPUs; at launch they've all been overpriced as well.
It's really just the $500-and-under market that has really gone to shite with graphics cards and pricing in general, not actual improvements at the top, generation after generation.
chipsandcheese.com/2024/07/09/qualcomms-oryon-core-a-long-time-in-the-making/
Honestly, it was so bad outside of gaming that the headache of the 7950X3D became very appealing.
Hopefully 9000 or 11000 fixes that, but I guess until Arrow Lake is shown it's hard to know; it looks like it will also not see a very impressive MT boost.... But if the 9700X loses to a 14700K in MT, that is pretty embarrassing considering how old the Raptor Lake core is, if they still want to price it like an i7.... Now, if AMD shifts pricing down to say $329, sure, then it's fine.
At the very least, I think both you and I can agree they really need to come up with something hardware-agnostic that is much better than TAA at the same performance hit lol.....