Monday, July 15th 2024
NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector
In the run-up to NVIDIA's upcoming GeForce RTX 50 Series of GPUs, codenamed "Blackwell," a power supply manufacturer appears to have accidentally leaked the power configurations of all SKUs. Seasonic operates an online power supply wattage calculator that lets users configure their systems and receive PSU recommendations, which means its database is regularly populated with CPU and GPU SKUs to cover the massive variety of components. This time it lists the upcoming GeForce RTX 50 series, from the RTX 5050 all the way up to the flagship RTX 5090. Starting with the GeForce RTX 5050, this SKU is expected to carry a 100 W TDP. Its bigger brother, the RTX 5060, bumps the TDP to 170 W, 55 W higher than the previous-generation "Ada Lovelace" RTX 4060.
The GeForce RTX 5070, with a 220 W TDP, sits in the middle of the stack, a 20 W increase over its Ada counterpart. For the higher-end SKUs, NVIDIA has prepared the GeForce RTX 5080 and RTX 5090 at 350 W and 500 W TDP, respectively, which also represents a jump over the Ada generation of 30 W for the RTX 5080 and 50 W for the RTX 5090. Interestingly, NVIDIA this time wants to unify the power connection across the entire family with the 16-pin 12V-2x6 connector, now following the updated PCIe 6.0 CEM specification. The increase in power requirements across the "Blackwell" SKUs is notable, and we are eager to see whether the performance gains are enough to keep efficiency in check.
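Set against NVIDIA's published "Ada Lovelace" board power figures, the leaked numbers line up with the deltas quoted above. Here is a quick sketch of that comparison (the RTX 5050 is left out, since there is no desktop RTX 4050 to compare it to):

```python
# Leaked "Blackwell" TDPs vs. official "Ada Lovelace" TDPs, in watts.
# Blackwell figures come from the Seasonic calculator leak; Ada figures
# are NVIDIA's published board power values.
blackwell = {"RTX 5060": 170, "RTX 5070": 220, "RTX 5080": 350, "RTX 5090": 500}
ada       = {"RTX 4060": 115, "RTX 4070": 200, "RTX 4080": 320, "RTX 4090": 450}

for (new, new_w), (old, old_w) in zip(blackwell.items(), ada.items()):
    print(f"{new}: {new_w} W ({new_w - old_w:+d} W vs {old})")
# RTX 5060: 170 W (+55 W vs RTX 4060)
# RTX 5070: 220 W (+20 W vs RTX 4070)
# RTX 5080: 350 W (+30 W vs RTX 4080)
# RTX 5090: 500 W (+50 W vs RTX 4090)
```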
Sources:
@Orlak29_ on X, via VideoCardz
168 Comments on NVIDIA GeForce RTX 50 Series "Blackwell" TDPs Leaked, All Powered by 16-Pin Connector
HW Unboxed did a test with a 4 GB and 8 GB 6500 XT recently. The 4 GB card did 45 FPS in Hogwarts Legacy at 1080p low with texture pop-ins, while the 8 GB one did 80 FPS with high textures and no pop-in. There are also games that lower your image quality to maintain good performance when they run out of VRAM. FSR isn't the only option. Lowering some image quality settings can also give you the performance you need without turning the picture into a blurry mess.
I prefer DLAA, but not even a 4090 can do that in every game at 1440p... I use DLDSR at 2.25x on older games; it looks fantastic, and I couldn't go back to playing games any other way...
DLSS 3.7 is pretty damn good in most games though, and it takes all of a minute to upgrade the DLL....
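For anyone unfamiliar with what "upgrading the DLL" means here: DLSS-enabled games ship an nvngx_dlss.dll that can simply be backed up and overwritten with a newer version. A minimal sketch of that swap, with purely hypothetical paths:

```python
# Minimal sketch of the "DLL upgrade" mentioned above: back up the
# nvngx_dlss.dll a game ships with and drop a newer one in its place.
# Both paths below are hypothetical placeholders, not real locations.
import shutil
from pathlib import Path

new_dll  = Path(r"C:\Downloads\nvngx_dlss_3.7\nvngx_dlss.dll")  # newer DLSS DLL (assumed)
game_dir = Path(r"C:\Games\SomeGame")                           # game install folder (assumed)

target = game_dir / "nvngx_dlss.dll"
if target.exists():
    shutil.copy2(target, game_dir / "nvngx_dlss.dll.bak")  # keep the original as a backup
shutil.copy2(new_dll, target)                               # overwrite with the newer version
print(f"Replaced {target} with {new_dll}")
```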
1080p: 16.2%
1440p: 23.4%
4K: 29.8%
Even with 4K ray tracing the difference isn't magically larger. So an RTX 5080 merely matching the RTX 4090 would be the smallest generational performance increase since the RTX 2080, and one of the smallest ever.
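Spelled out as a quick sketch (assuming the percentages above are the 4090's lead over the 4080, which is how I read them):

```python
# Sanity check: if the RTX 5080 only matches the RTX 4090, its gen-on-gen
# uplift over the RTX 4080 is simply the 4090-vs-4080 gap listed above
# (figures assumed to be that gap, per the post).
gap_4090_vs_4080 = {"1080p": 16.2, "1440p": 23.4, "4K": 29.8}  # percent faster

for res, pct in gap_4090_vs_4080.items():
    print(f"{res}: a 4090-class RTX 5080 would be {pct:.1f}% faster than the RTX 4080")
```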
And I bet that even without the AI craze we wouldn't see the new generation at the old MSRPs; the Leather Jacket said this well before AI:
"A 12-inch wafer is a lot more expensive today than it was yesterday, and it's not a little bit more expensive, it is a ton more expensive," "Moore's Law is dead … It's completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past."
With all the AI push, we could be facing pricing similar to the peak of the crypto madness...
There's barely any noticeable difference among different graphical settings in most modern games anyway. The only thing that differs greatly is the performance.
Again, DLSS 3.7 is pretty damn good, especially at 4K, and in its quality mode it performs better than native on my 1440p ultrawide in the majority of games I have tested.... I'm really getting used to ultrawide though, or else 4K is what I would be running.... Standard 1440p honestly looks pretty meh to me these days.... Maybe I gamed at 4K for too long, idk.
A lot of people have pretty crappy monitors, though, so they obviously don't care much about picture quality, but a good screen can make a bigger difference than a 4090 can. I would take an OLED and a 4070 over a 4090 and an IPS any day.
Are you playing something designed for PC, like racing or flight sims, where you focus on small details in the distance? That can show all the problems of these upscaling technologies and drive you crazy...
When playing on my 1440p ultrawide monitor, though, I'll always choose lowering a few settings before enabling FSR.
The disocclusion artifacts and flickering on fine detail are too obvious to me in the majority of what I play to use it.
Let's see if new features like DLSS 4 pop up as well.
But I'm down for upgrading my ancient 2070, as I decided that the 4080 was simply too slow and just terrible value. I just hope nGreedia doesn't crank up the prices again, but a 5080 is in my sights. I bet they do though, at least another $100 per SKU, and probably another $300 for the 5090.
But with the DLSS .DLLs getting to version 3.7+, I'd say you are right about DLSS 4.0 arriving in half a year's time. Whether it requires a real or imaginary feature of the 50x0 series will be interesting, though, as we all know the lies nGreedia has made up in the past to justify locking a feature to certain hardware.