Thursday, September 5th 2024
NVIDIA GeForce RTX 5090 and RTX 5080 Reach Final Stages This Month, Chinese "D" Variant Arrives for Both SKUs
NVIDIA is on the brink of finalizing its next-generation "Blackwell" graphics cards, the GeForce RTX 5090 and RTX 5080. Sources close to BenchLife indicate that NVIDIA is targeting September to finalize the design specifications of both models. This timeline hints at a possible unveiling at CES 2025, with a market release shortly after. The RTX 5090 is rumored to carry a staggering 550 W TGP, a significant 22% increase over its predecessor, the RTX 4090. Meanwhile, the RTX 5080 is expected to draw 350 W, a more modest 9.3% bump over the current RTX 4080. Interestingly, NVIDIA appears to be developing "D" variants of both cards, likely tailored for the Chinese market to comply with export regulations.
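For reference, the quoted percentage increases follow directly from the official TGPs of the current cards (450 W for the RTX 4090, 320 W for the RTX 4080). A quick back-of-the-envelope check in Python, with the Blackwell figures being the rumored ones:

```python
# Back-of-the-envelope check of the rumored TGP increases.
# Baselines are the official TGPs of the current cards: 450 W (RTX 4090)
# and 320 W (RTX 4080); the RTX 50-series figures are rumors.
cards = [
    ("RTX 4090", 450, "RTX 5090", 550),
    ("RTX 4080", 320, "RTX 5080", 350),
]

for old_name, old_w, new_name, new_w in cards:
    increase = (new_w - old_w) / old_w * 100
    print(f"{old_name} ({old_w} W) -> {new_name} ({new_w} W): +{increase:.1f}%")

# RTX 4090 (450 W) -> RTX 5090 (550 W): +22.2%
# RTX 4080 (320 W) -> RTX 5080 (350 W): +9.4%
```

The exact value for the RTX 5080 is 9.375%, which the sources round down to 9.3%.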
Regarding raw power, the RTX 5090 is speculated to feature 24,576 CUDA cores paired with 512-bit GDDR7 memory. The RTX 5080, while less mighty, is still expected to pack a punch with 10,752 CUDA cores and 256-bit GDDR7 memory. As NVIDIA prepares to launch these powerhouses, rumors suggest the RTX 4090D may be discontinued by December 2024, paving the way for its successor. We are curious to see how power consumption is handled and how efficiently these cards perform within the higher power envelope. Some rumors indicate that the RTX 5090 could reach 600 watts at its peak, while the RTX 5080 could reach 400 watts; however, that is just a rumor for now. As always, until NVIDIA makes an official announcement, these details should be taken with a grain of salt.
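The rumored bus widths only translate into memory bandwidth once a per-pin data rate is known, and no source has confirmed one. As a rough sketch, assuming a hypothetical 28 Gbps GDDR7 speed:

```python
# Rough memory-bandwidth estimate from the rumored bus widths.
# GDDR7 per-pin speed is not confirmed; 28 Gbps is a placeholder assumption.
GDDR7_GBPS = 28  # hypothetical per-pin data rate in Gbps

for name, bus_bits in {"RTX 5090": 512, "RTX 5080": 256}.items():
    bandwidth_gbs = bus_bits / 8 * GDDR7_GBPS  # GB/s = (bus bits / 8) * Gbps
    print(f"{name}: {bus_bits}-bit @ {GDDR7_GBPS} Gbps -> {bandwidth_gbs:.0f} GB/s")

# RTX 5090: 512-bit @ 28 Gbps -> 1792 GB/s
# RTX 5080: 256-bit @ 28 Gbps -> 896 GB/s
```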
Sources:
BenchLife, via Wccftech
Comments
I run my RTX 4070 Ti at 175 Watt (Default 265) and only lost 5-6% performance.
But it does have a TDP bump, so it uses more power. Stop kidding yourself. The fact that you can undervolt a GPU does not mean it's more efficient; you've just limited it. I'm running my 7900 XT efficiently too, but it can still guzzle north of 300 W, and you are dreaming if you think you only lose 5-6% in worst-case scenarios. It varies depending on the workload and where game X or Y ends up on your V/F curve; you just don't notice it much.
And that's fine. But stop living in denial. TDP goes up, your power bill goes up, simple.
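For anyone who wants to see what their card actually draws versus what its limit is set to, here is a minimal sketch using NVIDIA's NVML Python bindings (the pynvml package); it assumes an NVIDIA GPU and the nvidia-ml-py package installed:

```python
# Minimal sketch: compare live board power against the enforced power limit
# via NVML. Assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
    limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0
    print(f"Current draw: {draw_w:.1f} W / enforced limit: {limit_w:.1f} W")
finally:
    pynvml.nvmlShutdown()
```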
It comes down to the simple idea that you do not upgrade to play the same content at the same settings as you used to. You upgrade to play content at higher settings or FPS than you could before. But you're still just gaming. Ergo, the increased efficiency almost never leads to power saving, and far more likely leads to using more power.
Power for the same task is the metric here. Let's not move the goalposts to similar.
Not to mention, a more efficient GPU will use less power in a V-Sync 60 Hz situation, which is the most directly comparable scenario, regardless of TDP.
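To make "power for the same task" concrete: at a fixed frame cap, the fair metric is energy per frame. A hypothetical worked example (the wattage figures below are illustrative placeholders, not measurements):

```python
# Hypothetical comparison of two GPUs rendering the same V-Sync 60 Hz workload.
# Energy per frame (joules) = average board power (watts) / frame rate (fps).
FPS_CAP = 60  # V-Sync target

gpus = {
    "efficient GPU": 120.0,    # watts at the 60 fps cap (hypothetical)
    "inefficient GPU": 190.0,  # watts at the same cap (hypothetical)
}

for name, watts in gpus.items():
    joules_per_frame = watts / FPS_CAP
    print(f"{name}: {watts:.0f} W -> {joules_per_frame:.2f} J per frame")
```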
As power-draw charts show, (much) slower GPUs often use (much) more power.
If you can afford a 5090, surely you can afford the power bill. Ultra gaming PCs are not made to be efficient; they are supposed to be powerhouses meant to monster the latest games at very high resolutions and refresh rates.
But yes, this is why Raptor Lake sold well: it's extremely fast, and people don't really care about CPU power draw unless they're overheating, the power draw is getting in the way of performance, or they're using a laptop.
I didn't think NVIDIA allowed board partners to do that, but I don't remember if it was a "custom" Chinese job.