
NVIDIA GeForce RTX 5090 and RTX 5080 Reach Final Stages This Month, Chinese "D" Variant Arrives for Both SKUs

AleksandarK

News Editor
NVIDIA is on the brink of finalizing its next-generation "Blackwell" graphics cards, the GeForce RTX 5090 and RTX 5080. Sources close to BenchLife indicate that NVIDIA is targeting September to finalize the design specifications of both models. This timeline hints at a possible unveiling at CES 2025, with a market release shortly after. The RTX 5090 is rumored to boast a staggering 550 W TGP, a significant 22% increase from its predecessor, the RTX 4090. Meanwhile, the RTX 5080 is expected to draw 350 W, a more modest 9.3% bump from the current RTX 4080. Interestingly, NVIDIA appears to be developing "D" variants for both cards, which are likely tailored for the Chinese market to comply with export regulations.
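For reference, a quick sketch of how those percentages work out, taking the rumored 550 W and 350 W figures against the 450 W RTX 4090 and 320 W RTX 4080 reference TGPs (all Blackwell numbers remain rumors for now):

```python
# Percentage TGP increase implied by the rumored Blackwell figures.
def pct_increase(old_w: float, new_w: float) -> float:
    return (new_w - old_w) / old_w * 100

print(f"RTX 4090 (450 W) -> RTX 5090 (550 W): +{pct_increase(450, 550):.1f}%")  # ~22.2%
print(f"RTX 4080 (320 W) -> RTX 5080 (350 W): +{pct_increase(320, 350):.1f}%")  # ~9.4%
```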

Regarding raw power, the RTX 5090 is speculated to feature 24,576 CUDA cores paired with 512-bit GDDR7 memory. The RTX 5080, while less mighty, is still expected to pack a punch with 10,752 CUDA cores and 256-bit GDDR7 memory. As NVIDIA prepares to launch these powerhouses, rumors suggest the RTX 4090D may be discontinued by December 2024, paving the way for its successor. We are curious to see how the power consumption is handled and whether these cards remain efficient within the higher power envelope. Some rumors indicate that the RTX 5090 could reach 600 watts at its peak, while the RTX 5080 could reach 400 watts. However, that is just a rumor for now. As always, until NVIDIA makes an official announcement, these details should be taken with a grain of salt.



View at TechPowerUp Main Site | Source
 
Everyone will go ape about the TDP, but forget how crazy efficient NVIDIA is.

I run my RTX 4070 Ti at 175 W (default 265 W) and only lose 5-6% performance.
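As a rough sketch of what that power limit implies for performance per watt (using the poster's own figures; the ~5.5% midpoint loss is an assumption):

```python
# Back-of-the-envelope perf/W gain from power-limiting a 4070 Ti to 175 W.
stock_power_w, limited_power_w = 265.0, 175.0
stock_perf, limited_perf = 1.00, 0.945   # assuming ~5.5% performance loss

gain = (limited_perf / limited_power_w) / (stock_perf / stock_power_w) - 1
print(f"Performance per watt improves by roughly {gain:.0%}")  # ~43%
```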
 
Based on stereotypes I'm assuming the Chinese D is a lot smaller in size?
:roll:
 
Everyone will go ape about the TDP, but forget how crazy efficient NVIDIA is.

I run my RTX 4070 Ti at 175 W (default 265 W) and only lose 5-6% performance.
Ada was fine, Ampere was certainly not efficient, and Blackwell seems to be working along the lines of Ada.

But it does have a TDP bump. So it uses more power. Stop kidding yourself. The fact that you can undervolt a GPU does not mean it's more efficient; you just limited it. I'm running my 7900XT efficiently too, but it can still guzzle north of 300 W, and you are dreaming if you think you only lose 5-6% in worst-case scenarios. It's more dependent on the workload and on where game X or Y ends up on your V/F curve. You just don't notice it much.
 
Ada was fine, Ampere was certainly not efficient, and Blackwell seems to be working along the lines of Ada.
Compared to what? The competition that could draw over 600 watts and was still slower? The one that uses 30-50 W more in V-Sync, video playback, and multi-monitor setups (still not fixed)?
But it does have a TDP bump. So it uses more power. Stop kidding yourself. The fact that you can undervolt a GPU does not mean it's more efficient; you just limited it. I'm running my 7900XT efficiently too, but it can still guzzle north of 300 W, and you are dreaming if you think you only lose 5-6% in worst-case scenarios.
More power is irrelevant; work done per unit of power used is the only metric for efficiency. Ada is also extremely efficient at stock, but undervolts well too.
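To illustrate that metric, a minimal sketch with hypothetical FPS and power numbers (the figures are made up; only the ratios matter):

```python
# Efficiency = work done per unit of power; for games, FPS per watt.
# The FPS and wattage figures below are hypothetical, for illustration only.
cards = {
    "big, fast GPU":   {"fps": 140, "watts": 350},
    "small, slow GPU": {"fps": 60,  "watts": 160},
}
for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.2f} FPS/W")
# The higher-power card can still come out ahead on FPS per watt,
# so absolute draw and efficiency are separate questions.
```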
 
More power is irrelevant; work done per unit of power used is the only metric for efficiency. Ada is also extremely efficient at stock, but undervolts well too.
Of course it's not irrelevant; power is heat and $$$, and efficiency is how it translates to FPS. The fact is, if you have more performance under the hood you will use it and therefore use more power. But you're still just gaming.

And that's fine. But stop living in denial. TDP goes up, your power bill goes up. Simple.
 
Of course it's not irrelevant; power is heat and $$$, and efficiency is how it translates to FPS. The fact is, if you have more performance under the hood you will use it and therefore use more power. But you're still just gaming.
It's completely irrelevant when you're talking about efficiency. The 350 W 4080S is much more efficient than the 150 W 6600XT. A 1000 W GPU could be the most efficient in the world.
 
It's completely irrelevant when you're talking about efficiency. The 350 W 4080S is much more efficient than the 150 W 6600XT. A 1000 W GPU could be the most efficient in the world.
But I'm not, I'm clearly talking about thermal design power and how the bar is raised from gen to gen, and how this results in effectively using more power to do a similar task, regardless of increased efficiency.

It comes down to the simple idea that you do not upgrade to play the same content at the same settings as you used to. You upgrade to play content at higher settings or FPS than you could before. But you're still just gaming. Ergo, the increased efficiency almost never leads to power saving, and far more likely leads to using more power.
 
But I'm not, I'm clearly talking about thermal design power and how the bar is raised from gen to gen, and how this results in effectively using more power to do a similar task, regardless of increased efficiency.
I see "efficiency" mentioned several times in your wording, not "thermal design power".

Power for the same task is the metric here. Let's not move the goalposts to similar.

Not to mention that a more efficient GPU will use less power in a V-Sync 60 Hz situation, which is the most directly comparable scenario, regardless of TDP.
 
I see "efficiency" mentioned several times in your wording, not "thermal design power".

Power for the same task is the metric here. Let's not move the goalposts to similar.
I'll set my own goalposts, as I have since the first response. Read carefully. We don't disagree. You just gloss over the fact that better GPUs are not bought to save power, but rather to play games at higher settings.
 
The fact is, if you have more performance under the hood you will use it and therefore use more power.
This is categorically false.

[Chart: GPU energy efficiency comparison]

[Chart: power draw at 60 Hz V-Sync]

As you can see, (much) slower GPUs often use (much) more power.
 
People who buy these don't care about efficiency; they care about having the best and gaming at the highest settings they can. I don't get people buying a 4090 and undervolting it: you bought the best GPU NVIDIA made and then strangle it so it uses less power. If you cared about power, you'd have a GTX 1060 in your rig, not one that sucks 600 W. It's like vegans saying they don't use any part of an animal while wearing leather boots.

If you can afford a 5090, surely you can afford to pay the power bill. Ultra gaming PCs are not built to be efficient; they are supposed to be powerhouses meant to monster the latest games at very high resolutions and refresh rates.
 
What's with the big gap? I mean, the 5090 will have nearly 2.3x as many cores as the 5080? Why?
 
People who buy these don't care about efficiency; they care about having the best and gaming at the highest settings they can. I don't get people buying a 4090 and undervolting it: you bought the best GPU NVIDIA made and then strangle it so it uses less power. If you cared about power, you'd have a GTX 1060 in your rig, not one that sucks 600 W. It's like vegans saying they don't use any part of an animal while wearing leather boots.
Believe it or not, even at stock you can have the fastest AND the most efficient; the two tend to go hand in hand in TDP-limited scenarios, which is most of them. The mark of a good architecture is efficiency.

But yes, this is why Raptor Lake sold well: it's extremely fast, and people don't really care about CPU power draw unless they're overheating, the power draw is getting in the way of performance, or they're using a laptop.
 
Ada was fine, Ampere was certainly not efficient, and Blackwell seems to be working along the lines of Ada.

But it does have a TDP bump. So it uses more power. Stop kidding yourself. The fact that you can undervolt a GPU does not mean it's more efficient; you just limited it. I'm running my 7900XT efficiently too, but it can still guzzle north of 300 W, and you are dreaming if you think you only lose 5-6% in worst-case scenarios. It's more dependent on the workload and on where game X or Y ends up on your V/F curve. You just don't notice it much.
Tested in various games (even RTX Remix mods with Path Tracing), and it's always 5-6%.
 
Tested in various games (even RTX Remix mods with Path Tracing), and it's always 5-6%.
In line with my own testing. Ampere, though, doesn't UV as well as Ada, since the GDDR6X is hungry and the Samsung node isn't as good as TSMC's. I suspect the RDNA 3 cards don't UV well either, due to the multi-chiplet design.
 
What's with the big gap? I mean, the 5090 will have nearly 2.3x as many cores as the 5080? Why?

Well, because there will be in-betweens: the 5080 Ti, the 5080 Super, the 5080 Super Duper, the 5080 Hyper Speed, the 5080 GTFO, etc. :roll:
 
People who buy these don't care about efficiency; they care about having the best and gaming at the highest settings they can. I don't get people buying a 4090 and undervolting it: you bought the best GPU NVIDIA made and then strangle it so it uses less power. If you cared about power, you'd have a GTX 1060 in your rig, not one that sucks 600 W. It's like vegans saying they don't use any part of an animal while wearing leather boots.

If you can afford a 5090, surely you can afford to pay the power bill. Ultra gaming PCs are not built to be efficient; they are supposed to be powerhouses meant to monster the latest games at very high resolutions and refresh rates.
Why not both?
 
I'm not worried about power consumption - the way this is going, the RTX 50x0 generation won't be targeted at gamers at all, except for maybe the lowest-tier cards - this is all for the new saviour of revenue, AI! I expect prices to be even worse than during the crypto madness.
 
That seems like such a wide gulf between the 5090 and 5080, unless the 5080 is a quarter of the price... it won't seem worth it
 
People who buy these don't care about efficiency; they care about having the best and gaming at the highest settings they can. I don't get people buying a 4090 and undervolting it: you bought the best GPU NVIDIA made and then strangle it so it uses less power. If you cared about power, you'd have a GTX 1060 in your rig, not one that sucks 600 W. It's like vegans saying they don't use any part of an animal while wearing leather boots.

If you can afford a 5090, surely you can afford to pay the power bill. Ultra gaming PCs are not built to be efficient; they are supposed to be powerhouses meant to monster the latest games at very high resolutions and refresh rates.

Agreed. I'm wanting a 5090 and electricity usage isn't even on my list of concerns. Even if it draws 500 watts constantly in gaming (which it won't) I game about 20 hours a week on average. At 14 cents per kWh for me that comes to $6 a month. Probably the real cost will be much less but even worst case I'm looking at buying a $2,000 card so $4-$6 a month is trivial all things considered. I may have to deny myself 1 cup of Starbucks coffee a month to cover that extra expense. :p
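For what it's worth, that estimate checks out; a quick sketch of the arithmetic (worst-case 500 W draw, 20 hours a week, $0.14/kWh):

```python
# Monthly electricity cost for the worst-case gaming scenario described above.
draw_kw = 0.5            # assumed constant 500 W GPU draw while gaming
hours_per_week = 20
price_per_kwh = 0.14     # USD

kwh_per_month = draw_kw * hours_per_week * 52 / 12
print(f"~{kwh_per_month:.0f} kWh/month -> ${kwh_per_month * price_per_kwh:.2f}/month")  # ~43 kWh, ~$6.07
```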
 
Agreed. I'm wanting a 5090 and electricity usage isn't even on my list of concerns. Even if it draws 500 watts constantly in gaming (which it won't) I game about 20 hours a week on average. At 14 cents per kWh for me that comes to $6 a month. Probably the real cost will be much less but even worst case I'm looking at buying a $2,000 card so $4-$6 a month is trivial all things considered. I may have to deny myself 1 cup of Starbucks coffee a month to cover that extra expense. :p
You have extremely cheap electricity; mine is triple that at least. Oops, I meant over double, not quite triple yet, despite the constant increases.
 
I also saw on X an RTX 4090 D and an RTX 4080 variant with double the amount of memory.

I didn't think NVIDIA allowed board partners to do that, but I don't remember if it was a "custom" Chinese job.
 
That seems like such a wide gulf between the 5090 and 5080, unless the 5080 is a quarter of the price... it won't seem worth it
You need to know both prices before determining which one is worth it.
 
Well, because there will be in-betweens: the 5080 Ti, the 5080 Super, the 5080 Super Duper, the 5080 Hyper Speed, the 5080 GTFO, etc. :roll:
Why not just call them 5081, 5082, 5083, etc.? Zero is not the only number in existence, surely. :roll:
 