Friday, January 3rd 2025

NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

According to two of the most accurate leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Previous rumors pointed to 600 W and 400 W TGPs for these SKUs; TGP (total graphics power) covers the entire card, including its memory and everything else on the board, whereas TDP (thermal design power) is a more specific value attributed to the GPU die of the SKU in question. According to the latest leaks, 575 W are dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while 25 W are reserved for the GDDR7 memory and other components on the PCB.

For the RTX 5080, the GB203-400-A1 chip supposedly draws 360 W on its own, while 40 W are set aside for the GDDR7 memory and other components on the PCB. The lower-end RTX 5080 reserves more power for its memory than the RTX 5090 because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090 uses GDDR7 modules at 28 Gbps. The RTX 5090 does use more (or higher-capacity) modules, but first-generation GDDR7 memory could require disproportionately more power to reach the 30 Gbps mark, hence the larger allocation. Future GDDR7 iterations could reach higher speeds without drawing much more power.
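
As a quick sanity check, the leaked die-level figures add back up to the earlier TGP rumors once the memory and board allocation is included; here is a minimal sketch of that arithmetic (the wattages are the leaked numbers above, the labels are just illustrative):

```python
# Back-of-envelope check: leaked GPU-die power plus the memory/board allocation
# should reproduce the earlier total graphics power (TGP) rumors.
cards = {
    "RTX 5090 (GB202-300-A1)": {"gpu_die_w": 575, "memory_board_w": 25},
    "RTX 5080 (GB203-400-A1)": {"gpu_die_w": 360, "memory_board_w": 40},
}

for name, split in cards.items():
    tgp = split["gpu_die_w"] + split["memory_board_w"]
    print(f"{name}: {split['gpu_die_w']} W die + {split['memory_board_w']} W board = {tgp} W TGP")
# RTX 5090: 575 + 25 = 600 W; RTX 5080: 360 + 40 = 400 W, matching the older TGP rumors.
```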
Sources: hongxing2020 and kopite7kimi, via VideoCardz

207 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

#76
JustBenching
DavenJust buy a 5080 and save $1000+. The performance between a 5090 at 320W and a 5080 at 360W is going to be about the same.
But I'd also run the 5080 at 320 W, and so the performance difference will still be whatever it ends up being.
Posted on Reply
#77
freeagent
Well, there is over 1 GHz difference in core speed between my 4070 Ti and 3070 Ti, plus more cache and other tweaks. I am sure they have good AI to design these things :D
Posted on Reply
#78
AusWolf
JustBenchingBut I'd also run the 5080 at 320 W, and so the performance difference will still be whatever it ends up being.
Sure, a 5090 at 320 W will probably be a little bit faster than a 5080 at 320 W, but is it worth the massive difference in price?
Posted on Reply
#79
freeagent
AusWolfSure, a 5090 at 320 W will probably be a little bit faster than a 5080 at 320 W, but is it worth the massive difference in price?
Yes, because when running a GPU like this, your only concern is if you have a big enough PSU
Posted on Reply
#80
Daven
AusWolfSure, a 5090 at 320 W will probably be a little bit faster than a 5080 at 320 W, but is it worth the massive difference in price?
I think the confusion lies in the process nodes. The move from the 2000 series to the 3000 series was horrible because of the 8 nm Samsung node. The situation greatly improved with the 4000 series on the 4 nm TSMC node. That's why the 4090 is such a great performer. But now the 2000 to 3000 series situation is happening again as the 5000 series is on the same 4 nm TSMC node. Efficiency can only go down if you add 30% more transistors to the same process node unless you greatly decrease the clock speed which negates any performance improvements over the previous generation.
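A rough first-order sketch of that same-node trade-off, using a toy dynamic-power model (the numbers and scaling assumptions are purely illustrative, not leaked specs):

```python
# Toy model: perf ~ transistors * clock, power ~ transistors * clock * voltage^2.
# Purely illustrative; real silicon adds leakage, binning, and architectural gains.
def rel_perf(transistors, clock):
    return transistors * clock

def rel_power(transistors, clock, volt):
    return transistors * clock * volt ** 2

base_perf, base_power = rel_perf(1.0, 1.0), rel_power(1.0, 1.0, 1.0)

# +30% transistors on the same node, same clock and voltage:
print(rel_perf(1.3, 1.0) / base_perf, rel_power(1.3, 1.0, 1.0) / base_power)
# -> ~1.3x perf for ~1.3x power: perf-per-watt stays flat at best.

# Clawing the power back means dropping clock (and a little voltage),
# which also eats most of the performance gain:
print(rel_perf(1.3, 0.85) / base_perf, rel_power(1.3, 0.85, 0.95) / base_power)
# -> ~1.1x perf at ~1.0x power.
```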
Posted on Reply
#81
AusWolf
freeagentYes, because when running a GPU like this, your only concern is if you have a big enough PSU
The same way when you're buying a Ferrari, your only concern is whether you have space for it in your garage next to your other Ferraris? Um, maybe.

I'm still thinking that if a 5090 performs at 100%, and a 5080 at 320 W performs at 50%, and you can get 60% by running your 5090 at 320 W, then the other 40% is wasted money.

Edit: Then, you basically paid double price for 20% more performance.
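
Taking those hypothetical percentages at face value, the arithmetic behind that complaint is simply this (illustrative numbers, not benchmarks):

```python
# Hypothetical numbers from the post above, not measurements.
perf_5080_320w = 50          # relative performance
perf_5090_320w = 60
price_ratio = 2.0            # assumed roughly double the price

perf_gain = perf_5090_320w / perf_5080_320w - 1
print(f"~{perf_gain:.0%} more performance for ~{price_ratio:.0f}x the price")
# -> ~20% more performance for ~2x the price
```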
DavenI think the confusion lies in the process nodes. The move from the 2000 series to the 3000 series was horrible because of the 8 nm Samsung node. The situation greatly improved with the 4000 series on the 4 nm TSMC node. That's why the 4090 is such a great performer. But now the 2000 to 3000 series situation is happening again as the 5000 series is on the same 4 nm TSMC node. Efficiency can only go down if you add 30% more transistors to the same process node unless you greatly decrease the clock speed which negates any performance improvements over the previous generation.
I completely agree, although this wasn't my question.
Posted on Reply
#82
TheinsanegamerN
Bomby569Doom is mostly a shooter on rails, or by sections, they couldn't pull that off in a open world for example. Not to say they didn't do a great job but not all games are equal.
*cough cough* RAGE *cough cough*
AusWolfI agree. I want to see speed increases due to advancements in GPU architecture, like I did in the Pascal years, and not due to cramming more parts into a chip and increasing power (what I call brute forcing).
Pascal's main advancements came from cramming significantly more transistors onto a chip with a higher power limit, especially given Maxwell was its predecessor. Rose-colored glasses and all that.
Posted on Reply
#83
AusWolf
TheinsanegamerNPascal's main advancements came from cramming significantly more transistors onto a chip with a higher power limit, especially given Maxwell was its predecessor. Rose-colored glasses and all that.
Well, Maxwell wasn't a bad architecture, either, imo... but I get what you mean.
Posted on Reply
#84
TheinsanegamerN
AusWolfThe same way when you're buying a Ferrari, your only concern is whether you have space for it in your garage next to your other Ferraris? Um, maybe.

I'm still thinking that if a 5090 performs at 100%, and a 5080 at 320 W performs at 50%, and you can get 60% by running your 5090 at 320 W, then the other 40% is wasted money.

Edit: Then, you basically paid double price for 20% more performance.
Except it's usually the opposite: the 5090 at 320 W would be putting out 60%, while the 5080 at 320 W would be putting out 50%.

Besides, if you are buying the 5090, it's because the 5080 isn't enough for what you want. For most who want high-end hardware, drawing 525 W isn't a concern. The high end has always had huge power draw (hello, SLI era).
AusWolfWell, Maxwell wasn't a bad architecture, either, imo... but I get what you mean.
No, it wasn't bad. It was great. My point was that overall, most GPU generations are defined by MOAR COARS and more power, with power being offset by smaller nodes. IPC is far less important to GPUs than it is to CPUs; parallelism and clock speeds make a much larger difference. That's been true for a long time.
Posted on Reply
#85
PixelTech
Will I be chill with a 1000W ATX 3.1 PSU for the 5090? (paired with 7800X3D)
Posted on Reply
#86
igormp
DavenJust buy a 5080 and save $1000+. The performance between a 5090 at 320W and a 5080 at 360W is going to be about the same. Maybe and this is a big maybe, the 5090 will be a little faster but don't forget that Nvidia is using the same node as the 4000 series. This means efficiency of the 5000 series will go down as more transistors are added.

This is a buyer beware situation and no company logo on the box beats physics.
You seem to be under the assumption that performance scales linearly with power.
A 5090 at a lower power budget than a 5080 is still going to have almost double the memory bandwidth, and way more cores, even if those are clocked lower.
AusWolfThe same way when you're buying a Ferrari, your only concern is whether you have space for it in your garage next to your other Ferraris? Um, maybe.

I'm still thinking that if a 5090 performs at 100%, and a 5080 at 320 W performs at 50%, and you can get 60% by running your 5090 at 320 W, then the other 40% is wasted money.

Edit: Then, you basically paid double price for 20% more performance.


I completely agree, although this wasn't my question.
Your assumptions are also wrong. A 5090 at 320W is likely to only be 10~20% slower than the stock setting.
The 5080 math is also not that simple because things (sadly) often do not scale linearly like that.
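
To illustrate that non-linearity, here is the same kind of toy model applied at a fixed power budget: a wide chip clocked (and volted) down generally beats a narrower chip pushed hard. All numbers are made up for the example, not RTX 50-series specs:

```python
# Iso-power comparison with a crude model: perf ~ cores * clock,
# power ~ cores * clock * voltage^2. Illustrative numbers only.
def perf(cores, clock):
    return cores * clock

def power(cores, clock, volt):
    return cores * clock * volt ** 2

narrow = {"cores": 1.0, "clock": 1.00, "volt": 1.00}   # smaller chip pushed hard
wide   = {"cores": 2.0, "clock": 0.62, "volt": 0.90}   # bigger chip clocked/volted down

print(power(**narrow), power(**wide))                  # ~1.00 vs ~1.00: same power budget
print(perf(narrow["cores"], narrow["clock"]),
      perf(wide["cores"], wide["clock"]))              # 1.00 vs ~1.24: the wide chip wins
```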
Posted on Reply
#87
Kaleid
PixelTechWill I be chill with a 1000W ATX 3.1 PSU for the 5090? (paired with 7800X3D)
Yeah, CPU is super efficient
Posted on Reply
#88
Hecate91
AusWolfShould I write "in my opinion" in front of every post I make? :confused:

I am an Nvidia user, by the way, just not in my main gaming rig at the moment. I've got two HTPCs that both have Nvidia GPUs in them. Does that make me more qualified to comment here?
It seems to be getting to that point; people take the system specs too seriously, lol.
AusWolfLet me disagree there. The 5090 has double of everything compared to the 5080 (shaders, VRAM, etc) which is already gonna be a stupidly expensive card. The 5090 is only GeForce by name to sell it to gamers. But it is not a card that your average gamer needs. Otherwise, there wouldn't be such a gigantic gap between it and the 5080 in specs.
The 5090 is more of an RTX A-series card than a GeForce card. Double the shaders and VRAM likely means double the price as well; I doubt Jensen is going to be generous since businesses bought up the 4090.
Maybe it's just me missing the pricing structure of Pascal, when there was only a $200 difference between the x80 and x80 Ti, and the Titan XP wasn't something gamers with money to waste were buying.
Posted on Reply
#89
JustBenching
AusWolfSure, a 5090 at 320 W will probably be a little bit faster than a 5080 at 320 W, but is it worth the massive difference in price?
I don't think the performance loss from dropping to 320 W will even be 5%. It's the same with CPUs: you can push 50% extra power for single-digit performance gains.

I'm currently running 320 W with overclocked memory, and it's around 2-3% faster than stock 450 W, so I don't think the 5090 will be any different.
Posted on Reply
#90
AusWolf
TheinsanegamerNthe 5090 at 320 W would be putting out 60%, while the 5080 at 320 W would be putting out 50%.
That's exactly what I said.
TheinsanegamerNBesides, if you are buying the 5090, it's because the 5080 isn't enough for what you want. For most who want high-end hardware, drawing 525 W isn't a concern. The high end has always had huge power draw (hello, SLI era).
That's what I think, too. If the 5080 isn't enough, I'm not gonna spend double and then limit my 5090 to be only a little bit faster than the 5080. It's a huge waste of money.
TheinsanegamerNNo, it wasn't bad. It was great. My point was that overall, most GPU generations are defined by MOAR COARS and more power, with power being offset by smaller nodes. IPC is far less important to GPUs than it is to CPUs; parallelism and clock speeds make a much larger difference. That's been true for a long time.
Then why do we have massive differences between GPUs such as the 5700 XT vs the Vega 64, the former of which was faster with only 62% of the cores, despite having not much of a clock speed difference?
Posted on Reply
#91
JustBenching
igormpYour assumptions are also wrong. A 5090 at 320W is likely to only be 10~20% slower than the stock setting.
The 5080 math is also not that simple because things (sadly) often do not scale linearly like that.
10-20% is still huge; I don't think it will be over 5%, honestly.
Posted on Reply
#92
AusWolf
JustBenchingI don't think the performance loss from dropping to 320 W will even be 5%. It's the same with CPUs: you can push 50% extra power for single-digit performance gains.

I'm currently running 320 W with overclocked memory, and it's around 2-3% faster than stock 450 W, so I don't think the 5090 will be any different.
May you be right, then. ;)
Hecate91It seems to be getting to that point, people take the system specs too seriously,lol.
Does it matter, though? Can current AMD users not have an opinion on an Nvidia card and vice versa? Do people sign their souls away when they choose Coca-Cola instead of Pepsi one day? I don't think so.
Hecate91The 5090 is more of an RTX A series card than a Geforce card, double the shaders and VRAM also likely means double the price as well, I doubt Jensen is going to be generous since businesses bought up the 4090.
Maybe its just me missing the pricing structure of Pascal, there was only a $200 difference between x80 and x80Ti, the Titan XP wasn't something gamers with money to waste were buying.
Exactly. But now, Nvidia wants even gamers to buy the Titan, ehm... x90 card, despite its price.
Posted on Reply
#93
igormp
JustBenching10-20% is still huge; I don't think it will be over 5%, honestly.
Yeah, I'm assuming a worst case scenario just to be safe.
Posted on Reply
#94
JustBenching
igormpYeah, I'm assuming a worst case scenario just to be safe.
Just tested it in CP2077: 73 fps @ 440-450 W, 71 fps @ 320 W. That's at 4K native.
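
Worked out as performance per watt, those numbers mean a large efficiency gain for a very small frame-rate loss (quick arithmetic on the figures quoted above, taking 450 W as the stock draw):

```python
# Perf-per-watt from the CP2077 numbers above (4K native, stock taken as 450 W).
stock   = {"fps": 73, "watts": 450}
limited = {"fps": 71, "watts": 320}

fps_loss = 1 - limited["fps"] / stock["fps"]
eff_gain = (limited["fps"] / limited["watts"]) / (stock["fps"] / stock["watts"]) - 1

print(f"~{fps_loss:.1%} fewer fps")           # ~2.7% fewer fps
print(f"~{eff_gain:.1%} more fps per watt")   # ~36.8% more fps per watt
```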
Posted on Reply
#95
AusWolf
JustBenchingJust tested it in CP2077: 73 fps @ 440-450 W, 71 fps @ 320 W. That's at 4K native.
That's awesome! :) Sometimes it's nice to be proven wrong. :ohwell:

It begs the question though, why the 4090 has to be a 450 W card by default if it doesn't bring any extra performance to the table. What is Nvidia aiming at with such a high power consumption?
Posted on Reply
#96
Soul_
londisteHaven't the official/public TDP numbers been technically TGPs - as in whole card consumption - for a while now? For both AMD and Nvidia, the power consumption numbers measured in reviews are within measuring error of power limit that is set to the TDP. There was a point where GPU manufacturers tried to make things complicated but that did not last long.
TGP is only for the GPU chip. TBP is Total Board power.
Posted on Reply
#97
JustBenching
AusWolfThat's awesome! :) Sometimes it's nice to be proven wrong. :ohwell:

It begs the question though, why the 4090 has to be a 450 W card by default if it doesn't bring any extra performance to the table. What is Nvidia aiming at with such a high power consumption?
I think the 450 W makes the card faster in other workloads (compared to restricted at 320 W), but as far as I've tested, games seem to be limited by memory bandwidth, so they don't scale that much with power. If there are non-gaming workloads that don't depend on memory as much, I guess the 450 W will give better performance. Still, I don't expect anything over 10% in either case. What's Nvidia thinking? Probably the same thing Intel is thinking when they decide to ship CPUs at 400 watts :D

OCing the VRAM gives me ~8-9% more performance; overclocking the core to 3000 MHz gives me ~2%.
Posted on Reply
#98
igormp
AusWolfThat's awesome! :) Sometimes it's nice to be proven wrong. :ohwell:

It begs the question though, why the 4090 has to be a 450 W card by default if it doesn't bring any extra performance to the table. What is Nvidia aiming at with such a high power consumption?
So that for some very specific cases it can stretch its legs all the way, using an extra 100 W for a 100 MHz bump for those synthetic benchmark scores.
Same goes for the 600 W limit some models have, really pushing the power envelope for minor clock gains. Reminder that past a certain point, the power required for each extra bit of performance grows exponentially.

Both my 3090s have a default power limit of 370 W, whereas at 275 W I lose less than 10% perf.

Here's a simple example of power scaling for some AI workloads on a 3090; you can see that past a certain point you barely get any extra performance from increasing power:

benchmarks.andromeda.computer/videos/3090-power-limit?suite=language

That has been the case since... always. Here's another example with a 2080 Ti:

timdettmers.com/2023/01/30/which-gpu-for-deep-learning/#Power_Limiting_An_Elegant_Solution_to_Solve_the_Power_Problem

Games often don't really push a GPU that hard, so the power consumption while playing is usually lower than the actual limit.
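
For anyone who wants to reproduce that kind of curve themselves, a power-limit sweep is easy to script; here is a minimal sketch, assuming an NVIDIA driver with nvidia-smi, admin/root rights, and a placeholder benchmark command of your choosing:

```python
# Minimal power-limit sweep sketch. Requires root/admin for "nvidia-smi -pl".
# "./your_benchmark" is a placeholder; substitute any workload that prints a score.
import subprocess

LIMITS_W = [250, 275, 300, 320, 350, 370]   # candidate board power limits in watts
BENCHMARK = ["./your_benchmark"]            # placeholder workload command

for watts in LIMITS_W:
    # Set the board power limit (applies until changed again or the driver reloads).
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)
    # Run the workload and capture whatever score it prints.
    result = subprocess.run(BENCHMARK, capture_output=True, text=True, check=True)
    print(f"{watts} W -> {result.stdout.strip()}")
```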
Posted on Reply
#99
AusWolf
igormpSo that for some very specific cases it can stretch its legs all the way, using an extra 100 W for a 100 MHz bump for those synthetic benchmark scores.
Same goes for the 600 W limit some models have, really pushing the power envelope for minor clock gains. Reminder that past a certain point, the power required for each extra bit of performance grows exponentially.

Both my 3090s have a default power limit of 370 W, whereas at 275 W I lose less than 10% perf.

Here's a simple example of power scaling for some AI workloads on a 3090; you can see that past a certain point you barely get any extra performance from increasing power:
What that diagram tells me is that the 3090 should be a 250-260 Watt card. There is no need for it to eat more than that out-of-the-box. Overclockers would be happy with that, too.
Posted on Reply
#100
igormp
AusWolfWhat that diagram tells me is that the 3090 should be a 250-260 Watt card. There is no need for it to eat more than that out-of-the-box. Overclockers would be happy with that, too.
Haven't you noticed how overclocking has diminished in popularity lately? Manufacturers are pushing components with higher clocks (and power, as a consequence) out of the box to try and get an edge and bigger numbers for marketing reasons.
Consumers have already shown they don't care about sane power consumption; they want that extra performance out of the box. Just look at what happened to the 9000 series from AMD, where they had to push a BIOS with a higher default TDP to appease their customers. Or Intel, where most people didn't give a damn about the great efficiency vs. the previous gen.
Posted on Reply