Friday, January 3rd 2025

NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

According to two of the most accurate leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Previous rumors pointed to 600 W and 400 W TGP (total graphics power) figures for these SKUs; TGP describes the power drawn by the entire board, including memory and everything else. TDP (thermal design power), on the other hand, is a more specific value attributed to the GPU die of the SKU in question. According to the latest leaks, 575 W are dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while 25 W are reserved for GDDR7 memory and other components on the PCB.

For the RTX 5080, the GB203-400-A1 chip alone supposedly draws 360 W, while 40 W are set aside for GDDR7 memory and other components on the PCB. The lower-end RTX 5080's memory and board allotment is larger than the RTX 5090's (40 W versus 25 W) because its GDDR7 memory modules reportedly run at 30 Gbps, while the RTX 5090 uses GDDR7 modules at 28 Gbps. The RTX 5090 does use more (or higher-capacity) modules, but first-generation GDDR7 memory could require disproportionately more power to reach the 30 Gbps threshold, hence the larger allotment. Future GDDR7 iterations could reach higher speeds without drawing much more power.
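To make the arithmetic explicit, here is a minimal sketch (in Python) of how the leaked TGP and TDP figures relate; the numbers are the rumored ones from the leak, not official specifications:

```python
# Rumored Blackwell power budgets (leaked figures, not official specs).
# TGP (total graphics power) covers the whole board; TDP covers the GPU die.
cards = {
    "RTX 5090 (GB202-300-A1)": {"tgp_w": 600, "gpu_tdp_w": 575},
    "RTX 5080 (GB203-400-A1)": {"tgp_w": 400, "gpu_tdp_w": 360},
}

for name, p in cards.items():
    memory_and_board_w = p["tgp_w"] - p["gpu_tdp_w"]
    print(f"{name}: {p['tgp_w']} W TGP = {p['gpu_tdp_w']} W GPU die "
          f"+ {memory_and_board_w} W memory/board")
# RTX 5090: 600 W TGP = 575 W GPU die + 25 W memory/board
# RTX 5080: 400 W TGP = 360 W GPU die + 40 W memory/board
```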
Sources: hongxing2020 and kopite7kimi, via VideoCardz

206 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

#51
freeagent
AusWolfIt's totally excessive and unnecessary, just like a 4090 is for most people.
Why does it matter to you though?

You made your point about sticking with AMD midrange and Linux..

5090 isn't made for "gamers"
Posted on Reply
#52
JustBenching
DavenThe 5090 is almost double the power usage of my 7900XT which already heats up my tiny gaming room. The room would be hotter than a sauna with the 5090!
In the same game, with the same settings and the same fps, the 5090 will likely need half the power of your 7900 XT though. That's what matters when it comes to power; it's called efficiency.
Posted on Reply
#53
Hecate91
DavenThe 5090 is almost double the power usage of my 7900XT which already heats up my tiny gaming room. The room would be hotter than a sauna with the 5090!
A 600 W GPU is ridiculous unless it's undervolted, but I expect 5090 buyers won't care and can probably afford the AC to handle the heat load. It's also silly to me how the 5090 might have 2x 16-pin connectors, detracting from the whole point of having a more "convenient" power connector.
JustBenchingTo me it's like asking "who's gonna buy a TV that comes with 100% brightness out of the box".
Honestly, probably most people do. However, with a graphics card I shouldn't need to undervolt and tweak the power usage out of the box; manufacturers are pushing parts too hard if I have to undervolt things right away in order for the power use to be reasonable.
Posted on Reply
#54
AusWolf
Bomby569just pick any game and see the frame rate drop when you exit a building or enter a large open world area
That's a .1% hitch. Nothing to write home about, imo.
freeagentWhy does it matter to you though?
It doesn't. I answered a question. The hostility... Geez. :confused:

Should everybody who doesn't intend to buy a 5090 just shut the F up and leave? :confused:
Posted on Reply
#55
Bomby569
Hecate91It's also silly to me how the 5090 might have 2x 16 pin connectors, retracting from the whole point of having a more "convenient" power connector.
it is more convenient than 4 or 5 × 8-pin connectors :D
Posted on Reply
#56
freeagent
AusWolfThe hostility...
None at all mate, I was just curious that's all :)
Posted on Reply
#57
JustBenching
Hecate91Honestly, probably most people do. However with a graphics cards I shouldn't need to undervolt and tweak the power usage out of the box, manufacturers are pushing parts too hard if I have to undervolt things right away in order for the power use to be reasonable.
Nobody said anything about undervolting. I agree you shouldn't need to undervolt out of the box, but you don't need to. But power usage, why not? Nvidia, AMD, Intel, and whatever other company exists CANNOT, by definition, know how you want to use a product better than you do; therefore they cannot ship it with settings that fit your needs. The same applies to most products, be it TVs, ACs, monitors, etc.

Some GPUs have a dual-BIOS switch for noise / performance, but that's still just 2 options; it can't cast a wide enough net. The same way that laptops come with turbo / silent / performance modes etc. Should I be complaining that my laptop came set to turbo out of the box? Who cares, it takes a second to change it.
Posted on Reply
#58
AusWolf
freeagentNone at all mate, I was just curious that's all :)
Then please be careful with words. "What does it matter to you" and "you made your point" sure sound hostile.

Sometimes, people come here out of some genuine interest for tech, not because they have a personal stake in the matter. Not everybody in a 5090 thread is a potential buyer. Sometimes, one is just curious. :)
Posted on Reply
#59
freeagent
I would let it receive the full glory of 600 W. If that is TGP, then it's only ~200 W more than what my Ti is doing right now at full load.
AusWolfThen please be careful with words. "What does it matter to you" and "you made your point" sure sound hostile.

Sometimes, people come here out of some genuine interest for tech, not because they have a personal stake in the matter. Not everybody in a 5090 thread is a potential buyer. Sometimes, one is just curious. :)
I get it..
Posted on Reply
#60
AusWolf
JustBenchingNobody said anything about undervolting. I agree you shouldn't need to undervolt out of the box, but you don't need to. But power usage, why not? Nvidia, AMD, Intel, and whatever other company exists CANNOT, by definition, know how you want to use a product better than you do; therefore they cannot ship it with settings that fit your needs. The same applies to most products, be it TVs, ACs, monitors, etc.

Some GPUs have a dual-BIOS switch for noise / performance, but that's still just 2 options; it can't cast a wide enough net. The same way that laptops come with turbo / silent / performance modes etc. Should I be complaining that my laptop came set to turbo out of the box? Who cares, it takes a second to change it.
Like I said in another thread, I'd much rather spend less on a card that uses less power by default. Choosing the more expensive option and then limiting it by software sounds excessive to me.

But each to their own.
Posted on Reply
#61
Hecate91
AusWolfThen please be careful with words. "What does it matter to you" and "you made your point" sure sound hostile.

Sometimes, people come here out of some genuine interest for tech, not because they have a personal stake in the matter. Not everybody in a 5090 thread is a potential buyer. Sometimes, one is just curious. :)
For not caring, it sounds a lot like they do, to be honest. Everyone here has their own opinions, unless these threads should be Nvidia users only lol.

But the 5090 is for "gamers" as it has the GeForce branding, and it definitely works, as gamers with deep wallets think they need the latest flagship.
Bomby569it's is more convenient than a 4 or 5 * 8 pin connector :D
A 5090 wouldn't need more than three of the 8-pin connectors; the 8-pin Molex connector can handle a lot more than what it's rated for.
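As a rough sanity check on that, the nominal ratings work out as follows (a back-of-the-envelope sketch using the standard 150 W rating per 8-pin PCIe connector and 75 W from the slot; real connectors have headroom beyond these figures, which is the point being made above):

```python
# Nominal PCIe power ratings (per spec): each 8-pin connector is 150 W,
# and the x16 slot supplies up to 75 W.
SLOT_W = 75
EIGHT_PIN_W = 150

for n in (3, 4):
    print(f"{n}x 8-pin + slot = {SLOT_W + n * EIGHT_PIN_W} W nominal")
# 3x 8-pin + slot = 525 W nominal
# 4x 8-pin + slot = 675 W nominal
```

At the nominal ratings, three connectors plus the slot fall just short of the rumored 575 W TDP, so the argument does hinge on the connectors' real-world headroom.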
Posted on Reply
#62
freeagent
AusWolfLike I said in another thread, I'd much rather spend less on a card that uses less power by default. Choosing the more expensive option and then limiting it by software sounds excessive to me.

But each to their own.
But this thread is about the most ridiculous card that money can buy. It is supposed to be excessive, that is what you are paying for.. The best.
Posted on Reply
#63
AusWolf
Hecate91For not caring, its sounds a lot like they do to be honest. Everyone here has their own opinions, unless these threads should be nvidia users only lol.
Should I write "in my opinion" in front of every post I make? :confused:

I am an Nvidia user, by the way, just not in my main gaming rig at the moment. I've got two HTPCs that both have Nvidia GPUs in them. Does that make me more qualified to comment here?
Hecate91But the 5090 is for "gamers" as it has the Geforce branding, it definitely works as gamers with deep wallets think they need the latest flagship.
Let me disagree there. The 5090 has double of everything compared to the 5080 (shaders, VRAM, etc.), which is already gonna be a stupidly expensive card. The 5090 is only GeForce by name, to sell it to gamers. But it is not a card that your average gamer needs. Otherwise, there wouldn't be such a gigantic gap between it and the 5080 in specs.
Posted on Reply
#64
JustBenching
AusWolfLike I said in another thread, I'd much rather spend less on a card that uses less power by default. Choosing the more expensive option and then limiting it by software sounds excessive to me.

But each to their own.
So if nvidia was selling a 5070 at 200w you'd buy one, but if it's shipping it at 500 watts you wouldn't (even though you can limit it to 200w) and you'd rather buy a 5060....it doesn't make sense to me.
Posted on Reply
#65
Daven
JustBenchingIn the same game, with the same settings and the same fps, the 5090 will likely need half the power of your 7900 XT though. That's what matters when it comes to power; it's called efficiency.
And I can run an iGPU to play Tetris FTW. Some Intel fans tried to play the ISO card over and over but were ultimately ratioed out of the comments.

No one is buying a $2000 GPU to run it at the speed of a $500 GPU. There's a reason review sites measure gaming power at max settings and resolution: that is how the product will be used by the vast majority of users.

It seems that every comment to do with Nvidia relates to poor performance (RT), blurry image quality (DLSS) and now lower settings / limiting power for even worse performance and image quality.

What the hell is going on with Nvidia users?!?!
Posted on Reply
#66
AusWolf
freeagentBut this thread is about the most ridiculous card that money can buy. It is supposed to be excessive, that is what you are paying for.. The best.
Totally. :) And that's what makes it not made for the average gamer, imo.
JustBenchingSo if nvidia was selling a 5070 at 200w you'd buy one, but if it's shipping it at 500 watts you wouldn't (even though you can limit it to 200w) and you'd rather buy a 5060....it doesn't make sense to me.
As long as I find the performance of the 5060 acceptable, yes. Spending more money on something that I'm not fully using is what doesn't make sense to me.
Posted on Reply
#67
JustBenching
DavenAnd I can run an iGPU to play Tetris FTW. Some Intel fans tried to play the ISO card over and over but were ultimately ratioed out of the comments.

No one is buying a $2000 GPU to run it at the speed of a $500 GPU. There’s a reason review sites measure power because that is how the product will be used by the vast majority of users.
An iGPU will be much slower than your 7900 XT.

Of course no one is buying a $2,000 GPU to run it at the speed of a $500 GPU; nobody argued that. I'm arguing that at the same power as your 7900 XT it will be vastly faster (and it should be), so I don't get the notion of complaining about its power.
AusWolfAs long as I find the performance of the 5060 acceptable, yes.
Well, if you find the performance of the 5060 acceptable, you wouldn't even be looking at the 5070 in the first place, regardless of the power draw. Anyways, it just doesn't make sense to me, but whatever, I'm not the arbiter of what makes sense.
Posted on Reply
#68
AusWolf
JustBenchingWell, if you find the performance of the 5060 acceptable, you wouldn't even be looking at the 5070 in the first place, regardless of the power draw.
Why wouldn't I? What's wrong with getting a clear picture of the full market before buying something?

Don't take it personally, but this is the difference between a value-conscious buyer and a moron, imo. A value-conscious buyer looks at every option and chooses the one that is the closest to satisfying their needs at their given budget, while a moron buys the most expensive shit available without thinking about it.
Posted on Reply
#69
Daven
JustBenchingAn iGPU will be much slower than your 7900 XT.

Of course no one is buying a $2,000 GPU to run it at the speed of a $500 GPU; nobody argued that. I'm arguing that at the same power as your 7900 XT it will be vastly faster (and it should be), so I don't get the notion of complaining about its power.

Well, if you find the performance of the 5060 acceptable, you wouldn't even be looking at the 5070 in the first place, regardless of the power draw. Anyways, it just doesn't make sense to me, but whatever, I'm not the arbiter of what makes sense.
Okay guy, buy a 5090 for $2000+ and run it at half power for the entire time you own it. I hope it’s worth it.

Btw, the 4090 is only 5% more efficient than a 7900XTX at raster and the 5090 at best will be 30% faster for almost 50% higher power than the 4090.
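For what it's worth, turning those rumored figures into a perf-per-watt comparison is simple arithmetic (a hypothetical calculation based on the "30% faster for almost 50% more power" numbers above, not on measured data):

```python
# Hypothetical perf/W based on the rumored figures quoted above.
perf_ratio = 1.30    # 5090 rumored to be ~30% faster than the 4090
power_ratio = 1.50   # ...at almost 50% higher power

efficiency_vs_4090 = perf_ratio / power_ratio
print(f"Relative perf-per-watt vs the 4090: {efficiency_vs_4090:.2f}x")  # ~0.87x
```

On those numbers, the 5090 would deliver roughly 13% less performance per watt than the 4090 at stock settings.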
Posted on Reply
#70
JustBenching
DavenOkay guy, buy a 5090 for $2000+ and run it at half power for the entire time you own it. I hope it’s worth it.
Don't know if I'm buying one, but if I do, it's going to be locked to 320 W, exactly like my 4090.
Posted on Reply
#71
igormp
AusWolfMore so, the 5090 won't use the fully enabled version, either. 5090 Ti coming later, perhaps? Or will it be reserved for industrial cards?
I believe it's the latter, just like what happened to the 4090 (it wasn't the fully enabled die either).
No reason for a Ti version.
RedelZaVednoWho's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43°C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 Ti SUPER atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
I personally just power limit my GPUs. Both my 3090s run at 275W each.
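For anyone wondering how that kind of power limiting is done in practice, here is a minimal sketch that drives NVIDIA's nvidia-smi tool from Python; it assumes an NVIDIA GPU with the standard driver tools installed, and the 275 W figure simply mirrors the value mentioned above (setting a limit needs admin/root rights and does not persist across reboots):

```python
import subprocess

# Show the current, default, and maximum allowed power limits (read-only).
subprocess.run(
    ["nvidia-smi",
     "--query-gpu=power.limit,power.default_limit,power.max_limit",
     "--format=csv"],
    check=True,
)

# Cap GPU 0 at 275 W (requires admin/root; the limit resets after a reboot).
subprocess.run(["nvidia-smi", "-i", "0", "-pl", "275"], check=True)
```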
Posted on Reply
#72
gridracedriver
Good for the 5080, but it's also expected that the 5080 will be ~4090 in performance.
Nearly 600 W for the 5090 at stock is bad...
Posted on Reply
#73
Daven
JustBenchingDon't know if I'm buying one, but if I do, it's going to be locked to 320 W, exactly like my 4090.
Just buy a 5080 and save $1,000+. The performance of a 5090 at 320 W and a 5080 at 360 W is going to be about the same. Maybe, and this is a big maybe, the 5090 will be a little faster, but don't forget that Nvidia is using the same node as the 4000 series. This means the efficiency of the 5000 series will go down as more transistors are added.

This is a buyer beware situation and no company logo on the box beats physics.
Posted on Reply
#74
R0H1T
Icon CharlieYou know...

My Sharp II Carousel MicroWave Runs on 400 Watts...

Just saying...
But can it bend space/time & allow you to do interstellar travel? :slap:
Posted on Reply
#75
Bobaganoosh
This looks like just one more point against the claims of the 5080 being faster than the 4090. It's the same process node, only 65% of the cores, a lower power rating, lower memory bandwidth, less memory... the only way this even competes with the 4090 is if there's a new DLSS tech or if they made the 5000 series better at frame-gen and that's how they're compared. Raw raster power, no way.
Posted on Reply