Friday, January 3rd 2025

NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

According to kopite7kimi and hongxing2020, two of the most reliable leakers, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Earlier rumors pointed to 600 W and 400 W TGPs (total graphics power), a figure that covers the entire board, meaning the GPU together with its memory and everything else on the PCB. TDP (thermal design power), by contrast, is a more specific value attributed to the GPU die of the SKU in question. According to the latest leaks, 575 W are dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while 25 W are left for GDDR7 memory and other components on the PCB.

For the RTX 5080, the GB203-400-A1 chip alone supposedly draws 360 W, while 40 W are set aside for GDDR7 memory and other components on the PCB. The lower-end RTX 5080 reserves more power for its memory subsystem than the RTX 5090 (40 W versus 25 W) because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090's run at 28 Gbps. The RTX 5090 does use more (or higher-capacity) memory modules, but first-generation GDDR7 likely needs extra power to reach the 30 Gbps mark, hence the larger budget. Future GDDR7 iterations should hit higher speeds without a comparable increase in power.
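
Expressed as simple arithmetic, the leaked figures imply the following split between die power and the rest of the board. A minimal sketch in Python, using only the numbers quoted above:

# Rumored board power (TGP) vs. die power (TDP) from the leaks above;
# the difference is the budget left for GDDR7 and other PCB components.
cards = {
    "RTX 5090 (GB202-300-A1)": {"tgp_w": 600, "tdp_w": 575},
    "RTX 5080 (GB203-400-A1)": {"tgp_w": 400, "tdp_w": 360},
}

for name, power in cards.items():
    rest_w = power["tgp_w"] - power["tdp_w"]  # memory + board share
    print(f"{name}: die {power['tdp_w']} W, memory/board {rest_w} W")

This prints a 25 W memory/board share for the RTX 5090 and 40 W for the RTX 5080, matching the leak.
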
Sources: hongxing2020 and kopite7kimi, via VideoCardz

207 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

#101
AusWolf
igormpHaven't you noticed how overclocking has diminished in popularity lately? Manufacturers are pushing components with higher clocks (and power, as a consequence) out of the box to try and get an edge and bigger numbers for marketing reasons.
Except that it doesn't bring bigger numbers. 71 vs 73 FPS is not a difference in my books. It's just margin of error. This is why overclocking is dead. Everything is overclocked out-of-the-box for no reason.
igormpConsumers have already shown they don't care about sane power consumption, they want that extra performance out of the box. Just look at what happened to the 9000 series from AMD, where they had to push for a bios with a higher default TDP to appease their consumers. Or Intel, where most people didn't give a damn about the great efficiency vs the previous gen.
I see that, but I still find it weird, especially with all the propaganda about going green by using less power and stuff. Where is all that greenness in the home PC industry? You're constantly being reminded to use LED lights in your home and turn them off to save 5 watts, only so that your high-end GPU can waste another 100 W on nothing? :confused:
#102
SOAREVERSOR
AusWolfThat's awesome! :) Sometimes it's nice to be proven wrong. :ohwell:

It raises the question, though: why does the 4090 have to be a 450 W card by default if the extra power doesn't bring any extra performance to the table? What is Nvidia aiming at with such high power consumption?
The actual workloads it's intended for. The x090 series cards are spec'd and priced for those. They aren't gaming products.
#103
igormp
AusWolfExcept that it doesn't bring bigger numbers. 71 vs 73 FPS is not a difference in my books. It's just margin of error. This is why overclocking is dead. Everything is overclocked out-of-the-box for no reason.
People are really willing to go to great lengths for that extra 2~5% performance; it is what it is.
AusWolfI see that, but I still find it weird, especially with all the propaganda about going green by using less power and stuff. Where is all that greenness in the home PC industry? You're constantly being reminded to use LED lights in your home and turn them off to save 5 watts, only so that your high-end GPU can waste another 100 W on nothing? :confused:
Eh, I do have opinions on that, but I guess it'd be pretty off-topic.
SOAREVERSORThe actual workloads it's intended for. The x090 series cards are spec'd and priced for those. They aren't gaming products.
Most prosumers will actually power limit those GPUs. Just look at the TDP of their enterprise lineup; it's way lower than that of their GeForce counterparts, since perf/watt is an important metric in that space.
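
As an illustration, that kind of power cap can be applied programmatically. A minimal sketch, assuming the nvidia-ml-py (pynvml) bindings; the 300 W figure is an arbitrary example, not a recommendation, and setting the limit requires root/admin rights:

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Current limit and the range the driver allows, both in milliwatts
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"limit {current_mw / 1000:.0f} W, allowed {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W")

# Cap the board at 300 W (illustrative value; needs root/admin)
pynvml.nvmlDeviceSetPowerManagementLimit(handle, 300_000)

pynvml.nvmlShutdown()
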
#104
TheoneandonlyMrK
igormpSo that for some very specific cases it can stretch its legs all the way, using an extra 100 W for a 100 MHz bump for those synthetic benchmark scores.
Same goes for the 600 W limit some models have, really pushing the power envelope for minor clock gains. Reminder that past a certain point, each extra bit of performance costs disproportionately more power.

Both my 3090s have a default power limit of 370 W, whereas at 275 W I lose less than 10% perf.

Here's a simple example of power scaling for some AI workloads on a 3090; you can see that past a certain point you barely get any extra performance from increasing power:

benchmarks.andromeda.computer/videos/3090-power-limit?suite=language

That has been the case since... always. Here's another example with a 2080ti:

timdettmers.com/2023/01/30/which-gpu-for-deep-learning/#Power_Limiting_An_Elegant_Solution_to_Solve_the_Power_Problem

Games often don't really push a GPU that hard, so the consumption while playing is usually well below the actual limit.
Biscuits. Path tracing etc., i.e. the whole point of buying too much GPU at the moment, pushes a GPU just fine. Or do people buy these over-the-top-priced cards just for ray-traced Fortnite and Roblox?

Games now often do push high loads.
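
The diminishing-returns curve igormp describes in the quote above is easy to chart yourself: sweep the power limit and time a fixed workload at each step. A rough sketch with pynvml, where run_workload() is a placeholder for whatever benchmark you care about:

import pynvml

def run_workload() -> float:
    # Placeholder: run a fixed benchmark and return a score (FPS, items/s).
    return 0.0

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)

try:
    for watts in (250, 300, 350, 400, 450):  # example steps for a ~450 W card
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, watts * 1000)  # needs root
        print(f"{watts} W -> {run_workload():.1f}")
finally:
    # Restore the stock limit and clean up
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, default_mw)
    pynvml.nvmlShutdown()

Past the knee of the curve the score barely moves while the wattage keeps climbing, which is exactly what the 3090 and 2080 Ti links above show.
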
#105
cerulliber
JustBenchingBut I'd also run the 5080 at 320 W, and so the performance difference will still be whatever it ends up being.
The problem is that the 5080's VRAM is already coming up short above 2K in Star Wars Outlaws with its hidden "secret" settings.
I don't know what games will use over 16 GB of VRAM maxed out, or what insane textures could fill 32 GB of VRAM. The Witcher 4 and GTA VI might be among them; no word on AC Shadows and Hexe yet.
And no, an underpowered 5080 is not the same as a 5090, because under the hood it's a totally different card. This is why overclocking won't make a huge difference except in benchmarks.
And seriously, anyone buying a 5090 should water-cool it; summers keep getting hotter, way past the 2-degree climate target.
#106
R0H1T
SOAREVERSORThe actual workloads it's intended for.
That's debatable; the workloads you're talking about would still be VRAM-limited. Unless you meant selling them in China, bypassing some of the usual sanctions?
SOAREVERSORThe x090 series cards are spec'd and priced for those.
They really aren't; the xx90 cards are neither here nor there, IMO.
#107
AusWolf
cerulliberThe problem is that the 5080's VRAM is already coming up short above 2K in Star Wars Outlaws with its hidden "secret" settings.
I don't know what games will use over 16 GB of VRAM maxed out, or what insane textures could fill 32 GB of VRAM. The Witcher 4 and GTA VI might be among them; no word on AC Shadows and Hexe yet.
As long as the latest-gen consoles have 16 GB of RAM (not VRAM; total system RAM that also serves as VRAM), I'd be cautious about making predictions on this front.
#108
Legacy-ZA
AusWolfExcept that it doesn't bring bigger numbers. 71 vs 73 FPS is not a difference in my books. It's just margin of error. This is why overclocking is dead. Everything is overclocked out-of-the-box for no reason.


I see that, but I still find it weird, especially with all the propaganda about going green by using less power and stuff. Where is all that greenness in the home PC industry? You're constantly being reminded to use LED lights in your home and turn them off to save 5 watts, only so that your high-end GPU can waste another 100 W on nothing? :confused:
Yes, that's why I also no longer buy OC models unless they come with dual BIOS or better VRMs/phases, etc. It makes no sense to pay more for what, a name and maybe 2% more performance? lol
#109
AusWolf
Legacy-ZAYes, that's why I also no longer buy OC models unless they come with dual BIOS or better VRMs/phases, etc. It makes no sense to pay more for what, a name and maybe 2% more performance? lol
Not to mention made-by-AMD and Nvidia FE cards look so much better than all the plastic bling-bling gamery shit made by board partners! :rolleyes:
#110
JustBenching
Legacy-ZAYes, that's why I also no longer buy OC models unless they come with dual BIOS or better VRMs/phases, etc. It makes no sense to pay more for what, a name and maybe 2% more performance? lol
I wish better models had 2% more performance :D

They don't; just look at their clock speeds, they are all within 0.01% of each other. Higher-end models just have better PCBs, power delivery, etc.
AusWolfNot to mention made-by-AMD and Nvidia FE cards look so much better than all the plastic bling-bling gamery shit made by board partners! :rolleyes:
FE was nice when it was used on smaller models (I got a 3060 Ti FE, it's really nice), but look at the 4090 FE; it's ugly as hell. There are a few designs that still look decent. It's usually the high end that goes bling-bling gamerz OC RGB, while the base models are nice; check the 4090 Windforce, for example.
#111
igormp
TheoneandonlyMrKBiscuits. Path tracing etc., i.e. the whole point of buying too much GPU at the moment, pushes a GPU just fine. Or do people buy these over-the-top-priced cards just for ray-traced Fortnite and Roblox?

Games now often do push high loads.
I don't think that's the case; just take a look at Steam's hardware survey and see how the 3090 and 4090 fare in the list, way below the other products.
Most gamers won't be buying those products. Reminder that this forum is a niche with some enthusiasts who can afford it, but your average buyer won't even consider that product as an option.
R0H1TThat's debatable; the workloads you're talking about would still be VRAM-limited.
And that's why you buy multiple of those :p
#112
AusWolf
igormpI don't think that's the case; just take a look at Steam's hardware survey and see how the 3090 and 4090 fare in the list, way below the other products.
Most gamers won't be buying those products. Reminder that this forum is a niche with some enthusiasts who can afford it, but your average buyer won't even consider that product as an option.
The x90 is a niche even on this forum, I'd dare to say. It's for professionals and 4K high-refresh gamers.
#113
GodisanAtheist
Undervolting is the new overclocking. You get a card juiced way past the efficient part of the V/F curve out of the box, and the game is now to keep 95% of the performance at 70% of the power.

It's honestly for the best this way. Overclocking was fun (especially when you had to draw your own traces), but most buyers were fundamentally getting ripped off, never getting all of the performance out of the card they paid for.

Now you get all the performance, and it's on you to find the power/heat level that you're OK with.
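
For what it's worth, GeForce undervolting proper is done by editing the voltage/frequency curve in tools like MSI Afterburner; NVML exposes no direct voltage knob. A rough Linux-side proxy is to pin clocks into an efficient band and trim the power cap, sketched below with pynvml (the clock and power values are illustrative, not tuned recommendations):

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Keep the core inside an efficient clock band and lower the power cap.
# Both calls need root; the numbers are placeholders to tune per card.
pynvml.nvmlDeviceSetGpuLockedClocks(handle, 210, 2400)     # min/max MHz
pynvml.nvmlDeviceSetPowerManagementLimit(handle, 300_000)  # 300 W in mW

# ... run your game or benchmark here ...

pynvml.nvmlDeviceResetGpuLockedClocks(handle)  # hand control back to the driver
pynvml.nvmlShutdown()
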
#114
AusWolf
GodisanAtheistIt's honestly for the best this way. Overclocking was fun (especially when you had to draw your own traces), but most buyers were fundamentally getting ripped off, never getting all of the performance out of the card they paid for.
I kind of agree and disagree. I like using my card to its full potential, but 1. I'm not keen on the top 5% at all costs, and 2. we're constantly being fed that the planet needs to be saved, we're using too much energy and whatnot, so then why do GPUs have to consume hundreds of Watts more out-of-the-box for the extra 5% performance? Is it really worth it? I'm not sure it is.
#115
dyonoctis
R0H1TThat's debatable; the workloads you're talking about would still be VRAM-limited. Unless you meant selling them in China, bypassing some of the usual sanctions?

They really aren't; the xx90 cards are neither here nor there, IMO.
In art and design, there are lots of freelancers who prefer the GeForce RTX xx90 over the workstation RTX cards. CAD/scientific simulation users are pretty much the only crowd I've really seen insist on those professional cards; otherwise, I've seen professionals say that in their domain, the ROI is simply higher with a GeForce. And if a project really needs more than that, they would use a cloud farm anyway (and charge the client accordingly).
#117
GodisanAtheist
AusWolfI kind of agree and disagree. I like using my card to its full potential, but 1. I'm not keen on the top 5% at all costs, and 2. we're constantly being fed that the planet needs to be saved, we're using too much energy and whatnot, so then why do GPUs have to consume hundreds of Watts more out-of-the-box for the extra 5% performance? Is it really worth it? I'm not sure it is.
- Could just go with the accelerationism approach and try to burn down the planet faster... better a quick death than a long and drawn out battle against the inevitable :rockout:

But that's the beauty of undervolting... you don't have to get that last 5% at all costs, you can reduce power consumption and your own carbon footprint. The choice has simply shifted from having to gain performance to having to conserve power and heat, other side of the same coin.

I can't speak to newer Nvidia cards, but AMD definitely has some one-click-and-done settings in their software that will bias the card one way or another, so it doesn't even require a whole lot of tweaking and noodling to get decent results.
#118
AusWolf
GodisanAtheist- Could just go with the accelerationism approach and try to burn down the planet faster... better a quick death than a long and drawn out battle against the inevitable :rockout:
Interesting thought. I wouldn't say I entirely disagree, but let's not go there for the sake of other forum members' sanity. :D
GodisanAtheistBut that's the beauty of undervolting... you don't have to get that last 5% at all costs, you can reduce power consumption and your own carbon footprint. The choice has simply shifted from having to gain performance to having to conserve power and heat, other side of the same coin.
Sure, but how many people do that? I'd bet at least 9 out of 10 people just plug their cards in and let it run full blast. What we see here in the forum is a tiny minority.
GodisanAtheistI can't speak to newer Nvidia cards, but AMD definitely has some one-click-and-done settings in their software that will bias the card one way or another, so it doesn't even require a whole lot of tweaking and noodling to get decent results.
It's not that great. The last time I tried the "auto undervolt" button (if that's what you mean) when I briefly had a 7800 XT on my hands, it shaved maybe 10 W off of it. The power slider works nicely, though.

Does Nvidia have an equivalent function in their new software? I haven't tried it yet (my HTPCs are running old drivers at the moment).
#119
Visible Noise
igormpYou seem to be under the assumption that performance scales linearly with power.
A 5090 at a lower power budget than a 5080 is still going to have almost double the memory bandwidth, and way more cores, even if those are clocked lower.

Your assumptions are also wrong. A 5090 at 320W is likely to only be 10~20% slower than the stock setting.
The 5080 math is also not that simple because things (sadly) often do not scale linearly like that.
Example: my 4090 at a 90% power limit loses 2% performance.
AusWolfand people with more money than brain.
That's where I have a problem with your thinking. Who are you to judge how I spend my money, what my values are, and what I derive pleasure from?

Seriously, just fuck off with this bullshit attitude. I thought this was an enthusiast website.

Just as an FYI, I upgraded from a GTX 960. The reason I can afford a 4090 is that I saved the money by not upgrading constantly.
#120
3valatzy
SOAREVERSORThe actual workloads it's intended for. The x090 series cards are spec'd and priced for those. They aren't gaming products.
Nvidia has other products for non-gaming purposes. They are called Quadro, Tesla, and the like.
Those are very bad for gaming, while the x090 cards are the highest-performing ones for gaming. They are gaming cards.
#121
AusWolf
Visible NoiseThat's where I have a problem with your thinking. Who are you to judge how I spend my money, what my values are, and what I derive pleasure from?

Seriously, just fuck off with this bullshit attitude. I thought this was an enthusiast website.

Just as an FYI, I upgraded from a GTX 960. The reason I can afford a 4090 is that I saved the money by not upgrading constantly.
I listed several (even legitimate) use cases for the 4090. Did I say which kind of buyer you personally were? ;)

Also, how do you define an "enthusiast"?
3valatzyNvidia has other products for non-gaming purposes. They are called Quadro, Tesla, and the like.
There's no more Quadro, just RTX Axxx. And who said they're not good at gaming?
3valatzyThose are very bad for gaming, while the x090 cards are the highest-performing ones for gaming. They are gaming cards.
That's pure marketing, not the full picture.
#122
ThomasK
Visible NoiseSeriously, just fuck off with this bullshit attitude. I thought this was an enthusiast website.
Judging by your own comments here, I can safely say that it is not.
#124
AusWolf
3valatzyMe. You know it, but... ?

buildapc/comments/167yyqu
Like the TL;DR there says, they can be used for gaming.

I still don't think an x90 card is meant for gaming, purely because of the insane specs and price.
#125
igormp
3valatzyNvidia has other products for non-gaming purposes. They are called Quadro, Tesla, and the like.
Those are very bad for gaming, while the x090 cards are the highest-performing ones for gaming. They are gaming cards.
Are you aware that TONS of people buy those for productivity, not for games? They have a way better price than the enterprise lineup, get the job done, and come with no restrictions when placed in a DIY setup. I'm one such user.
AusWolfI listed several (even legitimate) use cases for the 4090. Did I say which kind of buyer you personally were? ;)
The way you phrase things does sound aggressive and judgy nonetheless, and I believe that's what leads to many of the rude answers you often get as well.
3valatzyMe. You know it, but... ?
They often have lower clocks and power limits (for better power efficiency), but will work just fine for games.
Most of that argument you got off Reddit is based on price-to-performance, which is valid, but it doesn't mean the product won't be able to run games (and even do it well!)
AusWolfI still don't think an x90 card is meant for gaming, purely because of the insane specs and price.
Well, it's meant to be used however the customer wants. If they want to use it just as a paperweight, they're free to.