Friday, January 3rd 2025

NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

According to two of the most accurate leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Previous rumors pointed to 600 W and 400 W TGPs for these SKUs. TGP stands for total graphics power, meaning the amount that the entire board, with its memory and everything else, draws. TDP (thermal design power) is a more specific value attributed to the GPU die of the SKU in question. According to the latest leaks, 575 Watts are dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while the remaining 25 Watts go to the GDDR7 memory and other components on the PCB.

For the RTX 5080, the GB203-400-A1 chip supposedly draws 360 Watts on its own, while 40 Watts are set aside for GDDR7 memory and other components on the PCB. The lower-end RTX 5080 reserves more power for its memory than the RTX 5090 because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090 uses GDDR7 modules at 28 Gbps speeds. The RTX 5090 does carry more (or higher-capacity) modules, but first-generation GDDR7 memory could require disproportionately more power to reach the 30 Gbps threshold, hence the larger memory power budget. Future GDDR7 iterations could reach higher speeds without much additional power.
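As a minimal sketch of the arithmetic behind these leaks (all figures are rumored, not confirmed by NVIDIA):

```python
# Power budget arithmetic from the leaked figures (rumored, not official).
# TGP (total graphics power) covers the whole board;
# TDP here is the share attributed to the GPU die alone.

leaked_budgets_w = {
    # SKU: (GPU die TDP, memory + other board components)
    "RTX 5090 (GB202-300-A1)": (575, 25),
    "RTX 5080 (GB203-400-A1)": (360, 40),
}

for sku, (die_tdp, board_rest) in leaked_budgets_w.items():
    tgp = die_tdp + board_rest
    print(f"{sku}: {die_tdp} W die + {board_rest} W board = {tgp} W TGP")

# RTX 5090 (GB202-300-A1): 575 W die + 25 W board = 600 W TGP
# RTX 5080 (GB203-400-A1): 360 W die + 40 W board = 400 W TGP
```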
Sources: hongxing2020 and kopite7kimi, via VideoCardz

207 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

#126
Onasi
I think that arguing about what the 5090 is “meant” to be is missing the point a bit. It’s a halo-tier consumer GPU. That’s what it is. Said consumer can use it for gaming, for work (with certain restrictions vis-à-vis the professional line), for mining shitcoins, for OC challenges - whatever. I don’t think that NVIDIA particularly cares as long as they get theirs. Whether or not said SKU makes sense for each individual user’s particular tasks and monetary sensibilities is rather irrelevant to the product itself. It just is. It’s the “best” card one can buy in the consumer space. It justifies its existence by this fact alone.

Think about it this way - is something like a Porsche 911 GT3 RS a fast car, the “best” 911 on offer? Yup. Is it a full-on race car for professionals though? No, it’s still a street-legal vehicle. Does it make sense to buy it for daily driving? Not particularly. Can an average customer with the money to buy one even extract its full potential? Fuck no. Does Porsche really care what you buy it for, or how you justify its existence? Again, no. Because that way lies madness; one could conclude that the only people who SHOULD buy one are already incredibly good drivers who use it on track days. That logic might even make sense, but it isn’t how the world works.
Posted on Reply
#127
SOAREVERSOR
dyonoctisIn art and design, there are lots of freelancers who prefer the GeForce RTX xx90 over the workstation RTX. CAD/scientific simulation users are pretty much the only target I've really seen have a hard-on for those professional cards; otherwise, I've seen professionals say that in their domain, the ROI is just higher with a GeForce. And if a project really needs more than that, they would use a cloud farm anyway (and charge the client accordingly)
For CAD/CAM you need the Quadro drivers. For freelance creators you want GeForce RTX. Hence the creator drivers for GeForce cards that came out a good bit back.

For AI, ML, DL, or CUDA, GeForce is where it's at. Even at the prices the x090 cards (or the Titans before them) sell for, slapping four of them in a box and going from there is cheap and economical. Some systems pack six or eight x090 cards. Freelancers, contractors, businesses, universities, and more go out and buy gobs of these cards, often in premade systems and racks, because it's cheap.

What's justifying the existence of these cards with the specs they have isn't gaming performance, just as the export bans on them aren't out of fear that China will stomp people in CS2 or Overwatch. It's their actual intended role as compute cards.

CUDA changed everything. GeForce is not a gaming brand and hasn't been for a good long time. It's a compute brand. I fully expect the higher end of the range to keep running away in price and performance, as there is still tons more performance wanted and tons more money people are willing to spend. If you doubled the price for 25% more perf, it would still sell out.
Posted on Reply
#128
igormp
SOAREVERSORFreelancers, contractors, businesses, universities, and more go out and buy gobs of these cards, often in premade systems and racks, because it's cheap.
My university has a cluster with a nice mix of 3090s and A100s (40GB version, not the 80GB one, sadly), pretty nice whenever I need more horsepower.
Posted on Reply
#129
TheinsanegamerN
AusWolfThat's exactly what I said
That's what I think, too. If the 5080 isn't enough, I'm not gonna spend double and then limit my 5090 to be only a little bit faster than the 5080. It's a huge waste of money.
But what I'm saying is that a power-limited 5090 won't be "a little bit faster". Much like the 4090: even sacrificing 5-10 percent of its performance, the 4090 is still WAY ahead of the 4080.
AusWolfThen why do we have massive differences between GPUs such as the 5700 XT vs the Vega 64, the former of which was faster with only 62% of the cores, despite having not much of a clock speed difference?
Hmmm... not much? In what universe? RDNA 1 boosts to 1900-ish MHz, compared to just 1500 for Vega. I can tell you, from owning two, that Vega did not maintain anywhere close to 1500 under load either; much closer to 1350, MAYBE 1400 with manual tuning and the fan cranked to maximum. So in the real world, you're looking at closer to a 500 MHz difference in clock speed.

Also, shader core count is not everything. Vega 64 and the 5700 XT had the same ROP count. Vega 56, despite being cut down, was within margin of error of Vega 64 by comparison, suggesting Vega 64 was ROP-limited. Vega 56 had issues too: the HBM's latency was too high to let the GPU fully stretch its legs.

In short: not an arch issue, but multiple design issues that gimped the card's actual potential. Much like the 7900 XTX, Vega 64 should have been a tier higher given its core size and power use, but was held back by design issues.
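For a rough sense of scale, a back-of-the-envelope throughput comparison (using the approximate sustained clocks cited above, which are anecdotal rather than official specs):

```python
# Rough theoretical FP32 throughput: shaders * 2 ops per clock (FMA) * clock.
# Clocks are the approximate sustained figures from the post above, not boost specs.

def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

vega64 = tflops(4096, 1.40)   # Vega 64: ~1400 MHz sustained, per the post above
navi10 = tflops(2560, 1.90)   # 5700 XT: ~1900 MHz boost

print(f"Vega 64: {vega64:.1f} TFLOPS")   # ~11.5 TFLOPS
print(f"5700 XT: {navi10:.1f} TFLOPS")   # ~9.7 TFLOPS
# Vega 64 has ~18% more theoretical compute yet lost in games,
# i.e. the bottleneck was elsewhere (ROPs, HBM latency), not shader count.
```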
Posted on Reply
#130
AusWolf
igormpThe way you phrase things does sound aggressive and judgy nonetheless, and I believe that's what leads to many rude answers you often get as well.
The rude answers come from people who can't be bothered to read my posts fully and properly, and are just looking for a trigger instead of a starter for an educated conversation. It's sad, but hey, humans do human things, right?
igormpWell, it's meant to be used however the customer wants to. If they want to use it just as a paperweight they're free to.
Sure, don't let me stop them. :)

I'm just saying it's a bit too expensive to be used as a paperweight (or a gaming card), imo.
Posted on Reply
#131
freeagent
AusWolfI'm just saying it's a bit too expensive to be used as a paperweight (or a gaming card), imo.
It's a bit too expensive for you and me, but not for the guys who are waiting for this.
Posted on Reply
#132
AusWolf
freeagentIt's a bit too expensive for you and me, but not for the guys who are waiting for this.
That's fine, but the fact that there are regular gamers waiting for this kind of explains why GPUs are so expensive these days (referring to another thread). It's because people want them anyway, so why shouldn't they be expensive, right? ;)
Posted on Reply
#133
INSTG8R
Vanguard Beta Tester
And I thought I went overboard with a 1200 W PSU on my Q6600 build (I intended to go XFire at some point).
Now I'm back down to 850 W.
But the kilowatt-plus PSU is pretty much becoming mandatory…
Posted on Reply
#134
igormp
AusWolfI'm just saying it's a bit too expensive to be used as a paperweight (or a gaming card), imo.
Phrasing it this way is perfectly nice. Calling it stupid or something akin to that is not.
AusWolfThat's fine, but the fact that there are regular gamers waiting for this kind of explains why GPUs are so expensive these days (referring to another thread). It's because people want them anyway, so why shouldn't they be expensive, right? ;)
They are a minority. Even on this forum, x090 owners are a minority, so that's not the main driving factor behind those prices.
INSTG8RAnd I thought I went overboard with a 1200 W PSU on my Q6600 build (I intended to go XFire at some point).
Now I'm back down to 850 W.
But the kilowatt-plus PSU is pretty much becoming mandatory…
I will still be running dual GPUs with my 850W PSU until it dies :p
Posted on Reply
#135
Wirko
AusWolfI'm just saying it's a bit too expensive to be used as a paperweight (or a gaming card), imo.
I think a significant percentage of these cards are dual use. Gaming performance is just a welcome bonus (and that's how the card gets included in Steam stats).
Posted on Reply
#136
Hankieroseman
RedelZaVednoWho's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43 °C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 TiS atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
Too late for that. I'm putting in a bigger fan. "I've got to have more power, Scotty!" said Capt. Kirk. :lovetpu: y'all got me :laugh:
Next year I'm going for aviator oxygen and a leather jacket.
Posted on Reply
#137
Visible Noise
OnasiThink about it this way - is something like a Porsche 911 GT3 RS a fast car, the “best” 911 on offer? Yup. Is it a full-on race car for professionals though? No, it’s still a street-legal vehicle. Does it make sense to buy it for daily driving? Not particularly. Can an average customer with the money to buy one even extract its full potential? Fuck no. Does Porsche really care what you buy it for, or how you justify its existence? Again, no. Because that way lies madness; one could conclude that the only people who SHOULD buy one are already incredibly good drivers who use it on track days. That logic might even make sense, but it isn’t how the world works.
Now I want to play Forza :)
Posted on Reply
#138
x4it3n
A higher TDP was definitely predictable, knowing that NVIDIA is using a pretty similar TSMC node (4 nm). Now let's hope those TDPs behave like the RTX 40 series', almost never reached unless at 4K (or DLSS Quality) at Ultra settings with Ray Tracing or Path Tracing.
Posted on Reply
#139
Dawora
Daven600W, $2000+, 20-30% ….hard, hard, hard pass
20-30%?
This thing is much more than 20-30% faster vs the 4090, if that's what you mean.

And I'm sure you'll pass on it anyway and go buy another AMD low/mid-tier GPU.
RedelZaVednoWho's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43 °C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 TiS atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
You have a very, very poor AC if an extra 575 W overloads it.
We have 4 PCs, all 300 W+, and room temps aren't even rising.

And you know, it's not using 575 W just by sitting in your PC; you need to stress it to 100%.
Posted on Reply
#140
x4it3n
DaworaYou have a very, very poor AC if an extra 575 W overloads it.
We have 4 PCs, all 300 W+, and room temps aren't even rising.

And you know, it's not using 575 W just by sitting in your PC; you need to stress it to 100%.
My 4090 can definitely make my room pretty hot when using the 530W BIOS...
Posted on Reply
#141
cerulliber
Assuming you want 50% usage of the PSU:
9800X3D: 176 W
RTX 5090: 575 W
everything else in the case: 100 W
reserve: 100 W
951 W total power

2x 951 W = 1902 W

Where's that 2 kW PSU? And we're not even talking about SLI/NVLink setups.

Optimised RTX 5090 setup: ~430 W GPU (75% power limit) + 176 W CPU + 200 W others = 806 W; x2 = 1612 W, so a 1600 W PSU, and those are in retailers' shops.
Posted on Reply
#142
robb
DaworaYou have a very, very poor AC if an extra 575 W overloads it.
We have 4 PCs, all 300 W+, and room temps aren't even rising.

And you know, it's not using 575 W just by sitting in your PC; you need to stress it to 100%.
You are laughably full of crap claiming over 1200 watts would not impact the temperature of a room.
Posted on Reply
#143
AusWolf
robbYou are laughably full of crap claiming over 1200 watts would not impact the temperature of a room.
Well, if he dumps 1200 W into his room from his PC and then turns on a 2000 W A/C, of course he doesn't feel a single °C of difference. Only his electricity bill does. :p
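For scale, a minimal back-of-the-envelope sketch (the window A/C capacity range is an assumed typical figure, not from this thread):

```python
# Minimal sketch: a PC's electrical draw ends up almost entirely as heat in the room.
# 1 W = 3.412 BTU/h; a typical window A/C unit is roughly 5000-12000 BTU/h (assumed).

pc_draw_w = 1200                       # four ~300 W PCs, or one very hungry rig
btu_per_hour = pc_draw_w * 3.412

print(f"{pc_draw_w} W of PC load ~ {btu_per_hour:.0f} BTU/h of heat")  # ~4094 BTU/h
# That's on the order of a small window A/C's entire cooling capacity,
# so the room temperature absolutely would rise without extra cooling.
```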
WirkoI think a significant percentage of these cards are dual use. Gaming performance is just a welcome bonus (and that's how the card gets included in Steam stats).
I'm sure that's the intended purpose. But you know people... some of them just can't live without the newest, bestest, most expensivest toy (and then come here only to brag about it).
Posted on Reply
#144
razaron
My PSU recently died, so I "downgraded" to a 1000 W unit. Seeing the TDP, I thought my recent purchase would put the 5090 out of reach, but according to those online PSU calculators I only need 650 W if I slap in a 4090, so 1000 W should be enough. I have a hard time believing that though, lol.

I've not really touched my "luxury spending" account for over a year, so I'm tempted to get the 5090 if it's less than £2k...
I've been on 1600p for 13 years, so I'm eyeing up those 4K 240 Hz OLEDs. Worried about that sweet burn-in, and about NVIDIA vendor-locking DLSS 5 to the 60 series. Imagine spending £2k for that to happen... Will probably do some research after it comes out, second-guess myself, then not spend anything for another year :laugh:
AusWolfIt doesn't. I answered a question. The hostility... Geez. :confused:

Should everybody who doesn't intend to buy a 5090 just shut the F up and leave? :confused:
Something about pots and kettles. You sort of spent the first page harassing someone for being "a slave to buying things", followed by passive-aggressive remarks about said slavery and America...
Posted on Reply
#145
CyberPomPom
cerulliberAssuming you want 50% usage of the PSU:
9800X3D: 176 W
RTX 5090: 575 W
everything else in the case: 100 W
reserve: 100 W
951 W total power

2x 951 W = 1902 W

Where's that 2 kW PSU? And we're not even talking about SLI/NVLink setups.

Optimised RTX 5090 setup: ~430 W GPU (75% power limit) + 176 W CPU + 200 W others = 806 W; x2 = 1612 W, so a 1600 W PSU, and those are in retailers' shops.
Wut?? The 9800X3D is closer to 60~70 W in gaming loads, max a hundred.
Also, why would you reserve 200 W for the rest of the system? That's a lot; in gaming scenarios it should be under 40~50 W for the whole rest of it.
700 W is a reasonable total system power estimate.

If you want maximum efficiency, the sweet spot is 40 to 60% load, and you can still go a bit over without much loss.

With this in mind, a 1200 W PSU is already more than enough, and even a barely decent 850 W unit would probably do the trick. The PSU's power rating doesn't need to be divided by 3; there already is a safety margin.
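To make that arithmetic concrete, a minimal sketch using the estimates from this thread (rough figures, not measurements):

```python
# Minimal PSU-sizing sketch using the corrected gaming-load estimates above.

components_w = {
    "RTX 5090 (leaked TGP)": 575,
    "9800X3D (gaming load)": 70,   # ~60-70 W in games, per the post above
    "rest of system": 50,
}

total = sum(components_w.values())      # ~695 W
for target in (0.5, 0.6, 0.8):          # desired PSU load fraction
    print(f"at {target:.0%} load: {total / target:.0f} W PSU")

# at 50% load: 1390 W PSU
# at 60% load: 1158 W PSU
# at 80% load: 869 W PSU
# i.e. a good 1200 W unit has ample headroom, and 850 W is workable.
```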
Posted on Reply
#146
AusWolf
razaronSomething about pots and kettles. You sort of spent the first page harrasing someone for being "a slave to buying things" followed by passive aggressive remarks about said slavery and America...
Taking a stab back when all parties have moved on ages ago is definitely worth it. You must feel very proud for calling out all that social injustice radiating from my horribly evil posts. :rolleyes:

Let's be clear once again: I do not have a problem with anyone who wants a 5090. Just don't shove it under my nose out of a misguided superiority/inferiority complex, will 'ya? :)

The GPU you buy does not make you better or worse. It's just an object. A piece of silicon, metals and plastic. Using it for bragging rights implies a very sad way of existence.
Posted on Reply
#147
QUANTUMPHYSICS
My next build will have no less than a 1200W PSU...just to be sure.
Posted on Reply
#148
mtosev
A 1200 watt PSU will be enough for the 5090, right?
Posted on Reply
#149
Wirko
AusWolfI'm sure that's the intended purpose. But you know people... some of them just can't live without the newest, bestest, most expensivest toy (and then come here only to brag about it).
I can't say I know what the intended purpose is... but if you can run DX by night and CUDA by day, be it for work, learning, a hobby, or searching for interplanetary communication, then the GPU sure seems less overpriced.
Posted on Reply
#150
kapone32
mtosevA 1200 watt Psu will be enough for 5090 right?
Yes
Posted on Reply