Friday, January 3rd 2025

NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

According to two of the most accurate leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Previous rumors pointed to 600 W and 400 W TGPs for these SKUs; TGP (total graphics power) covers the entire card, including its memory and everything else on the board, whereas TDP (thermal design power) is a more specific value attributed to the GPU die of the SKU in question. According to the latest leaks, 575 W is dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while 25 W goes to the GDDR7 memory and other components on the PCB.

For the RTX 5080, the GB203-400-A1 chip supposedly draws 360 W on its own, while 40 W is set aside for the GDDR7 memory and other components on the board. The lower-end RTX 5080 reserves more power for memory than the RTX 5090 (40 W versus 25 W) because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090 uses 28 Gbps modules. The RTX 5090 does carry more (and higher-capacity) modules, but first-generation GDDR7 may need disproportionately more power to reach the 30 Gbps mark, hence the larger allocation. Future GDDR7 iterations could reach higher speeds without drawing much more power.
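For illustration, here is a minimal sketch (Python, with the leaked and still unconfirmed wattages hard-coded as assumptions) of how the reported die TDPs and memory/board allocations add up to the previously rumored TGP figures:

```python
# Leaked, unconfirmed power figures, in watts.
cards = {
    "RTX 5090 (GB202-300-A1)": {"die_tdp": 575, "memory_and_board": 25},
    "RTX 5080 (GB203-400-A1)": {"die_tdp": 360, "memory_and_board": 40},
}

for name, power in cards.items():
    # TGP (total graphics power) = GPU die TDP + GDDR7 memory and other board components.
    tgp = power["die_tdp"] + power["memory_and_board"]
    print(f"{name}: {power['die_tdp']} W die + {power['memory_and_board']} W board = {tgp} W TGP")
```

Run as-is, this prints 600 W and 400 W, matching the earlier TGP rumors.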
Sources: hongxing2020 and kopite7kimi, via VideoCardz

207 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

#151
mechtech
If you can afford a 5090 you can afford a new psu ;)
Posted on Reply
#152
SOAREVERSOR
WirkoI can't say I know what the intended purpose is... but if you can run DX by night and CUDA by day, be it for work, learning, as a hobby, or searching for interplanetary communication, then the GPU sure seems less overpriced.
I work with these GPUs. The vast majority of them will never run a game. Ever. They are deployed for ML, DL, and AI, or just straight CUDA, in systems that will never see a game installed. Or they are deployed to creator systems, which will also never run a game. Even the freelancers I know who have these don't game on them in their off time.

I've had Titans and x090s at home. I don't game on them. A few odd rounds of Quake 1, maybe. I do have a Switch I game on, a PS5 I game on, a MacBook Pro I game on, and a gaming PC with only an x080-series card that I rarely game on. The consoles get the most time.

There's a narrow margin of people with the money to splurge on these things for gaming and e-peen. The Steam hardware survey puts the 4090 at 1.16%, which is actually pretty high compared to past prosumer cards. Which tells you that of the volume sold, most of them aren't touching a game.

Basically, PC gaming is a 1080p, 8 GB-or-less-VRAM affair, with lots of mid-range cards that get upgraded regularly and a bunch of x080 and x070 cards that get upgraded less often. The x090 cards are professional cards that people in extremely rare cases do buy and game on, but most never game at all. CUDA is where it's at for NVIDIA.
Posted on Reply
#153
TheoneandonlyMrK
SOAREVERSORI work with these GPUs. The vast majority of them will never run a game. Ever. They are deployed for ML, DL, and AI, or just straight CUDA, in systems that will never see a game installed. Or they are deployed to creator systems, which will also never run a game. Even the freelancers I know who have these don't game on them in their off time.

I've had Titans and x090s at home. I don't game on them. A few odd rounds of Quake 1, maybe. I do have a Switch I game on, a PS5 I game on, a MacBook Pro I game on, and a gaming PC with only an x080-series card that I rarely game on. The consoles get the most time.

There's a narrow margin of people with the money to splurge on these things for gaming and e-peen. The Steam hardware survey puts the 4090 at 1.16%, which is actually pretty high compared to past prosumer cards. Which tells you that of the volume sold, most of them aren't touching a game.

Basically, PC gaming is a 1080p, 8 GB-or-less-VRAM affair, with lots of mid-range cards that get upgraded regularly and a bunch of x080 and x070 cards that get upgraded less often. The x090 cards are professional cards that people in extremely rare cases do buy and game on, but most never game at all. CUDA is where it's at for NVIDIA.
You don't realise it, but you're undermining your own point.

1.16% of all surveyed GPUs proves people are indeed buying these to game on.

Most may be used for pro work, but clearly plenty were bought to game on, as you inadvertently pointed out.
Posted on Reply
#154
Bomby569
mechtechIf you can afford a 5090 you can afford a new psu ;)
shut up, putting all the money into the 5090 and cheaping out on the rest is a very common tactic.
Posted on Reply
#155
cerulliber
CyberPomPomWut?? The 9800X3D is closer to 60~70W in gaming loads, max a hundred.
Also why would you reserve 200W for the rest of the system? That's a lot, in gaming scenarios we should be under 40~50W for the whole rest of it.
700W total is a reasonable system power estimate.

If you want maximum efficiency the load range is 40 to 60% and you can still go a bit over without much loss.

With this in mind a 1200W PSU is already more than enough. And even a barely decent 850W would probably do the trick. The power rating of the PSU doesn't need to be divided by 3, there already is a safety margin.
A 9800X3D overclocked draws 176 W maximum, period.
You may want to reserve 100 W for the AIO, fans, HDDs, etc., plus an extra 100 W of headroom.
But of course you can optimise power consumption and use a smaller PSU.
Posted on Reply
#156
AusWolf
cerulliberA 9800X3D overclocked draws 176 W maximum, period.
How? Do you have any review data on this?

I know it's not 100% the same, but the 7800X3D in my system never consumes more than 100 W. It does 80 at full load, 90 if I enable EXPO (I don't, because I just don't give a sh** about memory performance). Even if I overclocked it, it wouldn't gain more than 20 extra watts, max.
cerulliberYou may want to reserve 100 W for the AIO, fans, HDDs, etc., plus an extra 100 W of headroom.
Fans, pumps, and HDDs consume maybe 5 W each. How many of them do you have?
Posted on Reply
#157
Ruru
S.T.A.R.S.
Funny how efficient Maxwell and Pascal were, now it's like they don't even care a bit anymore. Just MOAR power, no matter the power consumption and heat output. :facepalm:
Posted on Reply
#158
AusWolf
RuruFunny how efficient Maxwell and Pascal were, now it's like they don't even care a bit anymore. Just MOAR power, no matter the power consumption and heat output. :facepalm:
Funny that back then people went "Nvidia is so much better because it's more efficient". And now "Nvidia is so much better because... um... 4090!!!" :laugh:
Posted on Reply
#159
3valatzy
Bomby569shut up, putting all the money into the 5090 and cheaping out on the rest is a very common tactic.
It needs a CPU from the future in order to not be CPU bound.
It needs a really expensive Platinum-certified 2000W PSU in order to live safely with the power spikes.

But it will still fail because of the low-quality power connector.

Solution: hell, don't buy!
Posted on Reply
#160
CyberPomPom
cerulliberA 9800X3D overclocked draws 176 W maximum, period.
You may want to reserve 100 W for the AIO, fans, HDDs, etc., plus an extra 100 W of headroom.
But of course you can optimise power consumption and use a smaller PSU.
No, I don't know where you found this 176 W figure, but you're the first person I've seen cite it. Maybe it's for the whole system, but then you wouldn't have to add another 100 W a second and a third time.

Also, I don't understand what this 100 W reserve is for. What is its purpose? To account for what?
We already took the whole system into account; adding a further tolerance would only decrease PSU efficiency, since it would leave its optimal range.
Posted on Reply
#161
Wirko
TheoneandonlyMrK1.16% of all surveyed GPUs proves people are indeed buying these to game on.
Not directly related to this argument but: 1.16% by number of cards would be at least 5% by money spent.
Posted on Reply
#162
Visible Noise
robbYou are laughably full of crap claiming over 1200 watts would not impact the temperature of a room.
He just conveniently left out that’s with the windows open.
RuruFunny how efficient Maxwell and Pascal were, now it's like they don't even care a bit anymore. Just MOAR power, no matter the power consumption and heat output. :facepalm:
Ummm, you know efficiency contains a minimum of two factors, right? You appear to have left one out.

Include the performance part of the efficiency equation and it turns out Ada is the most efficient GPU architecture ever.

Citation:
www.techpowerup.com/review/msi-geforce-rtx-4090-gaming-x-trio/38.html
AusWolfFunny that back then people went "Nvidia is so much better because it's more efficient". And now "Nvidia is so much better because... um... 4090!!!" :laugh:
See the chart linked above.
Posted on Reply
#163
JosefHrib
cerulliberAssuming you want 50% usage of the PSU:
9800X3D: 176 W
RTX 5090: 575 W
everything else in the case: 100 W
100 W reserve
951 W total power

2x 951 W = 1,902 W

Where's that 2 kW PSU? And we're not even talking about an SLI/NVLink setup.

Optimised setup: RTX 5090 at ~75% power limit ≈ 430 W, + 176 W CPU + 200 W others = 806 W; x2 ≈ 1,600 W PSU, and those are in retailers' shops.
I own a Seasonic Prime TX ATX 3.0 1600 W, and my CPU alone draws over 350 W (peaks are much higher): an Intel Xeon Emerald Rapids, 64 cores / 128 threads, 320 MB L3. So an RTX 5090 with another 600 W is not a problem. I will be buying an RTX 5090. :)

In my previous build I had a 56-core / 112-thread CPU with an ASUS ROG RTX 4090 OC.

With this 64-core build I'm waiting for the new NVIDIA cards. Back to a 512-bit bus after many, many years. :)
Posted on Reply
#164
TheinsanegamerN
AusWolfFunny that back then people went "Nvidia is so much better because it's more efficient". And now "Nvidia is so much better because... um... 4090!!!" :laugh:
Maxwell left performance on the table since it was originally a 20 nm design that had to be backported to 28 nm when TSMC cancelled the node at the last minute. Pascal was limited by clocks and die size. The 1080 Ti was pretty sizeable for 16 nm, and the promised OC speeds never materialized.

And yet, go back to the 1080 Ti reviews and there are people whining about Pascal's power use, especially at higher core speeds, and how Nvidia is throwing power out the window.

When it comes to watts per frame, yeah, Nvidia is far and away in the lead. And for the record, it wasn't just efficiency with Maxwell and Pascal; it was efficiency, way better drivers, and actual performance, while AMD was busy rehashing GCN 1 for the third time or not bothering to go past the 1060's performance. AMD genuinely sucked mid-decade.
3valatzyIt needs a CPU from the future in order to not be CPU bound.
We're already GPU-bound at 4K, and even at 1440p, with the 4090 today. Especially with RT.
3valatzyIt needs a really expensive Platinum-certified 2000W PSU in order to live safely with the power spikes.
No, you don't. This is FUD. The vast majority of older quality supplies can handle spikes without issue, and Ada's spikes are nowhere near as bad as Ampere's.

Platinum PSUs are not even that expensive anymore. The Titanium units are the wallet breakers.
3valatzyBut it will still fail because of the low-quality power connector.

Solution: hell, don't buy!
Something tells me you were never in the market for a 5090 in the first place.
Posted on Reply
#165
cerulliber
A 7,200 rpm HDD peaks at 20-25 W at spin-up. Some people have none, some have three, etc.
A D5 Next pump is about 25 W.
Fans vary; some people have 3, some have 12, etc.
Also, average power consumption will be much lower; IMHO, what matters when choosing a PSU is peak power consumption.
Is targeting 50% PSU usage a myth, or an unnecessary bias from a bygone era?
BTW, it seems you object to a 176 W CPU but have nothing to say about a 575 W GPU :):p
Posted on Reply
#166
AusWolf
Visible NoiseHe just conveniently left out that’s with the windows open.



Ummm, you know efficiency contains a minimum of two factors, right? You appear to have left one out.

Include the performance part of the efficiency equation and it turns out Ada is the most efficient GPU architecture ever.

Citation:
www.techpowerup.com/review/msi-geforce-rtx-4090-gaming-x-trio/38.html



See the chart linked above.
I don't see a single Pascal card on that chart, but fair enough.
cerulliberA 7,200 rpm HDD peaks at 20-25 W at spin-up. Some people have none, some have three, etc.
A D5 Next pump is about 25 W.
Fans vary; some people have 3, some have 12, etc.
Also, average power consumption will be much lower; IMHO, what matters when choosing a PSU is peak power consumption.
Is targeting 50% PSU usage a myth, or an unnecessary bias from a bygone era?
BTW, it seems you object to a 176 W CPU but have nothing to say about a 575 W GPU :):p
I stand corrected on the 9800X3D. It seems it's really not as efficient as the 7800X3D made me think.

But 25 W on a HDD at startup? Nah, I call bullshit on that. Do you have a source?
Posted on Reply
#167
BSim500
cerulliberA 7,200 rpm HDD peaks at 20-25 W at spin-up. Some people have none, some have three, etc.
A D5 Next pump is about 25 W.
Fans vary; some people have 3, some have 12, etc.
Also, average power consumption will be much lower; IMHO, what matters when choosing a PSU is peak power consumption.
Is targeting 50% PSU usage a myth, or an unnecessary bias from a bygone era?
50% PSU load is a half-truth. It's typically the peak of the bell-shaped efficiency curve, e.g., 30% load = 89%, 40% load = 90%, 50% load = 91%, 60% load = 90%, 70% load = 89%, etc. It's a good average target to aim for to allow headroom for transient spikes. With a decent branded A-tier PSU with decent 105 °C Japanese caps, though, you are certainly "allowed" to load a PSU to 60% or 70%, and you definitely don't need to buy a 2,000 W PSU to aim for a 25% average load just to force transient spikes to fit under a PSU's 50% "limit", i.e., you don't need to "double count" headroom for spikes.

176 W for a 9800X3D seems very high compared to most other owners, who aren't going to shove 1.5 V through it before putting it under liquid nitrogen. Likewise, "100 W of other stuff in the case" seems way too much, with 10-25 W being more normal for a system with one NVMe SSD plus fans. Yes, 7,200 rpm 3.5" HDDs can surge 15-20 W or so each whilst spinning up from a standstill before dropping back to a typical 5-10 W, but that happens only for the first 2-3 seconds immediately after power-on, whilst the CPU and GPU are (mostly) idle. So even if you have ten of those (most people don't even have one anymore), they can just "borrow" from the (idling) GPU budget during boot-up without needing to be over-counted as if their spin-up surge were constant on top of a maxed-out GPU.
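A minimal sketch of that sizing logic (Python; the efficiency curve below is an assumed, illustrative one, not measured data for any particular unit):

```python
# Assumed, illustrative efficiency curve for a decent A-tier PSU; real curves vary by model.
EFFICIENCY_BY_LOAD = {
    0.2: 0.87, 0.3: 0.89, 0.4: 0.90, 0.5: 0.91,
    0.6: 0.90, 0.7: 0.89, 0.8: 0.88, 0.9: 0.87,
}

def estimate(dc_load_w: float, psu_rating_w: float) -> tuple[float, float]:
    """Return (load fraction, estimated wall draw in watts) for a steady DC load."""
    load = dc_load_w / psu_rating_w
    # Nearest tabulated point is good enough for a rough estimate.
    nearest = min(EFFICIENCY_BY_LOAD, key=lambda point: abs(point - load))
    return load, dc_load_w / EFFICIENCY_BY_LOAD[nearest]

# Hypothetical ~800 W steady DC load (575 W GPU + CPU + the rest of the system).
for rating in (1000, 1200, 1600, 2000):
    load, wall = estimate(800, rating)
    print(f"{rating:>4} W PSU: {load:.0%} load, ~{wall:.0f} W at the wall")
```

The wall-draw spread across those ratings works out to only a few tens of watts, which is the point: extra PSU capacity buys headroom for transients, not meaningful efficiency.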
Posted on Reply
#168
AusWolf
BSim50050% PSU load is a half-truth. It's typically the peak of the bell-shaped efficiency curve, e.g., 30% load = 89%, 40% load = 90%, 50% load = 91%, 60% load = 90%, 70% load = 89%, etc. It's a good average target to aim for to allow headroom for transient spikes. With a decent branded A-tier PSU with decent 105 °C Japanese caps, though, you are certainly "allowed" to load a PSU to 60% or 70%, and you definitely don't need to buy a 2,000 W PSU to aim for a 25% average load just to force transient spikes to fit under a PSU's 50% "limit", i.e., you don't need to "double count" headroom for spikes.
A theoretical question (to everyone): what is the difference between 91% and 89% efficiency in real power values? Peanuts, I'd say. So planning for 50% PSU load just for efficiency's sake is pointless. You're spending way more money on an oversized unit than you're saving on the extra 1-2% efficiency. Penny wise, pound stupid, as they say.
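For a rough sense of scale, a quick back-of-the-envelope calculation (the 700 W load is my own illustrative assumption):

```python
# Wall-power difference between 89% and 91% efficiency at an assumed 700 W DC load.
dc_load = 700               # watts actually delivered to the components
draw_89 = dc_load / 0.89    # ~787 W from the wall
draw_91 = dc_load / 0.91    # ~769 W from the wall
print(f"Difference: ~{draw_89 - draw_91:.0f} W")  # roughly 17 W at this load, i.e. peanuts
```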
Posted on Reply
#169
cerulliber
BSim500176 W for a 9800X3D seems very high compared to most other owners, who aren't going to shove 1.5 V through it before putting it under liquid nitrogen.
It was tested with a 360 mm AIO; the overclock was done at 1.24 V.
No offence, but some of us seem very biased. Even if you don't accept 176 W for a 9800X3D, you'll get 176 W from a 9950X3D anyway.
So what PSU does one really need for a 575 W 5090 at 75%?
Posted on Reply
#170
BSim500
AusWolfA theoretical question (to everyone): what is the difference between 91% and 89% efficiency in real power values? Peanuts, I'd say. So planning for 50% PSU load just for efficiency's sake is pointless. You're spending way more money on an oversized unit than you're saving on the extra 1-2% efficiency. Penny wise, pound stupid, as they say.
As I said, people don't aim for 50% just for efficiency, but also to allow for transient spikes as well as the option to upgrade to the next GPU tier without needing to change PSUs later on.
cerulliberSo what PSU does one really need for a 575 W 5090 at 75%? No offence, but some of us seem very biased.
I'll tell you when it gets released and reviewed. As with previous flagship GPUs, I expect existing 1,000-1,200 W PSU owners to have no problems at all without needing a 2,000 W upgrade. Also, I'm not sure where you got the "bias" from, as this is my first post in the thread. I'm not in the market for a flagship GPU myself, but I also see little value in arguing in pre-release threads versus just waiting and seeing, then basing your build on actual measurements...
Posted on Reply
#171
Duskfall
JustBenchingI'm not asking them to test at my individually set power level, but if the tiers remained at the same power I could make an educated guess. E.g., the 4090 FE was pulling 360 W in TPU's review; if the 5090 FE also pulls 360 W, I can make an educated guess about my use case. If, on the other hand, the 5090 pulls 550 W, I'll have absolutely no clue how it would compare at iso-power to the 4090.
Buy it, test it, and if you don't like it, you have 14 days to return it anyway in most countries.
Posted on Reply
#172
AusWolf
BSim500As I said, people don't aim for 50% just for efficiency, but also to allow for transient spikes as well as the option to upgrade to the next GPU tier without needing to change PSUs later on.
Transient spikes are fine... but I don't think the next generation GPU should consume so considerably more that you'd need a stronger PSU for it.

Again, people voting with their asses instead of brains/wallets. "It's the fastest crap in the world, so I must have it" - bah. (I'm not talking about people doing professional work on their GPU, of course)
Posted on Reply
#173
BSim500
AusWolfTransient spikes are fine... but I don't think the next generation GPU should consume so considerably more that you'd need a stronger PSU for it.

Again, people voting with their asses instead of brains/wallets. "It's the fastest crap in the world, so I must have it" - bah. (I'm not talking about people doing professional work on their GPU, of course)
Oh, I agree. I'm not sure if I was wording things correctly, but that was my point in reply to cerulliber: people with 1,000-1,200 W PSUs realistically won't need to upgrade to 2,000 W on the basis of padding the estimate with "what if you had 100 watts' worth of case fans and 7,200 rpm HDDs". I wouldn't even want to sit in the same room as that rig, for acoustic sanity reasons alone... :D
Posted on Reply
#174
Dawora
robbYou are laughably full of crap claiming over 1200 watts would not impact the temperature of a room.
A good AC keeps the room temp exactly the same, so the impact on room temperature is 0 °C.

You don't know what AC is, right? It's not a heavy metal band.
RuruFunny how efficient Maxwell and Pascal were, now it's like they don't even care a bit anymore. Just MOAR power, no matter the power consumption and heat output. :facepalm:
You should know that energy efficiency and power usage are two different things.

Maxwell and Pascal are just bad.
The 4000 series is the most energy-efficient GPU lineup there is at the moment.

Power usage and performance together = energy efficiency.

A GPU using 300 W or more doesn't mean it can't be energy efficient.

The RTX 4090, a 350 W+ GPU, is more efficient than the RTX 3050.


Posted on Reply
#175
3valatzy
AusWolfAgain, people voting with their asses instead of brains/wallets. "It's the fastest crap in the world, so I must have it" - bah. (I'm not talking about people doing professional work on their GPU, of course)
Good luck paying £4000 for this "fastest".
DaworaYou should know that energy efficiency and power usage are two different things.

Maxwell and Pascal are just bad.
The 4000 series is the most energy-efficient GPU lineup there is at the moment.

Power usage and performance together = energy efficiency.

A GPU using 300 W or more doesn't mean it can't be energy efficient.

The RTX 4090, a 350 W+ GPU, is more efficient than the RTX 3050.

Does this graph include undervolted cards or simply stock? If stock, it is not representative.
Posted on Reply