Friday, January 3rd 2025

NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

According to two of the most reliable leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Earlier rumors pointed to 600 W and 400 W TGPs (total graphics power) for these SKUs, a figure that covers the entire card, including memory and everything else on the board. TDP (thermal design power), by contrast, is a narrower value attributed to the GPU die itself. According to the latest leaks, 575 Watts are dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while 25 Watts are budgeted for GDDR7 memory and other components on the PCB.

For the RTX 5080, the GB203-400-A1 chip alone supposedly draws 360 Watts, while 40 Watts are set aside for GDDR7 memory and other components on the PCB. The lower-end RTX 5080 reserves more power for its memory subsystem than the RTX 5090 because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090 uses 28 Gbps modules. The RTX 5090 admittedly carries more (or higher-capacity) modules, but first-generation GDDR7 could require a disproportionate amount of power to reach the 30 Gbps mark, hence the larger allocation. Future GDDR7 iterations could reach higher speeds without needing much more power.
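
Put as simple arithmetic, the leaked figures imply the following split (a minimal sketch in Python using only the numbers above; the TGP-minus-TDP framing is the rumor's, not an official NVIDIA breakdown):

leaked = {
    # card: (rumored total graphics power in W, leaked GPU-die TDP in W)
    "RTX 5090": (600, 575),
    "RTX 5080": (400, 360),
}

for card, (tgp, tdp) in leaked.items():
    memory_and_board = tgp - tdp
    print(f"{card}: {tgp} W TGP = {tdp} W GPU die + {memory_and_board} W for GDDR7 and board")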
Sources: hongxing2020 and kopite7kimi, via VideoCardz

207 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

#26
AusWolf
JustBenchingI'm not asking them to test at my individually set power level, but if the tiers remained at the same power, I could make an educated guess. E.g. the 4090 FE was pulling 360 W in TPU's review; if the 5090 FE also pulls 360 W, I can make an educated guess about my use case. If, on the other hand, the 5090 pulls 550 W, I'll have absolutely no clue how it would compare at iso power to the 4090.
Ah, I get 'ya. To me, that's just Nvidia not having a clue how to extract more performance out of their GPU architecture other than cramming more parts in it and raising power. I don't know how long this can continue.
Bomby569Doom is mostly a shooter on rails, or by sections; they couldn't pull that off in an open world, for example. Not to say they didn't do a great job, but not all games are equal.
Why not?
Posted on Reply
#27
TheDeeGee
nguyen4090 also had a max TGP of 600W, so 5090 will probably draw the same amount of power or slightly higher. Maybe AIB 5090 will use around 520W at stock.
Can no doubt power limit to 400W with minimal FPS loss.
Posted on Reply
#29
JustBenching
cerulliberdo you remember xoc bios 1kw ?
Only dared pushing to 600w for some timespy runs. :D
Posted on Reply
#30
Random_User
JustBenchingI'm not asking them to test at my individually set power level, but if the tiers remained at the same power, I could make an educated guess. E.g. the 4090 FE was pulling 360 W in TPU's review; if the 5090 FE also pulls 360 W, I can make an educated guess about my use case. If, on the other hand, the 5090 pulls 550 W, I'll have absolutely no clue how it would compare at iso power to the 4090.
Or set both the 4090 and 5090 at 360 W and extrapolate the performance delta.
Also set both at the 4090's maximum power/performance limit and see how much further the advancement has gone, and whether there are any efficiency gains at the same power usage.
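
Iso-power testing like this doesn't need vendor overclocking tools; NVIDIA exposes board power limits through NVML. A rough sketch using the pynvml bindings (the 360 W target is just the example value from this thread, device index 0 is an assumption, and setting limits requires admin/root privileges):

# Cap a GeForce card's power limit for an iso-power comparison.
# Requires the nvidia-ml-py package (pynvml) and elevated privileges to set limits.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# NVML reports limits in milliwatts.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = 360_000  # example: 360 W, as discussed above

if min_mw <= target_mw <= max_mw:
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs root/admin
    print(f"Power limit set to {target_mw / 1000:.0f} W")
else:
    print(f"360 W is outside this card's allowed range "
          f"({min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

pynvml.nvmlShutdown()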
Posted on Reply
#31
Why_Me
HankierosemanThat saying is older than me. Maybe to a Brit but not an American. Capitalism rules, illegally if I can get away with it. Just ask the orangeman.
You sound like you are from Austin.
Posted on Reply
#33
Wirko
cerulliberHe says over 575 W. Those 12VHPWR have a 600 W limit, plus 75 W from the PCIe slot? Perhaps they need two or three 12VHPWR connectors.
Can multiple 12VHPWR connectors be used in parallel? I haven't seen this discussed yet but it's becoming a necessity. Given that they're "smart" in a sense, with signaling to communicate available power etc., it might not be trivial.
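
As a back-of-the-envelope check (assuming the widely quoted 600 W ceiling per 12VHPWR/12V-2x6 connector and 75 W from the PCIe slot; actual board designs may budget differently):

import math

def connectors_needed(board_power_w, per_connector_w=600, slot_power_w=75):
    # How many 16-pin connectors a card would need at face value,
    # after subtracting what the PCIe slot can deliver.
    from_connectors = max(board_power_w - slot_power_w, 0)
    return math.ceil(from_connectors / per_connector_w)

print(connectors_needed(575))  # 1 -- a single connector still covers a 575 W card
print(connectors_needed(700))  # 2 -- anything past 675 W would need a second one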
Posted on Reply
#34
Macro Device
AusWolfOn AMD, TDP is total card power.
Not on RDNA2, I'm afraid. I measured my PC one day, and whilst the OSD showed me 185-ish watts on my 6700 XT, the wall sensor clearly showed that consumption went up by about 220 watts. Can't say it's a big deal though;
the main problem of this GPU is how weak it is. There's an ongoing "we need more VRAM" fiesta all around, and I'm clearly 3+ GB away from maxing it out, but framerates already cease to be. Wouldn't have minded having about double that at 12 GB, if I'm being honest with you.

Anyhow, these 360 and 575 feel predictable. What remains unclear is if there are some IPC improvements. NV got no reason to improve it but they also had no rush to discount Ada by making better price/performance Super SKUs. Ada sold fine regardless.
Posted on Reply
#35
nguyen
TheDeeGeeCan no doubt power limit to 400W with minimal FPS loss.
My 4090 runs at stock clocks with a 110 mV undervolt, resulting in stock performance at only 300-330 W. If the sweet spot for the 5090 is at 400 W, it would not be a problem for my 850 W PSU.
Posted on Reply
#36
Hankieroseman
Legacy-ZAI miss the days when we were a small handful of "lepers" where nerd/geek was a curse word, I don't want to share our GPUs with people that treated us badly at one point or another. :p
An accidental geek at that, but with a mechanical background and an electrical education, the hardest part was learning software. I was almost 50 before I got my first computer. When I was growing up, electronics was a portable AM radio. Now the curse word is liberal.
Posted on Reply
#37
Daven
HankierosemanAn accidental geek at that, but with a mechanical background and an electrical education, the hardest part was learning software. I was almost 50 before I got my first computer. When I was growing up, electronics was a portable AM radio. Now the curse word is liberal.
600W, $2000+, 20-30% ….hard, hard, hard pass
Posted on Reply
#38
AusWolf
Macro DeviceNot on RDNA2, I'm afraid. I measured my PC one day, and whilst the OSD showed me 185-ish watts on my 6700 XT, the wall sensor clearly showed that consumption went up by about 220 watts. Can't say it's a big deal though;
the main problem of this GPU is how weak it is. There's an ongoing "we need more VRAM" fiesta all around, and I'm clearly 3+ GB away from maxing it out, but framerates already cease to be. Wouldn't have minded having about double that at 12 GB, if I'm being honest with you.
RDNA 2 doesn't report total board power through software, only GPU chip power. Even though AMD advertises total board power as TDP, you won't see it anywhere on RDNA 2. RDNA 3 does report it, though.

So the numbers you're seeing are correct.

I agree with you on the VRAM argument. I feel like 12 GB is decent for my 6750 XT, at least I don't need to be afraid of running out of it before I run out of GPU grunt.
Posted on Reply
#39
cerulliber
WirkoCan multiple 12VHPWR connectors be used in parallel? I haven't seen this discussed yet but it's becoming a necessity. Given that they're "smart" in a sense, with signaling to communicate available power etc., it might not be trivial.
There are two versions of the Galax 4090 HOF using two PCIe Gen 5 16-pin connectors.
We don't know what PNY is cooking.
Posted on Reply
#40
jabbadap
Is that maximum TGP or default? The RTX 4090 has 600 W as the maximum and 450 W as the default. Well, it will be revealed soon enough.

And that PCB shot looks like it's been taken from a PC Partner card (Zotac et al., see the Zotac RTX 4090 AMP review), so it's probably not from NVIDIA's reference design.
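
Once the cards are out, the default-versus-maximum question is easy to check in software, since NVML reports both values. A small sketch with the pynvml bindings (device index 0 and the 4090 figures mentioned above are assumptions, not measurements):

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust index as needed

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
_, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

# A stock RTX 4090 typically reports a 450 W default with headroom up to the
# board's maximum (600 W on some models, as noted above).
print(f"default {default_mw / 1000:.0f} W, maximum {max_mw / 1000:.0f} W")
pynvml.nvmlShutdown()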
Posted on Reply
#41
hsew
AusWolfSo the 5090 has 25 W for the VRAM and other components, but the 5080 has 40 W? That doesn't compute, even with the 28/30 Gbps speed difference. The 5090 has double the VRAM chips (or density?), a much more complex PCB and a much beefier power delivery.
Power use scales far faster than linearly with clock speed increases. Plus, it's likely that NVIDIA is going to be binning for the absolute best of the best in terms of efficiency for the 5090 (given the fact that it is literally 2x a 5080, this is kind of mandatory) while leaving the "lesser" GPU/VRAM chips for the 5080.
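
For rough intuition: dynamic power scales approximately with voltage squared times frequency, and higher clocks usually demand extra voltage, so power climbs much faster than the clock itself. A simplified model (illustrative assumptions, not measured GDDR7 data):

def relative_power(freq_scale, volt_scale):
    # Simplified CMOS dynamic-power model: P ~ C * V^2 * f
    return freq_scale * volt_scale ** 2

# Example: a 28 -> 30 Gbps bump (~1.07x) that hypothetically needs ~5% more voltage
print(f"{relative_power(30 / 28, 1.05):.2f}x power")  # ~1.18x for a ~1.07x speed bump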
Posted on Reply
#42
eldon_magi
AusWolfCapitalism might rule the business world and/or politics, but it doesn't rule me. I buy what I want/need
Loool c'mon now. It's because of capitalism that you are able to buy what you want/need.
Posted on Reply
#43
AusWolf
hsewPower use scales exponentially with clockspeed increases, not linearly. Plus it’s likely that nVidia is going to be binning for the absolute best of the best in terms of efficiency in for the 5090 (given the fact that it is literally 2x a 5080 this is kind of mandatory) while leaving the “lesser” GPU/VRAM chips for the 5080.
The 5080 is an entirely different chip. It's based on the GB203, while the 5090 is on the GB202.
eldon_magiLoool c'mon now. It's because of capitalism that you are able to buy what you want/need.
Because you couldn't buy what you wanted before capitalism was invented? C'mon now... ;)
Posted on Reply
#44
hsew
AusWolfThe 5080 is an entirely different chip. It's based on the GB203, while the 5090 is on the GB202.
Whoops.
Posted on Reply
#45
AusWolf
hsewWhoops.
More so, the 5090 won't use the fully enabled version, either. 5090 Ti coming later, perhaps? Or will it be reserved for industrial cards?
Posted on Reply
#46
RedelZaVedno
Who's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43°C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 TiS atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
Posted on Reply
#47
JustBenching
RedelZaVednoWho's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43°C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 TiS atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
I'd buy a 2000 watt GPU, don't really care. To me it's like asking "who's gonna buy a TV that comes with 100% brightness out of the box". I don't care that much, I'll just lower the brightness the same way I'll lower the power draw on the GPU.

Your AC is a good example. Didn't you change its settings? Did you run your AC out of the box?
Posted on Reply
#48
AusWolf
RedelZaVednoWho's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43°C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 TiS atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
It's totally excessive and unnecessary, just like a 4090 is for most people.
Posted on Reply
#49
Daven
RedelZaVednoWho's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43°C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 TiS atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
The 5090 is almost double the power usage of my 7900 XT, which already heats up my tiny gaming room. The room would be hotter than a sauna with the 5090!
Posted on Reply
#50
Bomby569
AusWolfWhy not?
Just pick any game and see the frame rate drop when you exit a building or enter a large open-world area.
Posted on Reply