Friday, January 3rd 2025

NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

According to two of the most accurate leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Previous rumors pointed to 600 W and 400 W TGPs for these SKUs. TGP (total graphics power) covers the entire card, meaning the GPU plus its memory and everything else on the board, whereas TDP (thermal design power) is a more specific value attributed to the GPU die alone. According to the latest leaks, 575 Watts are dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while the remaining 25 Watts go to the GDDR7 memory and other components on the PCB.

For the RTX 5080, the GB203-400-A1 chip supposedly draws 360 Watts alone, while 40 Watts are set aside for the GDDR7 memory and other components on the PCB. The lower-end RTX 5080 reserves more power for its memory than the RTX 5090 because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090's modules run at 28 Gbps. The RTX 5090 does carry more (or higher-capacity) modules, but first-generation GDDR7 could require disproportionately more power to reach the 30 Gbps mark, hence the larger budget. Future GDDR7 iterations could reach higher speeds without much additional power.
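
The arithmetic behind the leak is easy to sanity-check. The Python sketch below simply restates the rumored die/memory split; the wattage figures come from the leak, while the labels are illustrative, not official.

```python
# Rumored Blackwell power budgets: TGP = GPU die TDP + memory/board power.
# Figures are from the leak; the breakdown itself is unconfirmed.
leaked_budgets = {
    # SKU: (GPU die TDP in watts, memory + board power in watts)
    "RTX 5090 (GB202-300-A1)": (575, 25),
    "RTX 5080 (GB203-400-A1)": (360, 40),
}

for sku, (die_tdp, mem_board) in leaked_budgets.items():
    tgp = die_tdp + mem_board  # everything on the PCB combined
    print(f"{sku}: {die_tdp} W die + {mem_board} W memory/board = {tgp} W TGP")
```

This reproduces the earlier 600 W and 400 W TGP rumors exactly, which is presumably where those figures came from.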
Sources: hongxing2020 and kopite7kimi, via VideoCardz

173 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

#1
AusWolf
So the 5090 has 25 W for the VRAM and other components, but the 5080 has 40 W? That doesn't compute, even with the 28/30 Gbps speed difference. The 5090 has double the VRAM chips (or density?), a much more complex PCB and a much beefier power delivery.
Posted on Reply
#2
londiste
Haven't the official/public TDP numbers technically been TGPs - as in whole-card consumption - for a while now? For both AMD and Nvidia, the power consumption numbers measured in reviews are within measuring error of the power limit, which is set to the TDP. There was a point where GPU manufacturers tried to make things complicated, but that did not last long.
Posted on Reply
#3
Icon Charlie
You know...

My Sharp II Carousel microwave runs on 400 watts...

Just saying...
Posted on Reply
#4
Hankieroseman
My Maytag Microwave is good for 1000 watts. My Lian-Li Edge PSU is good for 1300 watts. We will fear no GPU.
The difference between the men and the boys is the price of their toys. Saddle up kids. It's gonna be a rough ride.
Posted on Reply
#5
AusWolf
Icon Charlie: You know...

My Sharp II Carousel microwave runs on 400 watts...

Just saying...
But you don't game on your Sharp II Carousel microwave for several hours at a time. It also doesn't dump any extra heat into your PC case.

Just saying... ;)
londiste: Haven't the official/public TDP numbers technically been TGPs - as in whole-card consumption - for a while now? For both AMD and Nvidia, the power consumption numbers measured in reviews are within measuring error of the power limit, which is set to the TDP. There was a point where GPU manufacturers tried to make things complicated, but that did not last long.
I don't know how it is on Nvidia now. The last card I had from them was a 2070. On that, TDP was GPU-chip-only power.

On AMD, TDP is total card power.
Hankieroseman: The difference between the men and the boys is the price of their toys.
Oh, what a bleak, materialistic outlook on the world! I'm astonished.
Posted on Reply
#6
Legacy-ZA
Hankieroseman: My Maytag Microwave is good for 1000 watts. My Lian-Li Edge PSU is good for 1300 watts. We will fear no GPU.
The difference between the men and the boys is the price of their toys. Saddle up kids. It's gonna be a rough ride.
I miss the days when we were a small handful of "lepers" and nerd/geek was a curse word. I don't want to share our GPUs with people who treated us badly at one point or another. :P
Posted on Reply
#7
AusWolf
Legacy-ZA: I miss the days when we were a small handful of "lepers" and nerd/geek was a curse word. I don't want to share our GPUs with people who treated us badly at one point or another. :p
I miss the days when buying PC parts was cool among a few, and not just part of some sick (penis) wallet-measuring contest for the masses. :(
Posted on Reply
#8
Hankieroseman
AusWolf: But you don't game on your Sharp II Carousel microwave for several hours at a time. It also doesn't dump any extra heat into your PC case.

Just saying... ;)

I don't know how it is on Nvidia now. The last card I had from them was a 2070. On that, TDP was GPU-chip-only power.

On AMD, TDP is total card power.

Oh, what a bleak, materialistic outlook on the world! I'm astonished.
That saying is older than me. Maybe to a Brit but not an American. Capitalism rules, illegally if I can get away with it. Just ask the orangeman.
Posted on Reply
#9
nguyen
The 4090 also had a max TGP of 600 W, so the 5090 will probably draw the same amount of power or slightly more. Maybe an AIB 5090 will use around 520 W at stock.
Posted on Reply
#10
AusWolf
HankierosemanThat saying is older than me. Maybe to a Brit but not an American. Capitalism rules, illegally if I can get away with it. Just ask the orangeman.
Just because it's old doesn't make it less stupid. Capitalism might rule the business world and/or politics, but it doesn't rule me. I buy what I want/need, not what I'm told.

What you own ends up owning you. Think about it.
Posted on Reply
#11
Hankieroseman
Now you're getting weird, dude. This electric junk doesn't rule a GD thing.
Posted on Reply
#12
JustBenching
Don't care about the power draw since I'll limit it to whatever power I want it to run at (takes 10 seconds), but I firmly believe that power draw should be kept steady between tiers (meaning the 5080 should be at a similar power draw to the 4080, etc.). First of all, because it makes it easier to compare gen on gen, and second of all, it creates a "rule" about what kind of equipment (PSU, case) is required to run a specific tier. Flip-flopping around power targets and naming is just a way to confuse the consumer about what they are actually buying. Both AMD and Nvidia are guilty of this.
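
For reference, the ten-second power limiting described above is typically done through nvidia-smi, which does expose a --power-limit (-pl) switch. A minimal sketch, assuming an NVIDIA card, admin/root privileges, and a value within the limits the driver reports; 320 W is just an example figure:

```python
# Sketch: cap the board power limit via nvidia-smi (requires admin rights).
# The accepted range can be checked first with "nvidia-smi -q -d POWER".
import subprocess

def set_power_limit(watts: int) -> None:
    # Show the current power constraints, then apply the new board limit.
    subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

set_power_limit(320)  # example value, not a recommendation
```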
Posted on Reply
#13
AusWolf
Hankieroseman: Now you're getting weird, dude. This electric junk doesn't rule a GD thing.
It does if you feel obliged to buy the latest one every single time whether you need it or not. You're a slave to buying things. Sorry, but it's true.
Posted on Reply
#14
JustBenching
AusWolf: It does if you feel obliged to buy the latest one every single time whether you need it or not. You're a slave to buying things. Sorry, but it's true.
Or it's just curiosity. Like, I bought a 13900K, a 14900K and the 9800X3D just because I wanted to test them / see performance improvements compared to 12th gen. It's a tech forum, people do that kind of thing around here.
Posted on Reply
#15
Legacy-ZA
AusWolf: Just because it's old doesn't make it less stupid. Capitalism might rule the business world and/or politics, but it doesn't rule me. I buy what I want/need, not what I'm told.

What you own ends up owning you. Think about it.
Correct: I still classify myself as a customer, not a consumer. I have nothing to prove to anyone; I buy what I "need" and live my life simply. I don't need new clothes every week, or the latest car, or w/e.

I just hate being taken advantage of when I do need to make these purchases. That is what annoys me: the realization that I have to fork out money at idiotic price points because others have no self-control.
Posted on Reply
#16
AusWolf
JustBenching: Don't care about the power draw since I'll limit it to whatever power I want it to run at (takes 10 seconds), but I firmly believe that power draw should be kept steady between tiers (meaning the 5080 should be at a similar power draw to the 4080, etc.). First of all, because it makes it easier to compare gen on gen, and second of all, it creates a "rule" about what kind of equipment (PSU, case) is required to run a specific tier. Flip-flopping around power targets and naming is just a way to confuse the consumer about what they are actually buying. Both AMD and Nvidia are guilty of this.
I agree. I want to see speed increases due to advancements in GPU architecture, like I did in the Pascal years, and not due to cramming more parts into a chip and increasing power (what I call brute forcing).
Posted on Reply
#17
JustBenching
AusWolf: I agree. I want to see speed increases due to advancements in GPU architecture, like I did in the Pascal years, and not due to cramming more parts into a chip and increasing power (what I call brute forcing).
It's also practical; I'm running my 4090 locked to 320 W. If the review shows the 5090 being 50% faster than the 4090 while pulling 30% more power, I really have no idea how that translates to my use case. Is it going to be 10% faster at the same 320 W? Is it going to be 40%? How the hell am I supposed to wisely spend money if I don't know where the performance is coming from (brute-forcing power, etc.)?
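
As a rough illustration of why this is hard to guess from stock numbers alone, here is a minimal sketch. It assumes performance scales with power raised to some exponent alpha, a loose stand-in for a GPU's voltage/frequency curve, and the performance and power figures are hypothetical, not measurements:

```python
# Toy model: perf ~ power**alpha near the top of the efficiency curve.
# alpha is an assumed constant, not a measured property of any GPU.
def perf_at_power(stock_perf: float, stock_power: float,
                  target_power: float, alpha: float = 0.4) -> float:
    return stock_perf * (target_power / stock_power) ** alpha

# Hypothetical stock points: 4090 = 100 perf at 450 W, 5090 = 150 at 575 W.
p4090 = perf_at_power(100, 450, 320)
p5090 = perf_at_power(150, 575, 320)
print(f"4090 @ 320 W ~ {p4090:.0f}, 5090 @ 320 W ~ {p5090:.0f}")
```

The point is that the answer swings wildly with alpha, which is exactly the information a stock-power review doesn't give you.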
Posted on Reply
#18
AusWolf
JustBenching: Or it's just curiosity. Like, I bought a 13900K, a 14900K and the 9800X3D just because I wanted to test them / see performance improvements compared to 12th gen. It's a tech forum, people do that kind of thing around here.
If you're genuinely interested in how things work, that's cool. I do that myself. :)

But I don't need the latest and greatest to feel good about myself. With my gaming habits, I'm fine on mid-range. I don't buy stuff just to be able to say that I have it.

Whoever says that the 4090 is not enough, and that you definitely 100% need to swap it for a 5090 as soon as it's out, is lying to himself.
Legacy-ZA: Correct: I still classify myself as a customer, not a consumer. I have nothing to prove to anyone; I buy what I "need" and live my life simply. I don't need new clothes every week, or the latest car, or w/e.

I just hate being taken advantage of when I do need to make these purchases. That is what annoys me: the realization that I have to fork out money at idiotic price points because others have no self-control.
Our whole society is built around individuals with no self-control, unfortunately. I completely agree with you, though.
Posted on Reply
#19
JustBenching
AusWolf: If you're genuinely interested in how things work, that's cool. I do that myself. :)

But I don't need the latest and greatest to feel good about myself. With my gaming habits, I'm fine on mid-range. I don't buy stuff just to be able to say that I have it.

Whoever says that the 4090 is not enough, and that you definitely 100% need to swap it for a 5090 as soon as it's out, is lying to himself.
I think it depends on your gaming habits, honestly. For the games I'm playing, the 4090 is overkill, but for some people who play the latest unoptimized triple-A stuff at 4K and require 120 fps or something, I think even the 5090 won't be enough :D
Posted on Reply
#20
AusWolf
JustBenching: It's also practical; I'm running my 4090 locked to 320 W. If the review shows the 5090 being 50% faster than the 4090 while pulling 30% more power, I really have no idea how that translates to my use case. Is it going to be 10% faster at the same 320 W? Is it going to be 40%? How the hell am I supposed to wisely spend money if I don't know where the performance is coming from (brute-forcing power, etc.)?
You can only compare stock power-to-performance ratios and make an educated guess. No review is gonna test any card at your individually set power level.
Posted on Reply
#21
cerulliber
He says over 575 W. Those 12VHPWR connectors have a 600 W limit, plus 75 W from the PCIe slot, right? Perhaps they need two or three 12VHPWR connectors.
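
The connector math is quick to check. A minimal sketch, assuming the 600 W per-connector limit of 12VHPWR and the 75 W the PCIe slot can supply:

```python
# How many 12VHPWR connectors does a given board power need?
# 600 W per connector and 75 W from the PCIe slot are the spec limits.
import math

def connectors_needed(board_power_w: float,
                      per_connector_w: float = 600,
                      slot_w: float = 75) -> int:
    from_connectors = max(board_power_w - slot_w, 0)
    return math.ceil(from_connectors / per_connector_w)

print(connectors_needed(575))  # 1 -> one connector still covers 575 W
print(connectors_needed(700))  # 2 -> above 675 W a second one is needed
```

So a single 12VHPWR plus the slot covers up to 675 W on paper; only a card drawing beyond that would need a second connector.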
Posted on Reply
#22
Legacy-ZA
JustBenching: I think it depends on your gaming habits, honestly. For the games I'm playing, the 4090 is overkill, but for some people who play the latest unoptimized triple-A stuff at 4K and require 120 fps or something, I think even the 5090 won't be enough :D
Oh, how I wish developers would start optimizing their games again. Just look how fantastically DOOM ran on Vulkan; it can be done. But they use these technologies that nVidia offers as a crutch, and I absolutely despise them for it. :banghead:
Posted on Reply
#23
AusWolf
JustBenching: I think it depends on your gaming habits, honestly. For the games I'm playing, the 4090 is overkill, but for some people who play the latest unoptimized triple-A stuff at 4K and require 120 fps or something, I think even the 5090 won't be enough :D
Need is one thing. If you have the money, go for it. But coming to a forum and boasting "mwahaha, I'll buy this thing because I'm so awesome and also 'Murica" is just plain dumb.
Posted on Reply
#24
JustBenching
AusWolf: You can only compare stock power-to-performance ratios and make an educated guess. No review is gonna test any card at your individually set power level.
I'm not asking them to test at my individually set power level, but if the tiers remained at the same power, I could make an educated guess. E.g., the 4090 FE was pulling 360 W in TPU's review; if the 5090 FE also pulls 360 W, I can make an educated guess about my use case. If, on the other hand, the 5090 pulls 550 W, I'll have absolutely no clue how it would compare at iso-power to the 4090.
Posted on Reply
#25
Bomby569
Legacy-ZA: Oh, how I wish developers would start optimizing their games again. Just look how fantastically DOOM ran on Vulkan; it can be done. But they use these technologies that nVidia offers as a crutch, and I absolutely despise them for it. :banghead:
Doom is mostly a shooter on rails, or by sections; they couldn't pull that off in an open world, for example. Not to say they didn't do a great job, but not all games are equal.
Posted on Reply