Friday, January 3rd 2025

NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

According to two of the most accurate leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Previous rumors pointed to 600 W and 400 W TGPs; TGP stands for total graphics power, meaning the entire card, with its memory and everything else, draws that amount. TDP (thermal design power), by contrast, is a narrower value attributed to the GPU die of the SKU in question. According to the latest leaks, 575 W is dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while 25 W is left for the GDDR7 memory and other components on the PCB.

For the RTX 5080, the GB203-400-A1 chip supposedly draws 360 W on its own, while 40 W is set aside for the GDDR7 memory and other components on the PCB. The lower-end RTX 5080 thus budgets more power for memory than the RTX 5090 because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090's modules run at 28 Gbps. The RTX 5090 does carry more (or higher-capacity) modules, but first-generation GDDR7 could need disproportionately more power to hit the 30 Gbps mark, hence the larger memory budget. Future GDDR7 iterations should reach higher speeds without drawing much more power.
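If the figures are right, the implied budget is simply TGP = TDP + memory/board power. A quick sanity check of the leaked numbers (all rumored, none confirmed):

```python
# Rumored Blackwell power budgets: TGP (whole card) = TDP (GPU die) + rest of board.
# All figures below are leaked/rumored values, not official specifications.
cards = {
    "RTX 5090 (GB202-300-A1)": {"tdp_w": 575, "mem_board_w": 25},
    "RTX 5080 (GB203-400-A1)": {"tdp_w": 360, "mem_board_w": 40},
}

for name, b in cards.items():
    tgp = b["tdp_w"] + b["mem_board_w"]
    share = 100 * b["mem_board_w"] / tgp
    print(f"{name}: TGP = {tgp} W ({share:.1f}% for memory/board)")

# RTX 5090 (GB202-300-A1): TGP = 600 W (4.2% for memory/board)
# RTX 5080 (GB203-400-A1): TGP = 400 W (10.0% for memory/board)
```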
Sources: hongxing2020 and kopite7kimi, via VideoCardz

208 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

#176
Adrian Andrzejewski
AusWolfBut you don't game on your Sharp II Carousel microwave for several hours at a time. It also doesn't dump any extra heat into your PC case.

Just saying... ;)
You can probably play Doom on it.
Posted on Reply
#177
jesdals
HankierosemanMy Maytag Microwave is good for 1000 watts. My Lian-Li Edge PSU is good for 1300 watts. We will fear no GPU.
The difference between the men and the boys is the price of their toys. Saddle up kids. It's gonna be a rough ride.
But will you be able to run both before the fuses in your breaker box melt?
Posted on Reply
#178
AusWolf
DaworaGreat AC keeps the room temp just the same, so the impact on room temps is 0 °C because of that.

You don't know what AC is, right?
It's not a heavy metal band...
Dump 1000+ W from your PC into your room, then turn on your 2000 W AC to dissipate it. Way to go, champ! :rockout:
Posted on Reply
#179
cerulliber
BSim500As I said, people don't aim for 50% just for efficiency, but also to allow for transient spikes as well as the option to upgrade to the next GPU tier without needing to change PSUs later on.


I'll tell you when it gets released and reviewed. As with previous flagship GPUs, I can see existing 1000-1200 W PSU owners having no problems at all without needing a 2000 W PSU upgrade. Also, not sure where you got the "bias" from, as this is my first post in the thread. I'm not in the market for a "flagship" GPU myself, but I also see little value in arguing in pre-release threads vs. "just wait and see", then basing your build on actual measurements...
Of course they will need to upgrade, because of the 5090 Ti and whatever special cable the 6090 Ti will require /s
Posted on Reply
#180
resulnaki99
Legacy-ZAYes, that's why I also no longer buy OC models, unless they come with dual BIOS, or better VRMs/phases, etc. Makes no sense to pay more for what, a name and maybe 2% more performance? lol
I disagree with some of the comments here; I just signed up to mention this. I have the Suprim X 4090 (already OC), and the extra OC (20% on clock, 15% on mem) made a difference with a minimal TBP increase. I get about 10 extra FPS with max TBP only going from 450 to 480 W. I run it at the 120% max power limit. When you play at 4K, that's when the 4090 shines and the power difference comes into play. At anything less than 4K, the 4090 is overkill. CP2077, Jedi: Survivor, and A Plague Tale: Requiem are all power hungry, yet the max TBP I've seen is 480 W spikes, with a 400 to 420 W average.
Posted on Reply
#181
AusWolf
resulnaki99I disagree with some of the comments here; I just signed up to mention this. I have the Suprim X 4090 (already OC), and the extra OC (20% on clock, 15% on mem) made a difference with a minimal TBP increase. I get about 10 extra FPS with max TBP only going from 450 to 480 W. I run it at the 120% max power limit. When you play at 4K, that's when the 4090 shines and the power difference comes into play. At anything less than 4K, the 4090 is overkill. CP2077, Jedi: Survivor, and A Plague Tale: Requiem are all power hungry, yet the max TBP I've seen is 480 W spikes, with a 400 to 420 W average.
10 extra FPS on a 4090 at 120% power? That sounds like a total waste to me. No offense.

On a lower class card, the differences are even smaller.
Posted on Reply
#182
SIGSEGV
AusWolfFunny that back then people went "Nvidia is so much better because it's more efficient". And now "Nvidia is so much better because... um... 4090!!!" :laugh:
The people now really want AMD to seriously compete with Nvidia :roll:
IMO, the xx90 is marketed for gaming, but buying this card just for gaming is beyond clueless.
Posted on Reply
#183
AusWolf
SIGSEGVThe people now really want AMD to seriously compete with Nvidia :roll:
IMO, the xx90 is marketed for gaming, but buying this card just for gaming is beyond clueless.
Personally, I just want a decent mid-range card at a decent price. I remember when owning an x90 was a normal, everyday thing for normal, everyday monitor resolutions, but those days are long gone. Both performance and price have gone way overboard for me to be interested.
Posted on Reply
#184
Dawora
AusWolfDump 1000+ W from your PC into your room, then turn on your 2000 W AC to dissipate it. Way to go, champ! :rockout:
I have a new house, built in 2024; I can set each room's temperature individually, however I like. A good system, in winter or summer.

Winter costs a lot more than summer, but I really don't care anymore because I sold a lot of those XLM/XRP at +500% profit.
So I could buy 100x 5090s right now if the price is €2,490 each.
AusWolf10 extra FPS on a 4090 at 120% power? That sounds like a total waste to me. No offense.

On a lower class card, the differences are even smaller.
If you knew anything about OC, you'd know it's a 120% power limit slider.
It gives more headroom for OC.
Posted on Reply
#185
x4it3n
mechtechIf you can afford a 5090, you can afford a new PSU ;)
Agree! I bought a Seasonic PRIME TX-1600 Noctua Edition for the occasion! Lol.
Also, if the 5090 really has a 575 W TDP, then you can be sure that AIB OC models will get 2x 12V-2x6 connectors with a ~700 W BIOS (or more for overclocking).
Posted on Reply
#186
nguyen
x4it3nAgree! I bought a Seasonic PRIME TX-1600 Noctua Edition for the occasion! Lol.
Also, if the 5090 really has a 575 W TDP, then you can be sure that AIB OC models will get 2x 12V-2x6 connectors with a ~700 W BIOS (or more for overclocking).
There are 900 W XOC BIOSes for the 4090; my bet is that those people with burnt connectors flashed their 4090 with an XOC BIOS.

On the stock BIOS, my TUF 4090 already goes up to 600 W.
Posted on Reply
#187
x4it3n
cerulliberOf course they will need to upgrade, because of the 5090 Ti and whatever special cable the 6090 Ti will require /s
What is funny is that Nvidia pushed to get that 12VHPWR/16-pin connector out so people would only have one cable, but here we are with a 575 W GPU (almost at the 600 W max), and we're going to need two connectors & cables soon...
nguyenThere are 900 W XOC BIOSes for the 4090; my bet is that those people with burnt connectors flashed their 4090 with an XOC BIOS.

On the stock BIOS, my TUF 4090 already goes up to 600 W.
I would never install a non-original BIOS on my GPU, but yeah, my 4090 SUPRIM LIQUID X has a 530 W max TDP. I know the FE and ASUS cards have 600 W, but I didn't want that because of all the melting connectors. I bought a TX-1600 Noctua to get 2x 12V-2x6 (ATX 3.1 & PCIe 5.1) connectors & cables and get ready for the 5090.
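The appeal of dual connectors is easy to show; a rough sketch, assuming an even split across both plugs and using the rumored 575 W TDP plus the ~700 W AIB BIOS speculated above (neither figure is confirmed):

```python
# Per-connector load if a card splits its draw evenly across two 12V-2x6 plugs.
# 575 W is the rumored RTX 5090 TDP; 700 W is a speculated AIB OC BIOS limit.
RATED_W = 600  # 12V-2x6 rating per connector

for total_w in (575, 700):
    per_connector = total_w / 2  # assumes a perfectly even split
    utilization = 100 * per_connector / RATED_W
    print(f"{total_w} W card -> {per_connector:.0f} W per connector "
          f"({utilization:.0f}% of the 600 W rating)")

# 575 W card -> 288 W per connector (48% of the 600 W rating)
# 700 W card -> 350 W per connector (58% of the 600 W rating)
```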
Posted on Reply
#188
AusWolf
x4it3nWhat is funny is that Nvidia pushed to get that 12VHPWR/16-pin connector out so people would only have one cable, but here we are with a 575 W GPU (almost at the 600 W max), and we're going to need two connectors & cables soon...
Maybe they pushed so that people can get 1200 W through 2 cables instead of 300?
Posted on Reply
#189
x4it3n
AusWolfMaybe they pushed so that people can get 1200 W through 2 cables instead of 300?
The previous 8-pin connectors were rated for 150 W but could actually sustain 300 W each. The 12VHPWR, on the other hand, is rated for 600 W, but its max power draw before melting is about 630 W... definitely not as safe as the old 8-pin certification, for sure.
1200 W for a GPU would be nonsense, though. I hope we never get there! It's going to create so much heat...
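In headroom terms (a quick sketch from the numbers above; the ~630 W melting threshold is a community-reported figure, not an official spec):

```python
# Headroom factor = sustainable power / rated power, per the figures quoted above.
# The 630 W figure for 12VHPWR is the poster's claim, not an official spec.
connectors = {
    "8-pin PCIe": {"rated_w": 150, "sustained_w": 300},
    "12VHPWR":    {"rated_w": 600, "sustained_w": 630},
}

for name, c in connectors.items():
    factor = c["sustained_w"] / c["rated_w"]
    print(f"{name}: rated {c['rated_w']} W, headroom factor {factor:.2f}x")

# 8-pin PCIe: rated 150 W, headroom factor 2.00x
# 12VHPWR: rated 600 W, headroom factor 1.05x
```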
Posted on Reply
#190
AusWolf
x4it3nThe previous 8-pin connectors were rated for 150 W but could actually sustain 300 W each. The 12VHPWR, on the other hand, is rated for 600 W, but its max power draw before melting is about 630 W... definitely not as safe as the old 8-pin certification, for sure.
1200 W for a GPU would be nonsense, though. I hope we never get there! It's going to create so much heat...
I thought even 600 W was nonsense for a GPU and look where we are now (I still think it's nonsense, though).
Posted on Reply
#191
x4it3n
AusWolfI thought even 600 W was nonsense for a GPU and look where we are now (I still think it's nonsense, though).
Yeah, agreed. I remember when high-end GPUs had a 250 W TDP... and now it's 575 W. Just like Intel and their 13th/14th-gen CPUs that can reach ~300 W... This world is going nuts!
Posted on Reply
#192
cerulliber
AusWolfMaybe they pushed so that people can get 1200 W through 2 cables instead of 300?
My (game) theory is that the latest HEDT Seasonic PSUs have two 12VHPWR connectors to run 5090s in SLI/NVLink. It makes a lot of sense for r/LocalLLaMA and a few other niche communities, such as the Flux AI and AI video crowds.
There is basically no VRAM usage limit when running big models or AI image & video generation, and those dudes run a lot of GPUs in the same rig.
Posted on Reply
#193
AusWolf
cerulliberMy (game) theory is that the latest HEDT Seasonic PSUs have two 12VHPWR connectors to run 5090s in SLI/NVLink. It makes a lot of sense for r/LocalLLaMA and a few other niche communities, such as the Flux AI and AI video crowds.
There is basically no VRAM usage limit when running big models or AI image & video generation, and those dudes run a lot of GPUs in the same rig.
It looks like you'll need both of those connectors for a 5090, at least with some models. NVLink would need a second PSU (or one with 4 connectors). Even just the thought makes me shiver.
Posted on Reply
#196
x4it3n
AusWolfIt looks like you'll need both of those connectors for a 5090, at least with some models. NVLink would need a second PSU (or one with 4 connectors). Even just the thought makes me shiver.
4 connectors would be insane... even the Seasonic PX ATX 3 (2200 W) only has 2x 12V-2x6 connectors!

seasonic.com/atx3-prime-px-2200/
Posted on Reply
#197
AleXXX666
AusWolfSo the 5090 has 25 W for the VRAM and other components, but the 5080 has 40 W? That doesn't compute, even with the 28/30 Gbps speed difference. The 5090 has double the VRAM chips (or density?), a much more complex PCB and a much beefier power delivery.
IDC, I'd just say that 360 W is a "nearly" "acceptable" TDP, but 575 W is nonsense.
Posted on Reply
#198
igormp
cerulliberMy (game) theory is that the latest HEDT Seasonic PSUs have two 12VHPWR connectors to run 5090s in SLI/NVLink. It makes a lot of sense for r/LocalLLaMA and a few other niche communities, such as the Flux AI and AI video crowds.
There is basically no VRAM usage limit when running big models or AI image & video generation, and those dudes run a lot of GPUs in the same rig.
Nvidia did away with NVLink on the 4000 series; I doubt the 5090 will have it.
But yeah, running 2 of those with PCIe 5.0 is gonna be really nice and still really useful for the cases you mentioned.
Posted on Reply
#200
x4it3n
friction_point5090 feels slow already. I need to go faster.
Let's wait and see at tomorrow's reveal... but I'm betting on +50% in raster & +60-70% in RT/PT!
Posted on Reply