Friday, January 3rd 2025
NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP
According to two of the most reliable leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Earlier rumors pointed at 600 W and 400 W TGPs (total graphics power), a figure that covers the entire card, including its memory and everything else on the board. TDP (thermal design power), by contrast, is a more specific value attributed to the GPU die of the SKU in question. According to the latest leaks, 575 W are dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while 25 W go to GDDR7 memory and other components on the PCB.
For the RTX 5080, the GB203-400-A1 chip is supposedly drawing 360 W of power alone, while 40 W are set aside for GDDR7 memory and other components on the PCB. The lower-end RTX 5080 sets aside more power for memory than the RTX 5090 because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090 uses GDDR7 modules at 28 Gbps. The RTX 5090 does use more modules, or higher-capacity ones, but first-generation GDDR7 memory could require more power to reach the 30 Gbps threshold, hence the larger memory budget. Future GDDR7 iterations could reach higher speeds without much additional power.
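The leaked figures line up neatly with the earlier TGP rumors: the die TDP plus the memory/board budget sums to the previously rumored total graphics power for both SKUs. A minimal sketch of that arithmetic, using only the numbers from the leak:

```python
# Leaked power budgets: TGP (total graphics power) is the die TDP plus the
# budget for GDDR7 memory and other board components.
leaks = {
    "RTX 5090 (GB202-300-A1)": {"die_tdp_w": 575, "board_other_w": 25},
    "RTX 5080 (GB203-400-A1)": {"die_tdp_w": 360, "board_other_w": 40},
}

for sku, p in leaks.items():
    tgp = p["die_tdp_w"] + p["board_other_w"]
    print(f"{sku}: TDP {p['die_tdp_w']} W + memory/board {p['board_other_w']} W = TGP {tgp} W")
```

This reproduces the 600 W and 400 W totals from the earlier rumors.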
Sources:
hongxing2020 and kopite7kimi, via VideoCardz
207 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP
I've had Titans, and X090s at home. Don't game on them. A few odd rounds of Quake 1 maybe. I do have a Switch I game on, a PS5 I game on, a Macbook Pro I game on, and a gaming PC that only has an X080 series that I rarely game on. The consoles get the most time.
There's a narrow margin of people with the money to splurge on these things for gaming and e-peen. Steam's hardware survey puts the 4090 at 1.16%, which is actually pretty high compared to past prosumer cards. Which tells you that, of the volume sold, most of them aren't touching a game.
Basically, PC gaming is a 1080p, 8 GB-or-less VRAM affair, with lots of mid-range cards that get upgraded regularly and a bunch of X080 and X070 cards that get upgraded less often. The X090 cards are professional cards that people in rare, extreme cases do buy and game on, but most never game at all. CUDA is where it's at for NVIDIA.
1.16% of all surveyed GPUs proves people are indeed buying these to game on.
Most may do pro work, but clearly plenty were bought to game on, as you inadvertently pointed out?!?
You may want to reserve 100 W for the AIO, fans, HDDs, etc., and an extra 100 W of reserve on top.
But of course you can optimize power consumption and use a lower-wattage PSU.
I know it's not 100% the same, but the 7800X3D in my system will never ever consume more than 100 Watts. It does 80 at full load. 90 if I enable EXPO (I don't because I just don't give a sh** about memory performance). Even if I overclocked it, it wouldn't gain more than 20 extra Watts max. Fans, pumps, and HDDs consume maybe 5 W each. How many of them do you have?
It needs a really expensive Platinum-certified 2000W PSU in order to live safely with the power spikes.
But it will still fail because of the low-quality power connector.
Solution: hell, don't buy!
Also, I don't understand this 100 W reserve. What is its purpose? To account for what?
We already took the whole system into account; adding a further tolerance would only decrease PSU efficiency, since it would push the PSU out of its optimal load range.
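The efficiency argument above can be illustrated with a toy model. This is a sketch with a made-up efficiency curve (not measured data), just to show why oversizing a PSU beyond a worst-case system figure can push typical load into a less efficient region; the wattages are assumptions:

```python
# Toy PSU efficiency model: efficiency typically peaks around ~50% load,
# so a large extra "reserve" on top of a worst-case system figure pushes
# typical draw into a lower-efficiency region of the curve.
def efficiency(load_fraction):
    # Made-up curve peaking at 50% load; real curves come from PSU reviews.
    return 0.94 - 0.25 * (load_fraction - 0.5) ** 2

system_peak_w = 800           # assumed worst-case whole-system draw
typical_w = system_peak_w * 0.6  # assumed typical gaming draw

for psu_w in (1000, 1200, 1600):
    frac = typical_w / psu_w
    print(f"{psu_w} W PSU: typical load {frac:.0%}, model efficiency {efficiency(frac):.1%}")
```

In this toy model, the larger PSUs land the typical load further below the 50% sweet spot, costing a little efficiency, which is the commenter's point.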
Include the performance part of the efficiency equation and it turns out Ada is the most efficient GPU architecture ever.
Citation:
www.techpowerup.com/review/msi-geforce-rtx-4090-gaming-x-trio/38.html See the chart linked above.
In my previous build I had a 56-core/112-thread CPU with an ASUS ROG RTX 4090 OC.
With this 64-core build, I'm waiting for the new NVIDIA cards. Back to a 512-bit bus after many, many years. :)
And yet, go back to the 1080 Ti reviews and there are people there whining about Pascal's power use, especially at higher core speeds, and how NVIDIA is throwing power out the window.
When it comes to Watts per frame, yeah, NVIDIA is far and away in the lead. And for the record, it wasn't just efficiency with Maxwell and Pascal; it was efficiency, way better drivers, and actual performance, while AMD was busy milking GCN 1 for the third time or not bothering to go beyond the 1060's performance. AMD genuinely sucked mid-decade. We're already GPU-bound at 4K, and even 1440p today, with the 4090, especially with RT.
No, you don't. This is FUD. The vast majority of older quality supplies can handle spikes without issue, and Ada's spikes are nowhere near as bad as Ampere's.
Platinum PSUs are not even that expensive anymore. The Titaniums are the wallet breakers. Something tells me you were never in the market for a 5090 in the first place.
D5 Next pump = 25 W.
Fans might vary; some systems have 3, some have 12, etc.
Also, your average power consumption will be much lower. IMHO, what matters when choosing a PSU is peak power consumption.
50% PSU usage: is this a myth, or unnecessary bias left over from an older era?
BTW, it seems you're against a CPU at 176 W but have nothing against a GPU at 575 W :):p
But 25 W on an HDD at startup? Nah, I call bullshit on that. Do you have a source?
No offence, but some of us seem very biased. If you don't agree with 176 W for the 9800X3D, you'll have 176 W for the 9950X3D anyway.
So what PSU does one really need for a 575 W 5090 at 75% load?
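Taking that question at face value, here is a back-of-the-envelope sizing sketch. The 575 W figure is from the leak; the CPU and "everything else" budgets are assumptions for illustration, not a recommendation:

```python
# Rough PSU sizing: sum assumed component budgets, then divide by the
# target load fraction so peak draw sits at ~75% of the PSU rating.
gpu_w = 575    # leaked RTX 5090 TDP
cpu_w = 150    # assumed high-end CPU under gaming load
rest_w = 100   # fans, pump, drives, motherboard (assumed, per the thread)

total_w = gpu_w + cpu_w + rest_w
target_load = 0.75
psu_w = total_w / target_load
print(f"Peak system draw ~{total_w} W -> PSU of at least {psu_w:.0f} W at 75% load")
```

Under these assumptions the math lands at roughly 1100 W, i.e. a 1200 W unit from the usual retail tiers.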
Again, people voting with their asses instead of brains/wallets. "It's the fastest crap in the world, so I must have it" - bah. (I'm not talking about people doing professional work on their GPU, of course)
So the impact on room temps is 0 °C because of that.
You don't know what AC is, right?
It's not a Hevy band.. You should know energy efficiency and power usage are two different things.
Maxwell and Pascal are just bad..
The 4000 series is the most energy-efficient GPU lineup there is at the moment.
Power usage combined with performance = energy efficiency.
If a GPU uses 300 W or more, that doesn't mean it can't be energy efficient.
The RTX 4090, a 350+ W GPU, is more efficient than the RTX 3050.
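The point being made is that efficiency is performance per watt, not raw draw. A minimal sketch with made-up frame rates and the wattages from the comment (the FPS figures are assumptions for illustration only):

```python
# Efficiency = performance / power. A high-draw card can still be the more
# efficient one if it delivers proportionally more frames.
cards = {
    "RTX 4090 (assumed)": {"fps": 140, "power_w": 350},
    "RTX 3050 (assumed)": {"fps": 35,  "power_w": 130},
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['power_w']:.2f} frames per watt")
```

With these illustrative numbers, the 350 W card delivers more frames per watt than the 130 W one, which is exactly the commenter's claim.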