Friday, January 3rd 2025
NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP
According to two of the most accurate leakers, kopite7kimi and hongxing2020, NVIDIA's GeForce RTX 5090 and RTX 5080 will feature 575 W and 360 W TDPs, respectively. Previous rumors pointed to 600 W and 400 W TGPs for these SKUs; TGP (total graphics power) describes how much the entire card draws, memory and all other board components included, whereas TDP (thermal design power) is a more specific value attributed to the GPU die of the SKU in question. According to the latest leaks, 575 W is dedicated to the GB202-300-A1 GPU die in the GeForce RTX 5090, while the remaining 25 W goes to the GDDR7 memory and other components on the PCB.
For the RTX 5080, the GB203-400-A1 chip supposedly draws 360 W on its own, while 40 W is set aside for the GDDR7 memory and other components on the PCB. The lower-end RTX 5080 reserves more power for its memory than the RTX 5090 does because its GDDR7 modules reportedly run at 30 Gbps, while the RTX 5090's run at 28 Gbps. The RTX 5090 admittedly uses more modules, or higher-capacity ones, but first-generation GDDR7 memory could require disproportionately more power to reach the 30 Gbps threshold, hence the larger memory power budget. Future GDDR7 iterations could reach higher speeds without much additional power.
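To make the rumored numbers concrete, the split works out as below; a minimal sketch assuming the leaked figures above (the dictionary layout and labels are just illustrative):

    # Minimal sketch of the rumored power split; all wattages are the leaked
    # figures quoted above, and the labels are illustrative, not official.
    rumored = {
        # card: (die power in watts, memory + board power in watts)
        "RTX 5090 (GB202-300-A1)": (575, 25),
        "RTX 5080 (GB203-400-A1)": (360, 40),
    }

    for card, (die_w, board_w) in rumored.items():
        tgp = die_w + board_w  # TGP = die power + memory/board power
        print(f"{card}: {die_w} W die + {board_w} W board = {tgp} W TGP")

Run as-is, this prints 600 W and 400 W totals, i.e. exactly the TGP figures from the earlier rumors.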
Sources:
hongxing2020 and kopite7kimi, via VideoCardz
207 Comments on NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP
Think about it this way - is something like a Porsche 911 GT3 RS a fast car, the “best” 911 on offer? Yup. Is it a full-on race car for professionals, though? No, it’s still a street-legal vehicle. Does it make sense to buy it for daily driving? Not particularly. Can an average customer with the money to buy one even extract its full potential? Fuck no. Does Porsche really care what you buy it for, or what justifies its existence? Again, no. Because that way lies madness, and one could conclude that the only people who SHOULD buy one are already incredibly good drivers using it on track days. This logic might even make sense, but it isn’t how the world works.
For AI, ML, DL, or CUDA, GeForce is where it's at. Even at the prices the X090 cards (or the Titans before them) sell at, slapping four of them in a box and going from there is cheap and economical. Some systems pack six or eight X090 cards. Freelancers, contractors, businesses, universities, and more go out and buy gobs of these cards, often in premade systems and racks, because it's cheap.
What justifies the existence of these cards with the specs they have isn't gaming performance, just as the export bans on them aren't out of fear that China will stomp people in CS2 or Overwatch. It's their actual intended role as compute cards.
CUDA changed everything. Geforce is not a gaming brand and has not been for a good long time. It's a compute brand. I fully expect the higher end of the range to keep running away in price and performance in the future as there is still tons more performance wanted and tons more money people are willing to spend. If you doubled the price for 25% more perf it would still sell out.
Also, shader core count is not everything. Vega 64 and the 5700 XT had the same ROP count. Vega 56, despite being cut down, was within margin of error of Vega 64, suggesting Vega 64 was ROP-limited. Even Vega 56 had issues: the HBM's latency was too high to let the GPU fully stretch its legs.
In short: not an arch issue, but multiple design issues that gimped the cards' actual potential. Much like the 7900 XTX, Vega 64 should have been a tier higher given its core size and power use, but was held back by design issues.
I'm just saying it's a bit too expensive to be used as a paperweight (or a gaming card), imo.
Now I’m back down to 850W
But the kilowatt-plus PSU is pretty much becoming mandatory…
Next year I'm going for aviator oxygen and a leather jacket.
This thing is much faster than 20-30% over the 4090, if that's what you mean?
And I'm sure you'll pass on it anyway and go buy another AMD low/mid-tier GPU. You have a very, very poor AC if an extra 575 W overloads it.
We have 4 PCs, all 300 W+, and room temps aren't even rising.
And you know, it's not drawing 575 W just sitting in your PC; you need to stress it to 100%.
9800X3D: 176 W
RTX 5090: 575 W
Everything else in the case: 100 W
Reserve: 100 W
Total: 951 W
2 x 951 W = 1902 W
Where's that 2 kW PSU? And we're not even talking about SLI/NVLink setups.
Optimized setup for the RTX 5090: ~430 W at a 75% power limit + 176 W CPU + 200 W for everything else = 806 W; times two is roughly a 1600 W PSU, and those you can actually find in retailers' shops.
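To sanity-check that arithmetic, here is a minimal sketch of the sizing rule these posts are using (roughly double the estimated peak draw so the PSU sits near 50% load); the wattages are the ones quoted above, and the function name is just illustrative:

    # Minimal PSU-sizing sketch of the rule of thumb used above: size the PSU
    # at roughly 2x the estimated peak draw so it sits near 50% load.
    # All figures are from the posts above; the function name is illustrative.

    def recommended_psu_watts(loads_w, headroom_factor=2.0):
        """Sum the component draws and apply a headroom multiplier."""
        total = sum(loads_w.values())
        return total, total * headroom_factor

    stock = {"9800X3D": 176, "RTX 5090": 575, "rest of system": 100, "reserve": 100}
    tuned = {"9800X3D": 176, "RTX 5090 @ 75% limit": 430, "rest of system": 200}

    for name, loads in (("stock", stock), ("tuned", tuned)):
        total, psu = recommended_psu_watts(loads)
        print(f"{name}: {total} W load -> ~{psu:.0f} W PSU")
    # stock: 951 W load -> ~1902 W PSU
    # tuned: 806 W load -> ~1612 W PSU (a retail 1600 W unit is close enough)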
I've not really touched my "luxury spending" account for over a year, so am tempted to get the 5090 if it's less than £2k...
I've been on 1600p for 13 years, so am eyeing up those 4K 240 Hz OLEDs. Worried about that sweet burn-in, and about Nvidia vendor-locking DLSS 5 to the 6* series. Imagine spending £2k for that to happen... Will probably do some research after it comes out, second-guess myself, then not spend anything for another year :laugh:
Something about pots and kettles. You sort of spent the first page harassing someone for being "a slave to buying things", followed by passive-aggressive remarks about said slavery and America...
Also, why would you reserve 200 W for the rest of the system? That's a lot; in gaming scenarios the whole rest of it should be under 40-50 W.
700 W total is a reasonable system power estimate.
If you want maximum efficiency, the ideal load range is 40 to 60%, and you can still go a bit over without much loss.
With this in mind, a 1200 W PSU is already more than enough, and even a barely decent 850 W unit would probably do the trick. The PSU's power rating doesn't need to be divided by three; there's already a safety margin built in.
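As a quick check of that claim, here is a minimal sketch showing where a 700 W load lands on each candidate unit (the 40-60% band comes from the post above; the candidate wattages are illustrative picks):

    # Quick check of the PSU loading argument above: where does a ~700 W
    # system land on each candidate unit? The 40-60% "sweet spot" band
    # comes from the post; the candidate wattages are illustrative.

    SYSTEM_LOAD_W = 700

    for psu_w in (850, 1200, 1600):
        load_pct = 100 * SYSTEM_LOAD_W / psu_w
        in_sweet_spot = 40 <= load_pct <= 60
        print(f"{psu_w} W PSU: {load_pct:.0f}% load"
              f" ({'in' if in_sweet_spot else 'outside'} the 40-60% band)")
    # 850 W PSU: 82% load (outside the 40-60% band) -- works, but a bit over
    # 1200 W PSU: 58% load (in the 40-60% band)
    # 1600 W PSU: 44% load (in the 40-60% band)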
Let's be clear once again: I do not have a problem with anyone who wants a 5090. Just don't shove it under my nose out of a misguided superiority/inferiority complex, will 'ya? :)
The GPU you buy does not make you better or worse. It's just an object. A piece of silicon, metals and plastic. Using it for bragging rights implies a very sad way of existence.