
NVIDIA GeForce RTX 5090 Features 575 W TDP, RTX 5080 Carries 360 W TDP

Oh, how I wish developers would start optimizing their games again. Just look how fantastically DOOM ran on Vulkan; it can be done. But they use these technologies that nVidia offers as a crutch, and I absolutely despise them for it. :banghead:

Doom is mostly a shooter on rails, or divided into sections; they couldn't pull that off in an open world, for example. Not to say they didn't do a great job, but not all games are equal.
 
I'm not asking them to test at my individually set power level, but if the tiers remained at the same power, I could make an educated guess. E.g. the 4090 FE was pulling 360 W in TPU's review; if the 5090 FE also pulls 360 W, I can make an educated guess about my use case. If, on the other hand, the 5090 pulls 550 W, I'll have absolutely no clue how it would compare to the 4090 at iso power.
Ah, I get 'ya. To me, that's just Nvidia not having a clue how to extract more performance out of their GPU architecture other than cramming more parts in it and raising power. I don't know how long this can continue.

Doom is mostly a shooter on rails, or divided into sections; they couldn't pull that off in an open world, for example. Not to say they didn't do a great job, but not all games are equal.
Why not?
 
The 4090 also had a max TGP of 600 W, so the 5090 will probably draw the same amount of power or slightly more. Maybe AIB 5090s will use around 520 W at stock.
Can no doubt power limit to 400W with minimal FPS loss.
 
I'm not asking them to test at my individually set power level, but if the tiers remained at the same power, I could make an educated guess. E.g. the 4090 FE was pulling 360 W in TPU's review; if the 5090 FE also pulls 360 W, I can make an educated guess about my use case. If, on the other hand, the 5090 pulls 550 W, I'll have absolutely no clue how it would compare to the 4090 at iso power.
Or set both the 4090 and 5090 to 360 W and compare the performance delta.
Also set both to the 4090's maximum power limit and see how far the advancement has gone, and whether there are any efficiency gains at the same power draw.
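A minimal sketch of that kind of iso-power comparison; the FPS figures are placeholders to show the arithmetic, not measurements:

```python
# Rough sketch of an iso-power comparison: cap both cards at the same
# power limit, take average FPS, and compare.
# All numbers below are placeholders, not real benchmark results.

power_cap_w = 360            # same limit applied to both cards

fps_4090_at_cap = 100.0      # hypothetical average FPS at 360 W
fps_5090_at_cap = 130.0      # hypothetical average FPS at 360 W

uplift = fps_5090_at_cap / fps_4090_at_cap - 1
print(f"Iso-power ({power_cap_w} W) uplift: {uplift:.1%}")

# At a shared power cap, FPS per watt moves exactly as much as FPS does,
# so the efficiency gain equals the performance gain.
print(f"Efficiency: {fps_4090_at_cap / power_cap_w:.3f} vs "
      f"{fps_5090_at_cap / power_cap_w:.3f} FPS/W")
```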
 
He says over 575 W. Those 12VHPWR connectors have a 600 W limit, plus 75 W from the PCIe slot? Perhaps they need two or three 12VHPWR connectors.
Can multiple 12VHPWR connectors be used in parallel? I haven't seen this discussed yet but it's becoming a necessity. Given that they're "smart" in a sense, with signaling to communicate available power etc., it might not be trivial.
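For a back-of-the-envelope check, a single connector plus the slot already covers the rumored TDP at stock; a quick sketch using only the limits quoted above (600 W per 12VHPWR, 75 W from the slot):

```python
# Quick power-budget check using the limits quoted above:
# one 12VHPWR connector is rated for 600 W, the PCIe slot for 75 W.
CONNECTOR_LIMIT_W = 600
SLOT_LIMIT_W = 75
TDP_W = 575

def board_budget(num_connectors: int) -> int:
    """Total power available from the slot plus N 12VHPWR connectors."""
    return num_connectors * CONNECTOR_LIMIT_W + SLOT_LIMIT_W

for n in (1, 2):
    total = board_budget(n)
    print(f"{n} connector(s): {total} W budget, {total - TDP_W:+} W headroom vs {TDP_W} W")
```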
 
On AMD, TDP is total card power.
Not on RDNA2, I'm afraid. I measured my PC one day, and whilst the OSD showed me 185-ish watts on my 6700 XT, the wall sensor clearly showed that consumption went up by about 220 watts. Can't say it's a big deal though; the main problem of this GPU is how weak it is. There's an ongoing "we need more VRAM" fiesta all around, and I'm clearly 3+ GB away from maxing it out, but framerates already cease to be. Wouldn't have minded having about double the performance at 12 GB, if I'm being honest with you.

Anyhow, these 360 and 575 W figures feel predictable. What remains unclear is whether there are any IPC improvements. NV has no reason to improve it, but then they were in no rush to discount Ada with better price/performance Super SKUs either. Ada sold fine regardless.
 
Can no doubt power limit to 400W with minimal FPS loss.

My 4090 runs at stock clocks with a 110 mV undervolt, resulting in stock performance at only 300-330 W. If the sweet spot for the 5090 is at 400 W, it would not be a problem for my 850 W PSU.
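As a rough illustration of why an undervolt like that saves so much, dynamic power scales roughly with V² at a fixed clock. A minimal sketch; the stock voltage and stock power below are assumptions for illustration, not measured values:

```python
# Rough estimate of the savings from an undervolt at fixed clocks, using the
# simple dynamic-power relation P ~ V^2 * f (frequency held constant).
# The stock voltage and stock power below are assumptions for illustration;
# real savings can be larger since the card doesn't always sit at its cap.

stock_voltage_v = 1.05       # assumed stock core voltage
stock_power_w = 450          # assumed stock board power under load
undervolt_v = 0.110          # 110 mV undervolt, as in the post above

scale = ((stock_voltage_v - undervolt_v) / stock_voltage_v) ** 2
print(f"Estimated power after undervolt: {stock_power_w * scale:.0f} W "
      f"({1 - scale:.0%} lower than stock)")
```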
 
I miss the days when we were a small handful of "lepers", when nerd/geek was a curse word. I don't want to share our GPUs with people that treated us badly at one point or another. :p
An accidental geek at that, but with a mechanical background and an electrical education, the hardest part was learning software. I was almost 50 before I got my first computer. When I was growing up, electronics was a portable AM radio. Now the curse word is "liberal".
 
Not on RDNA2, I'm afraid. I measured my PC one day, and whilst the OSD showed me 185-ish watts on my 6700 XT, the wall sensor clearly showed that consumption went up by about 220 watts. Can't say it's a big deal though; the main problem of this GPU is how weak it is. There's an ongoing "we need more VRAM" fiesta all around, and I'm clearly 3+ GB away from maxing it out, but framerates already cease to be. Wouldn't have minded having about double the performance at 12 GB, if I'm being honest with you.
RDNA 2 doesn't report total board power through software, only GPU chip power. Even though AMD advertises total board power as TDP, you won't see it anywhere on RDNA 2. RDNA 3 does report it, though.

So the numbers you're seeing are correct.

I agree with you on the VRAM argument. I feel like 12 GB is decent for my 6750 XT, at least I don't need to be afraid of running out of it before I run out of GPU grunt.
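If you want a ballpark total board power on RDNA 2 anyway, one crude approach is to add a fixed overhead to the software reading; the 35 W below is just eyeballed from the ~185 W OSD reading vs ~220 W at-the-wall delta mentioned above (which also includes PSU losses), not an AMD spec:

```python
# RDNA 2 sensors report GPU chip power only, not total board power.
# One crude way to estimate board power is to add a fixed overhead for VRAM,
# VRM losses and fans. The 35 W overhead is eyeballed from the ~185 W OSD
# reading vs ~220 W at-the-wall delta mentioned in this thread, not a spec.

reported_chip_power_w = 185      # what the OSD shows on a 6700 XT
assumed_overhead_w = 35          # rough non-reported board overhead

print(f"Estimated total board power: ~{reported_chip_power_w + assumed_overhead_w} W")
```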
 
Can multiple 12VHPWR connectors be used in parallel? I haven't seen this discussed yet but it's becoming a necessity. Given that they're "smart" in a sense, with signaling to communicate available power etc., it might not be trivial.
There are two versions of the Galax 4090 HOF using two PCIe Gen 5 16-pin connectors.
We don't know what PNY is cooking.
 
Is that the maximum TGP or the default? The RTX 4090 has 600 W as the maximum and 450 W as the default. Well, it will be revealed soon enough.

And that PCB shot looks like it's been taken from a PC Partner card (Zotac et al.; see the Zotac RTX 4090 AMP review), so probably not from the NVIDIA reference design.
 
So the 5090 has 25 W for the VRAM and other components, but the 5080 has 40 W? That doesn't compute, even with the 28/30 Gbps speed difference. The 5090 has double the VRAM chips (or density?), a much more complex PCB and a much beefier power delivery.

Power draw scales superlinearly with clock speed (roughly with frequency times voltage squared), not linearly. Plus it's likely that nVidia is going to be binning the absolute best of the best in terms of efficiency for the 5090 (given the fact that it is literally 2x a 5080, this is kind of mandatory) while leaving the "lesser" GPU/VRAM chips for the 5080.
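A toy sketch of that relation (P roughly proportional to f·V²); the voltage/frequency points are invented for illustration, not real Blackwell or Ada V/F curves:

```python
# Toy illustration of the f * V^2 relation: pushing clocks usually means
# pushing voltage too, so power rises much faster than frequency.
# The voltage/frequency points below are invented, not real V/F curves.

baseline = {"freq_ghz": 2.5, "voltage_v": 1.00, "power_w": 400}

def scaled_power(freq_ghz: float, voltage_v: float) -> float:
    """Scale baseline power by the dynamic-power relation P ~ f * V^2."""
    f_ratio = freq_ghz / baseline["freq_ghz"]
    v_ratio = voltage_v / baseline["voltage_v"]
    return baseline["power_w"] * f_ratio * v_ratio ** 2

# A ~10% clock bump needing ~7% more voltage costs ~26% more power.
print(f"{scaled_power(2.75, 1.07):.0f} W at 2.75 GHz / 1.07 V")
```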
 
Power draw scales superlinearly with clock speed (roughly with frequency times voltage squared), not linearly. Plus it's likely that nVidia is going to be binning the absolute best of the best in terms of efficiency for the 5090 (given the fact that it is literally 2x a 5080, this is kind of mandatory) while leaving the "lesser" GPU/VRAM chips for the 5080.
The 5080 is an entirely different chip. It's based on the GB203, while the 5090 is on the GB202.

Loool c'mon now. It's because of capitalism that you are able to buy what you want/need.
Because you couldn't buy what you wanted before capitalism was invented? C'mon now... ;)
 
What's more, the 5090 won't use the fully enabled chip, either. A 5090 Ti coming later, perhaps? Or will it be reserved for professional cards?
 
Who's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43 °C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 Ti Super atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
 
Who's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43 °C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 Ti Super atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
I'd buy a 2,000-watt GPU, don't really care. To me it's like asking "who's gonna buy a TV that comes with 100% brightness out of the box". I don't care that much; I'll just lower the brightness, the same way I'll lower the power draw on the GPU.

Your AC is a good example. Didn't you change its settings? Did you run your AC out of the box?
 
Who's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43 °C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 Ti Super atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
It's totally excessive and unnecessary, just like a 4090 is for most people.
 
Who's gonna buy a 575 W GPU unless they live in Iceland or Greenland? Imagine gaming in summer when it's +43 °C outside, which is like every year now o_O I'm not gonna overload my AC just to game.
Well, I might get one just for the winter months, as I'm heating my PC room with a 600 W IR panel + 4070 Ti Super atm. Maybe I could replace the IR heater with a 5090 :rolleyes:
The 5090 is almost double the power usage of my 7900XT which already heats up my tiny gaming room. The room would be hotter than a sauna with the 5090!
 