Thursday, December 1st 2022

NVIDIA GeForce RTX 4090 with Nearly Half its Power-limit and Undervolting Loses Just 8% Performance

The monstrous NVIDIA GeForce RTX 4090 "Ada" graphics card has the potential to be a very efficient high-end graphics card with a daily-use undervolt and its power limit nearly halved, finds an undervolting review by Korean tech publication Quasar Zone. The reviewer tested the RTX 4090 at a number of GPU core voltage settings and at lowered software-level power limits (down from the 450 W default).

It's important to note that 450 W is a fairly arbitrary number for the RTX 4090's power limit; the GPU rarely draws that much in typical gaming workloads. Our own testing at stock settings puts its gaming power draw around the 340 W mark. Quasar Zone tested the RTX 4090 with a power limit as low as 60% (270 W). With the most aggressive power management they could muster (i.e., the 270 W power limit), the card was found to lose just around 8% of its performance at 4K UHD, averaged across five AAA games at maxed-out settings. The story is similar with the GPU undervolted to 850 mV, down from its 1 V stock. In both cases the performance loss appears well contained, while power draw (and in turn heat and noise) drops substantially.
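To put the headline figures in perspective, the performance-per-watt improvement can be estimated from the review's numbers. A minimal sketch: the measured draws (~347 W stock, ~268 W at the 60% power limit) are taken from the review's graphs as quoted in the comments below, and the ~8% loss is the 4K average; the exact gain will vary per game.

```python
# Rough sanity check of the efficiency math reported above.
# Approximate figures from the review: ~347 W measured at stock,
# ~268 W measured at the 60% power limit, with ~8% average
# performance loss at 4K.

def perf_per_watt(perf, watts):
    """Relative performance divided by measured power draw."""
    return perf / watts

stock = perf_per_watt(1.00, 347)   # stock settings
capped = perf_per_watt(0.92, 268)  # 60% power limit, ~8% slower

gain = capped / stock - 1
print(f"Efficiency gain at 60% power limit: {gain:.0%}")
# prints: Efficiency gain at 60% power limit: 19%
```

In other words, giving up roughly 8% of performance buys roughly a fifth more performance per watt, which is the trade-off the review highlights.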
Sources: VideoCardz, Quasar Zone

64 Comments on NVIDIA GeForce RTX 4090 with Nearly Half its Power-limit and Undervolting Loses Just 8% Performance

#1
Hyderz
that's... quite a lot less power for very little perf loss
#2
JAB Creations
Looks like Nvidia has maxed out the cheap approaches to gaining performance.
#3
Flydommo
Quite impressive. Nvidia could have developed smaller, more efficient and slightly less powerful graphics cards, but instead opted for a monstrous three-slot high-powered design. It's a shame they didn't opt for a leaner, more modern approach.
#4
phanbuey
yep... Im at full performance at 325W - to be fair though - I have a .950 UV and it does hit 340W in CB2077
#5
nguyen
phanbueyyep... Im at full performance at 325W
97% perf at 250W here :D
#6
MentalAcetylide
Which raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.
#7
Chaitanya
Der8aur already showed how 4090 was pushed beyond optimum efficiency on launch day itself.
MentalAcetylideWhich raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.
My cousin works as a GPU architect at NVIDIA and will be talking about this stupid decision when we meet in a few weeks' time.
#8
ratirt
MentalAcetylideWhich raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.
You have answered your own question. If these cards weren't pushed so far, they wouldn't require that much power; but since they are pushed that far and advertised as "that fast", they do.
#9
Renald
No shit, Sherlock... nothing new here.
Chasing the crown of "fastest GPU possible" has always had a huge impact on power consumption, especially on GPUs, all to claim the crown by a few percent over the competition.
#10
nguyen
MentalAcetylideWhich raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.
Every design choice is simpler when you release product after your competitor ;)
#11
MentalAcetylide
ChaitanyaDer8aur already showed how 4090 was pushed beyond optimum efficiency on launch day itself.


My cousin works as GPU architect in nVidia and will be talking about this stupid decision when we meet in few weeks time.
It seems like the bean counters in the marketing department have the louder voice in the company, if the architects find these specs just as absurd while Nvidia couldn't care less.
#12
InVasMani
MentalAcetylideWhich raises another question: Do these cards' dimensions really need to be so big?
The morons working at Nvidia's R&D department have it in their minds to make these products as big & power-hungry as possible for some other weird arcane reason. Sure, higher MSRPs and profit, but that alone doesn't make much sense to me.
R&D department isn't at fault for the CEO's direction. I'm sure their hands were tied.
#13
Space Lynx
Astronaut
InVasManiR&D department isn't at fault for the CEO's direction. I'm sure their hands were tied.
As much as I dislike Nvidia, CEO's hands are often tied too, to the shareholders. Nvidia needed to show they had the most badass card around by a large margin, to prove dominance to the shareholders.
#14
phanbuey
It's probably because not all the silicon can hit these clocks - im sure there are some doggo 4090s out there that need 400W for 2700mhz
#15
Space Lynx
Astronaut
phanbueyIt's probably because not all the silicon can hit these clocks - im sure there are some doggo 4090s out there that need 400W for 2700mhz
I just assumed bad silicon would have been sent down the line for the 4080 launch.
#16
phanbuey
Space LynxI just assumed bad silicon would have been sent down the line for the 4080 launch.
That's a different chip tho - there's not a harvested 4090 AFAIK. Ah wait nvm, im confused with the other 4080....
#17
fancucker
Their midrange and budget options are going to be very impressive efficiency wise
Hopefully AMD can deliver on their claims and combat them
#18
Space Lynx
Astronaut
fancuckerTheir midrange and budget options are going to be very impressive efficiency wise
Hopefully AMD can deliver on their claims and combat them
I think chiplet design gpu is going to surprise all of us. I expect the 7900 xtx will beat a 4080 in several games, just not all (no ray tracing on).
#19
PLAfiller
MentalAcetylideWhich raises another question: Do these cards' dimensions really need to be so big?
Bigger is better! - member of the marketing team :) Imagine a 4090 was the size of a low profile GTX1650. But it is 4090 and you are looking at your mega-build with a $600 motherboard and the biggest RGB ram sticks you can get....how would that make you feel? Your sound card and your RAM is bigger than your video card? It's like buying a Ford car with a 1.0 Eco engine....looks like it's built from lego bricks. :D
#20
the54thvoid
Super Intoxicated Moderator
Backwards move from Nvidia. It would have been preferable on practically every front to release at the lower power level with still incredible performance. They'd probably have priced lower too, though not by much I imagine.

Instead they've adopted the CPU strategy - max out the silicon at all costs for max clocks and performance.
#21
xorbe
This isn't news. The wide GPUs have always been like this, whereas the small GPUs definitely need the MHz cranked up.
#22
evernessince
The power consumption may be down to silicon quality variability and Nvidia wanting to ensure as many cards as possible can hit rated frequencies.
#23
Crackong
According to their graphs

PL 60 % = 268W
268/347 = 0.77

Undervolting
232/347 = 0.67

So the 【PL60%】 label is quite deceiving, when the card actually consumes 77% of the default power consumption.
60% should be ~210 W; instead it is 28% more than advertised.
#24
theGryphon
Yawn.

Look, 4090 was a necessary beast. Being the top dog serves so many purposes from brand image to corporate valuation.

Problem is, low-power designs get reserved for mobile solutions, whereas the desktop is left with power hogs. It's been too many years since we've seen a good <75 W, cordless, low-profile card from either NVIDIA or AMD. Such a shame, when they are very much able.
#25
N/A
CrackongAccording to their graphs

PL 60 % = 268W
268/347 = 0.77

Undervolting
232/347 = 0.67
PL 100% is 450W