Thursday, December 1st 2022

NVIDIA GeForce RTX 4090 with Nearly Half its Power-limit and Undervolting Loses Just 8% Performance

The monstrous NVIDIA GeForce RTX 4090 "Ada" has the potential to be a very efficient high-end graphics card with a daily-use undervolt and a nearly halved power limit, finds an undervolting review by Korean tech publication Quasar Zone. The reviewer tested the RTX 4090 at a number of GPU core voltage settings and with lowered software-level power limits (down from its 450 W default).

It's important to note that 450 W is a fairly arbitrary number for the RTX 4090's power limit; the GPU rarely draws that much power in typical gaming workloads. Our own testing at stock settings sees its gaming power draw hover around the 340 W mark. Quasar Zone tested the RTX 4090 with a power limit as low as 60% (270 W). With the most aggressive power management it could muster (i.e., the 270 W power limit), the card was found to lose only around 8% of its performance at 4K UHD, averaged across five AAA games at maxed-out settings. The story is similar with undervolting the GPU to 850 mV, down from its 1 V stock. In both cases the performance loss appears well contained, while delivering a sizeable reduction in power draw (and, in turn, heat and noise).
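For readers who want to try a similar software power-limit reduction, here is a minimal sketch using the NVIDIA Management Library via the nvidia-ml-py (pynvml) Python bindings. The 270 W target simply mirrors the review's lowest 60% setting; applying the limit generally requires administrator/root privileges, and this is an illustration rather than the exact tooling Quasar Zone used.

```python
# Minimal sketch: lowering the software power limit of the first GPU via NVML.
# Assumes the nvidia-ml-py (pynvml) package and the NVIDIA driver are installed;
# values are illustrative (270 W mirrors the review's 60% power-limit setting).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Query the driver-allowed range so we don't request an out-of-bounds value.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = 270_000  # NVML works in milliwatts, so 270 W = 270,000 mW

if min_mw <= target_mw <= max_mw:
    # Requires elevated privileges; raises an NVMLError otherwise.
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
    print(f"Power limit set to {target_mw / 1000:.0f} W")
else:
    print(f"270 W is outside this card's allowed range "
          f"({min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

pynvml.nvmlShutdown()
```

Note that a limit set this way typically does not persist across reboots, so it would usually be reapplied from a startup task or through a tuning utility such as MSI Afterburner.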
Sources: VideoCardz, Quasar Zone

64 Comments on NVIDIA GeForce RTX 4090 with Nearly Half its Power-limit and Undervolting Loses Just 8% Performance

#51
spnidel
N/Aclearly the 4090 is exactly 450 watts when properly loaded.
477 watts with ray tracing on.
277 watts with RT on plus DLSS only because of one hell of a CPU bottleneck; it should be avoided as an example.
no no no, the proper way of testing the 4090's efficiency would be running games at a resolution of 800x600, with render scale set to lowest, and all the settings to minimum :^)
Posted on Reply
#52
ARF
dj-electricI'm surprised this is news. This data was available on launch day; people didn't seem to care then.

For a 10% performance loss, you cut its PL from 450 W to 300 W.
This is where you lost your RTX 4080 at 300 W.
Instead, you received a stupid RTX 4090 with an insane overclock and stupid cooling, and not water-cooled custom cards as higher-tier models offering up to 15% higher performance.

Smart engineering would be a 300 W reference RTX 4090, and a 450 W super-overclocked, water-cooled RTX 4090 Ti.
Posted on Reply
#53
EatingDirt
I don't know why any of this is news. This is the case for almost 100% of PC CPUs & GPUs on the market today, with the exception of low-power laptop chips. Most GPUs & CPUs can be undervolted by some percentage and still perform within 5-10% of their maximum, but not all of them can, so everything is binned to the lowest common denominator so that as many chips as possible fit into the performance threshold they need to meet.
Posted on Reply
#54
MarsM4N
dj-electricI'm surprised this is news. This data was available on launch day; people didn't seem to care then.

For a 10% performance loss, you cut its PL from 450 W to 300 W.
Should have used the dual BIOS: 300 W stock & 450 W OC. :shadedshu: They need to make it more user-friendly for the "average Joe" to run their hardware more economically & ecologically.
Posted on Reply
#55
Outback Bronze
Has anybody mentioned that this would also help keep them from burning up?

Been undervolting/power-limiting for years now. Wouldn't run them any other way.

If I did purchase a 4080/90, this would be a no-brainer for me.
Posted on Reply
#56
N/A
The way things are going, GPUs are bound to reach 1 kW at some point, and undervolting could become the factory default.
Posted on Reply
#57
dj-electric
MarsM4NShould have used the dual BIOS: 300 W stock & 450 W OC. :shadedshu: They need to make it more user-friendly for the "average Joe" to run their hardware more economically & ecologically.
This is actually how I'm running my GPUs; one BIOS does include a green plan. I don't want to fire up a GPU OC tool every time I turn on my PC.
The RTX 4080 is sort of the same story, but with a slightly less dramatic drop. With it, you get 90% of its performance at around 270 W.
In terms of performance per watt, the RTX 4090 is unsurprisingly much better here. It's easier to run more processing hardware at lower power and get higher parallel compute performance; it's what we see on regular CPUs too.

For those who aren't fully updated - yes, you can edit values on Ada GPU BIOSes now
www.techpowerup.com/download/nvidia-nvflash/
Posted on Reply
#58
N/A
Edit with what? This is only a flash tool. How do I set -100 mV, for example? I'm not looking to drop more than 1% performance, which in this case means 17% lower power.
Posted on Reply
#59
R-T-B
dj-electricFor those who aren't fully updated - yes, you can edit values on Ada GPU BIOSes now
Edit, or just crossflash? Editing seems like quite a lofty claim.
Posted on Reply
#60
MxPhenom 216
ASIC Engineer
ChaitanyaDer8auer already showed how the 4090 was pushed beyond optimum efficiency on launch day itself.

My cousin works as a GPU architect at NVIDIA and will be talking about this stupid decision when we meet in a few weeks' time.
It really is not that stupid, though I'd be curious to know what your cousin says. Running higher voltage out of the box is basically NVIDIA's engineers playing it safe. Not all GPUs can run at some of these lower voltages out of the box and be 100% stable, so NVIDIA has to account for the worst GPU of the lot. If NVIDIA shipped them that aggressively tuned, imagine the clusterfuck that would ensue when people got $1k+ GPUs that weren't stable. Giving some voltage headroom covers all cards and makes sure they are stable, especially for less savvy people who don't want to, or don't know how to, tweak their hardware. Undervolting/power-limiting is then at the customer's discretion, as it should be. Also, the voltage scaling comes from NVIDIA's boost algorithm, so they are purposely driving voltages in excess for more than just maximum performance.

The topic of this article should be no surprise to literally anyone, based on Turing and Ampere GPUs. Both of those generations could undervolt like crazy with damn near no performance loss. Why would it be any different for RTX 4000?
Posted on Reply
#61
crubino
ChaitanyaDer8auer already showed how the 4090 was pushed beyond optimum efficiency on launch day itself.

My cousin works as a GPU architect at NVIDIA and will be talking about this stupid decision when we meet in a few weeks' time.
Stupid what?

Are you sure your cousin works at NVIDIA as a GPU architect?
Before they decided on the ideal power for the 4090, they had already tested it with more than a thousand attempts under non-ideal conditions to make sure this GPU runs as stably as it can with all its features active (full ray tracing, the heaviest rendering workloads, etc.).
And if your cousin truly works in NVIDIA's GPU R&D department, then he would know that this "450 watt" figure was already the "win-win" wattage (rather than a 600 W peak) to make everybody happy ;)

Don't always trust kids or media pushing controversial news to get more attention. Under what conditions did they test it? Cyberpunk with RTX on already hits more than 450 W, so NVIDIA has to make sure the card runs stably under that workload, and many rendering workloads actually need even more wattage than that.

You can tell this to your "virtual cousin" ;)
so he can tell the morning meeting that "every team member is a moron and stupid." LMAO!
Posted on Reply
#62
TheMadRusski
I feel like 600 W is perfect for the 4090. I grabbed a 450 W version and it was hitting the PL and crashing/dropping performance. The ASUS TUF comes up to 490-510 W during heavy RT titles like Control or Cyberpunk (Hitman 3 especially), and I feel it will only climb further down the line as more AAA titles take advantage of UE5 Nanite and Lumen tech on top of RT. I'm using a 5800X and not seeing bottlenecking, but I play at 4K 120+ on an LG C1.
Posted on Reply
#63
Godrilla
You could look at this the other way: at 25% over the power target (600 watts) you can get an 8% performance boost. And here I am, running my 4090 Suprim Liquid off a below-recommended PSU, my 750 W SFX Platinum, since launch. :toast::cool:
Posted on Reply
#64
ItsAdam
Most of the time my total system power consumption is 250-350 W. It's only ever been up to 500 W running benchmarks.

No game has ever gone over 400 W, and that is the entire system load.

RTX 4090 power consumption has been blown out of proportion by team red and random weirdos. This card uses less power than my 3090!

I will say DLSS is still a hot mess.
Posted on Reply