Tuesday, August 1st 2023
AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled
AMD Radeon RX 6000 and RX 7000 series cards, based on the RDNA 2 and RDNA 3 GPU architectures, have been benchmarked by the folks over at ComputerBase. However, these weren't the usual performance benchmarks, but measurements of power consumption. According to their latest results, enabling Variable Refresh Rate (VRR) can lower the power consumption of AMD Radeon cards at idle. Using a 4K display with a 144 Hz refresh rate, ComputerBase benchmarked the Radeon RX 6800/6700 XT and RX 7900 XT, covering both last-generation and current-generation graphics cards. The comparison also includes the Intel Arc A770, NVIDIA GeForce RTX 3060 Ti, RTX 3080, and RTX 4080.
As for the figures themselves, the tests compare desktop idle consumption, dual-monitor power consumption, window movement, YouTube with SDR at 60 FPS, and YouTube with HDR at 60 FPS, all done on a 4K 144 Hz monitor setup. You can see the comparison below, with the most significant reduction in power consumption coming from the Radeon RX 7900 XTX, which drew 81% less power in a single-monitor setup and 71% less in a dual-monitor setup.
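For readers who want to check the effect on their own hardware, here is a minimal sketch, assuming a Linux system running the amdgpu driver that exposes board power through a hwmon "power1_average" sensor; the sysfs path, sample count, and interval are illustrative assumptions, not part of ComputerBase's methodology. Running it once with VRR disabled and once with VRR enabled should show the same kind of gap, even if the absolute numbers depend on the monitor, refresh rate, and driver version.

#!/usr/bin/env python3
# Minimal idle-power sampler for an AMD GPU on Linux (amdgpu driver).
# Assumption: board power is reported in microwatts via a hwmon
# "power1_average" file; the card0 path is hypothetical and may need
# adjusting on multi-GPU systems.
import glob
import time

SENSORS = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")

def read_watts(path):
    # hwmon reports power in microwatts; convert to watts.
    with open(path) as f:
        return int(f.read()) / 1_000_000

def main(samples=30, interval=1.0):
    if not SENSORS:
        raise SystemExit("No amdgpu power sensor found at the assumed path.")
    readings = []
    for _ in range(samples):
        readings.append(read_watts(SENSORS[0]))
        time.sleep(interval)
    print(f"Average idle board power over {samples} samples: "
          f"{sum(readings) / samples:.1f} W")

if __name__ == "__main__":
    main()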
Source:
ComputerBase.de
94 Comments on AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled
With VRR off I go back to 50 watts idle, and with it on, 6-7 watts idle.
Do you understand how utterly ridiculous that sounds? Because every time you want to use the dGPU for, you know, GPU things, you now have to move the cable back?
Seems pretty obvious.
So it's not a fix, just a circumstance where the problem hides itself. Good find if you happen to own a VRR monitor, though. It doesn't have to bankrupt you; that was never the criterion. If you make $600/month ($7,200/year), $200 is still significant.
It's all so tiresome. The last few years there has been a LOT of this showing up on tech forums any time power use is brought up, like we're all supposed to run our GPUs at 1 MHz or something. Someone making $7200 per year is not going to have a $1000 GPU. The entire comparison is utterly ludicrous.
If you cannot afford the electricity for a high-end GPU, if a 20 W difference is a big deal, YOU CANNOT afford a big GPU. PERIOD. And people who cannot afford a top-tier GPU should stop complaining about their power draw; it is not a concern for them.
If you earn $600 a month, pay at least $400 on bills, and have the rest for food, medical expenses, and other stuff, are you seriously gonna save up to buy a $1,000 graphics card? Do you think it's wise?
You can buy a powerful car or graphics card, but I don't think the point is to store it in a garage or put it on a shelf, right?
I mean, it's not the end of the world, but if I'm paying for a super-complicated power saving engine (among other things), then it better do its job. Can we at least agree on that?
Also, what job needs a 144 Hz 4K monitor? I'm curious. Except that a high-end GPU is not a power-saving engine of any sort.
Don't knock people who don't earn much by saying they shouldn't, or wouldn't, buy a high-end config.
That's just plain wrong.
Also, top GPUs haven't always been over a grand; it's a recent thing, and we shouldn't be encouraging it.
Look at the price of flagship phones now!!
You can get a 2nd-hand car for that price...
I like small cars & don't want to waste half of my life on the road in traffic jams!
30 W wasted is appalling.
My laptop doesn't do that at desktop idle. My processor doesn't do that at idle.