Tuesday, August 1st 2023

AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled

AMD Radeon RX 6000 and RX 7000 series graphics cards, based on the RDNA 2 and RDNA 3 GPU architectures, have been benchmarked by the folks over at ComputerBase. These weren't regular performance benchmarks, however, but measurements of power consumption. According to the latest results, they discovered that enabling Variable Refresh Rate (VRR) can lower the power consumption of AMD Radeon cards at idle. Using a 4K display with a 144 Hz refresh rate, ComputerBase benchmarked the Radeon RX 6800/6700 XT and RX 7900 XT, covering both last-generation and current-generation graphics cards. The comparison also includes the Intel Arc A770 and the NVIDIA GeForce RTX 3060 Ti, RTX 3080, and RTX 4080.

Regarding the figures, the tests compare desktop idle consumption, dual-monitor power consumption, window movement, YouTube with SDR at 60 FPS, and YouTube with HDR at 60 FPS, all done on a 4K 144 Hz monitor setup. You can see the comparison below; the most significant reduction in power consumption is the Radeon RX 7900 XTX using 81% less power in a single-monitor and 71% less in a dual-monitor setup.
Source: ComputerBase.de

94 Comments on AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled

#1
TumbleGeorge
A few facts about VRR: Developed by AMD and Nvidia and first announced publicly at GDC 2014. It was first built into Windows 10 version 1703 released as an update for regular users in March 2018.
Posted on Reply
#2
Vya Domus
I never saw anything more than 30W idle on my 7900XT with or without VRR, one or dual monitor, I do not understand how people get these figures.
Posted on Reply
#3
TumbleGeorge
Vya DomusI never saw anything more than 30W idle on my 7900XT with or without VRR, one or dual monitor, I do not understand how people get these figures.
The article on ComputerBase (German language!) probably has enough data to answer your question.
Posted on Reply
#4
lemonadesoda
Reading the graphs: Radeon under VRR has much improved, but nV and ARC are still better.

Posted on Reply
#5
qlum
Sadly, idle power consumption on a four-monitor setup is still rather high; then again, it's not like NVIDIA does any better there, especially on Linux.
Posted on Reply
#6
Guwapo77
lemonadesodaReading the graphs: Radeon under VRR has much improved, but nV and ARC are still better.
At this level of wattage, it really doesn't even matter.

*I switched to Arc and saved .01 cents over 6 years
Posted on Reply
#7
TumbleGeorge
Guwapo77At this level of wattage, it really doesn't even matter.

*I switched to Arc and saved .01 cents over 6 years
It will matter if countries, for environmental reasons, impose regulatory requirements on the power consumption of computers and their components. And maybe there are already countries with such legislation?
Posted on Reply
#8
Guwapo77
TumbleGeorgeIt will matter if countries, for environmental reasons, impose regulatory requirements on the power consumption of computers and their components. And maybe there are already countries with such legislation?
Instead of leaving the computer at idle, maybe one could simply turn the computer off and turn off the power strip.
Posted on Reply
#9
bubbleawsome
TumbleGeorgeA few facts about VRR: Developed by AMD and Nvidia and first announced publicly at GDC 2014. It was first built into Windows 10 version 1703 released as an update for regular users in March 2018.
Do you just plug everything into an “AI” and then post what it spits out? Are you just trying to up your post count or something? This isn’t the first thread this has happened in.
Posted on Reply
#10
TumbleGeorge
Guwapo77Instead of leaving the computer at idle, maybe one could simply turn the computer off and turn off the power strip.
But the regulations, if passed or already passed, would mainly or only affect hardware manufacturers and retailers. They would hardly dictate to consumers how to use their purchase in this regard, though who knows. Let's imagine it this way: if manufacturers cannot bring consumption below a certain value (depending on the mode of use) for their latest generation of hardware, and for older generations still sold as new, those products will be banned. So, with different methods of reducing consumption, including the use of VRR, these products can meet the criteria that allow them to be offered for sale.
Posted on Reply
#11
ymdhis
TumbleGeorgeIt will matter if countries, for environmental reasons, impose regulatory requirements on the power consumption of computers and their components. And maybe there are already countries with such legislation?
There's at least one country in the EU where if you pass a certain threshold of power consumption, the cost of electricity will start to skyrocket - the higher your consumption, the higher the price per kWh increases. So for example a household at 4000 kWh per year would pay 4-5x the amount that two households at 2000 kWh combined would, not just the 2x you get from the kWh amount alone.

A 20-30W difference in videocard usage may sound like peanuts but when the computer is on 24/7, it adds up. The increase in idle power alone can add up to 10% of the monthly threshold.
Posted on Reply
#12
Vayra86
Vya DomusI never saw anything more than 30W idle on my 7900XT with or without VRR, one or dual monitor, I do not understand how people get these figures.
I see 39-41 W on a 3440x1440 @ 144 Hz desktop on the same card. It won't go lower either, no matter what, even undervolted and at -10% TDP.
With FreeSync Premium on, I haven't tested, but will definitely check that.
ymdhisA 20-30W difference in videocard usage may sound like peanuts but when the computer is on 24/7, it adds up. The increase in idle power alone can add up to 10% of the monthly threshold.
Sure, and even then you're still talking about money sub 1 euro / month.
1 kWh costs 60 cents, and that is my EU worst-case scenario in the middle of wartime, go figure.

Mountain < molehill
AusWolfA lower monitor resolution and/or refresh rate reduces idle power consumption. We've known this from the TPU reviews. So what's new?
Exactly nothing
Guwapo77Instead of leaving the computer at idle, maybe one could simply turn the computer off and turn off the power strip.
Exactly this; PC boot time is what, 30 seconds from power on to desktop, unless you run a potato.

But even despite all these things, sure, this could be handled better by the card; I don't disagree with that sentiment. It's just way overblown, all things considered.
Posted on Reply
#13
AusWolf
A lower monitor resolution and/or refresh rate reduces idle power consumption. We've known this from the TPU reviews. So what's new?
Posted on Reply
#14
Vya Domus
ymdhisA 20-30W difference in videocard usage may sound like peanuts but when the computer is on 24/7, it adds up.
No, it's still peanuts, be real.
ymdhisThere's at least one country in the EU where if you pass a certain threshold of power consumption, the cost of electricity will start to skyrocket
Non-argument; the same can be argued about literally anything. You used the toaster one too many times this month: "oh no, it passed the threshold".
Posted on Reply
#15
ZoneDymo
ymdhisThere's at least one country in the EU where if you pass a certain threshold of power consumption, the cost of electricity will start to skyrocket - the higher your consumption, the higher the price per kWh increases. So for example a household at 4000 kWh per year would pay 4-5x the amount that two households at 2000 kWh combined would, not just the 2x you get from the kWh amount alone.

A 20-30W difference in videocard usage may sound like peanuts but when the computer is on 24/7, it adds up. The increase in idle power alone can add up to 10% of the monthly threshold.
which country is that?
And as was said, it's such a silly scenario: you live in a country with strict rules on power consumption from the grid (which, btw, I'm all in favor of; we need to reduce our footprint), but then you still just leave your PC on 24/7 and then buy a GPU revolving around that?

We all have SSDs these days and fast boot, and hell, you also have sleep or hibernation mode, where the PC wakes up faster than it takes my monitor to turn on... so all of this makes no sense.
Posted on Reply
#16
AusWolf
Vayra86I see 39-41 W on a 3440x1440 @ 144 Hz desktop on the same card. It won't go lower either, no matter what, even undervolted and at -10% TDP.
With FreeSync Premium on, I haven't tested, but will definitely check that.

Sure, and even then you're still talking about money sub 1 euro / month.
1 kWh costs 60 cents, and that is my EU worst-case scenario in the middle of wartime, go figure.

Mountain < molehill

Exactly nothing

Exactly this; PC boot time is what, 30 seconds from power on to desktop, unless you run a potato.

But even despite all these things, sure, this could be handled better by the card; I don't disagree with that sentiment. It's just way overblown, all things considered.
If you really pay 60 cents per kWh, then a 20 W difference on a 24/7 running PC will cost you €0.6 * 0.02 kW * 24 h * 31 days = €8.93 more per month.

It's certainly not nothing, but if paying 10 euros per month bankrupts you, you're probably not gonna buy a 144 Hz 4K monitor, either.
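(For anyone who wants to check the arithmetic, here's a minimal sketch using the same hypothetical inputs as above: 20 W of extra draw, €0.60/kWh, 24 hours a day for a 31-day month.)

```python
# Sanity check of the monthly-cost figure quoted above.
extra_kw = 0.02          # 20 W of extra idle draw, expressed in kW
price_per_kwh = 0.60     # €/kWh, the worst-case price quoted earlier
hours = 24 * 31          # hours in a 31-day month

extra_cost = extra_kw * hours * price_per_kwh
print(f"€{extra_cost:.2f} per month")  # → €8.93 per month
```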
Posted on Reply
#17
Vayra86
AusWolfIf you really pay 60 cents per kWh, then a 20 W difference on a 24/7 running PC will cost you €0.6 * 0.02 kW * 24 h * 31 days = €8.93 more per month.

It's certainly not nothing, but if paying 10 euros per month bankrupts you, you're probably not gonna buy a 144 Hz 4K monitor, either.
24/7 running idle... do you ever? I don't... that's a major waste of energy to begin with, and kinda defeats any point of wanting to save pennies.
Posted on Reply
#18
TumbleGeorge
Inflation Reduction Act.
Please check whether its text contains anything that may require a reduction in energy consumption by citizens, and whether any part of it affects computers.
Posted on Reply
#19
kapone32
My GPU already goes below that since the latest driver, and I have a FreeSync 4K 144 Hz monitor.
Posted on Reply
#20
Wirko
Guwapo77Instead of leaving the computer at idle, maybe one could simply turn the computer off and turn off the power strip.
The advanced version of that is to let the monitors go to sleep.

The advanced advanced version will be when Windows acquires the ability to put individual monitors to sleep. You watch video on one monitor, or do other things that don't change the contents on other monitors, and they go to sleep (but they should support very fast wakeup).
Posted on Reply
#21
zlobby
LOL! It actually increases for nvidia cards.
TumbleGeorge(german language!)
I know, right? :D
Posted on Reply
#22
ymdhis
ZoneDymowhich country is that?
And as was said, it's such a silly scenario: you live in a country with strict rules on power consumption from the grid (which, btw, I'm all in favor of; we need to reduce our footprint), but then you still just leave your PC on 24/7 and then buy a GPU revolving around that?
Hungary. The prices were changed so that up to 210 kWh you pay the old, government-subsidized prices; above that, power costs 5x as much for the part beyond the threshold. If your power usage cost 50€ before, it now costs ~215€. Keep in mind that the average income around here is around ~600€ (in the suburban areas it can be much lower, and pensions can be drastically lower), and on top of that, food prices climbed up to twice the value for most essential items in just two years. My dad's pension was less than 200€, and he had to use electric heating during winter months. The example I mentioned was his power bill and how much he would need to pay today. I had to change a few things around the house to push my power consumption way down, otherwise I would've had to pay twice as much as I do today.

Many local companies just cannot handle the increase in prices (gas prices also went tits up) and either fold, or increase the price of their products/services which further drives the inflation, making the average income have that much lower buying power.
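(The tiered tariff described above works out like this; a minimal sketch, where the 210 kWh threshold and 5x multiplier come from the comment, but the base rate is a made-up illustrative figure, not Hungary's actual subsidized price.)

```python
def monthly_bill(kwh, base_rate=0.05, threshold=210.0, multiplier=5.0):
    """Tiered tariff: usage up to the threshold is billed at the base
    rate; everything beyond it at multiplier times the base rate.
    base_rate (EUR/kWh) is a made-up figure for illustration only."""
    below = min(kwh, threshold)
    above = max(kwh - threshold, 0.0)
    return below * base_rate + above * base_rate * multiplier

# A household well past the threshold pays disproportionately more:
print(monthly_bill(200))   # → 10.0   (all usage under the threshold)
print(monthly_bill(1000))  # → 208.0  (210*0.05 + 790*0.25)
```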
WirkoThe advanced version of that is to let the monitors go to sleep.

The advanced advanced version will be when Windows acquires the ability to put individual monitors to sleep. You watch video on one monitor, or do other things that don't change the contents on other monitors, and they go to sleep (but they should support very fast wakeup).
The advanced version is to reach out and turn the monitor off when you stand up from your desk.
Posted on Reply
#23
bug
They do better at idle, but even doing simple things like moving windows or watching something on YouTube brings back the high power draw. This looks more like someone hacked something in specifically for idle than like a proper fix.
Posted on Reply
#24
AnotherReader
lemonadesodaReading the graphs: Radeon under VRR has much improved, but nV and ARC are still better.
Intel isn't better in any scenario. Remember that the A770 should be compared to the Radeons with a 256-bit memory bus, namely the RX 6800, 6800 XT, and 6950 XT. With the exception of the 6950 XT, the rest do better than Intel in all scenarios. At idle, the Radeons are even better than NVIDIA, and Intel is simply ridiculous: 46 W idling on a 60 Hz monitor.
Posted on Reply
#25
Jokii
ymdhisThe advanced version is to reach out and turn the monitor off when you stand up from your desk.
You can do Winkey+P if you have multiple monitors.
Or Winkey+L to lock the session, which turns off the monitors after a few seconds.
Posted on Reply