Tuesday, August 1st 2023
AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled
AMD Radeon RX 6000 and RX 7000 series graphics cards, based on the RDNA 2 and RDNA 3 GPU architectures, have been benchmarked by the folks over at ComputerBase. These weren't regular performance benchmarks, however, but power consumption tests. According to the latest results, enabling Variable Refresh Rate (VRR) can lower the power consumption of AMD Radeon cards at idle. Using a 4K display with a 144 Hz refresh rate, ComputerBase benchmarked the Radeon RX 6800/6700 XT and RX 7900 XTX, covering both last-generation and current-generation graphics cards. The comparison matrix also includes the Intel Arc A770, NVIDIA GeForce RTX 3060 Ti, RTX 3080, and RTX 4080.
Regarding the figures, the tests compare desktop idle power consumption, dual-monitor power consumption, window movement, YouTube with SDR at 60 FPS, and YouTube with HDR at 60 FPS, all on a 4K 144 Hz monitor setup. You can see the comparison below; the most significant reduction in power consumption comes from the Radeon RX 7900 XTX, which uses 81% less power in a single-monitor and 71% less in a dual-monitor setup.
Source:
ComputerBase.de
94 Comments on AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled
*I switched to Arc and saved .01 cents over 6 years
A 20-30 W difference in video card power draw may sound like peanuts, but when the computer is on 24/7, it adds up. The increased idle power alone can account for up to 10% of a monthly electricity bill.
With FreeSync Premium on? Haven't tested, but will def check that.
Sure, and even then you're still talking about money, sub 1 euro / month.
1 kWh costs 60 cents, and that's my EU worst-case scenario in the middle of wartime, go figure.
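The back-of-the-envelope math behind these comments can be sketched quickly; assuming the 20-30 W delta and 60 cent/kWh price quoted above (the 25 W midpoint is my own illustrative pick):

```python
# Rough monthly cost of a constant idle-power difference, 24/7 uptime.
watts_delta = 25                # midpoint of the 20-30 W gap mentioned above
hours_per_month = 24 * 30
kwh_per_month = watts_delta * hours_per_month / 1000   # Wh -> kWh
price_per_kwh = 0.60            # EUR, worst-case EU price quoted above

cost = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.1f} kWh -> {cost:.2f} EUR/month")
# -> 18.0 kWh -> 10.80 EUR/month
```

At more typical electricity prices (say 0.25-0.30 EUR/kWh), the same delta lands in the 4-5 euro/month range, which is roughly where both sides of the argument above are coming from.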
Mountain < molehill.
Exactly nothing.
Exactly this, PC boot time is what, 30 seconds from power-on to desktop unless you run a potato.
But even despite all these things, sure, this could be handled better by the card, I don't disagree with that sentiment. It's just way overblown, all things considered.
And as was said, it's such a silly scenario: you live in a country with strict rules on power consumption from the grid (which, btw, I'm all in favor of, we need to reduce our footprint), but then you still just leave your PC on 24/7 and then buy a GPU revolving around that?
We all have SSDs these days and fast boot, and hell, you also have sleep or hibernation mode, where the PC wakes up faster than it takes my monitor to turn on... so all of this makes no sense.
It's certainly not nothing, but if paying 10 euros per month bankrupts you, you're probably not gonna buy a 144 Hz 4K monitor, either.
Please check whether there's any legislation requiring citizens to reduce their energy consumption, and whether any part of it applies to computers.
The advanced-advanced version will be when Windows acquires the ability to put individual monitors to sleep. You watch video on one monitor, or do other things that don't change the contents of the other monitors, and they go to sleep (but they should support very fast wake-up).
Many local companies just cannot handle the increase in prices (gas prices also went tits up) and either fold or increase the price of their products/services, which further drives inflation, making the average income have that much lower buying power.
The advanced version is to reach out and turn the monitor off when you stand up from your desk.
Or Winkey+L to lock the session, which turns off the monitors after a few seconds.
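For what it's worth, Windows already exposes a programmatic "monitors off" via the documented `WM_SYSCOMMAND`/`SC_MONITORPOWER` broadcast, though it affects all displays at once, not individual ones, which is exactly the gap noted above. A minimal Windows-only sketch via `ctypes`:

```python
import ctypes
import sys

# Windows message constants, as documented in WinUser.h
HWND_BROADCAST = 0xFFFF
WM_SYSCOMMAND = 0x0112
SC_MONITORPOWER = 0xF170
MONITOR_OFF = 2          # lParam: 2 = off, 1 = low power, -1 = on


def monitors_off():
    """Broadcast a power-off request to all attached displays.

    Note: there is no per-monitor variant of this message; it blanks
    every display, much like the automatic display-sleep timeout does.
    """
    ctypes.windll.user32.SendMessageW(
        HWND_BROADCAST, WM_SYSCOMMAND, SC_MONITORPOWER, MONITOR_OFF
    )


if __name__ == "__main__" and sys.platform == "win32":
    monitors_off()
```

Locking with Win+L plus a short display-sleep timeout achieves the same end without any code, which is probably why that's the common habit.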