Tuesday, August 1st 2023

AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled

AMD Radeon RX 6000 and RX 7000 series graphics cards, based on the RDNA 2 and RDNA 3 GPU architectures, have been benchmarked by the folks over at ComputerBase. However, these weren't regular performance benchmarks, but rather power consumption measurements. According to their latest results, enabling Variable Refresh Rate (VRR) can lower the power consumption of AMD Radeon cards at idle. Using a 4K display with a 144 Hz refresh rate, ComputerBase benchmarked the Radeon RX 6800/6700 XT and RX 7900 XT, covering both last-generation and current-generation graphics cards. The test matrix also includes a comparison to the Intel Arc A770, NVIDIA GeForce RTX 3060 Ti, RTX 3080, and RTX 4080.

Regarding the figures, the tests compare desktop idle consumption, dual-monitor power consumption, window movement, YouTube with SDR at 60 FPS, and YouTube with HDR at 60 FPS, all done on a 4K 144 Hz monitor setup. You can see the comparison below, with the most significant reduction in power consumption being the Radeon RX 7900 XTX using 81% less power in a single-monitor and 71% less power in a dual-monitor setup.
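To put such percentages into perspective, here is a minimal back-of-the-envelope sketch; the baseline wattage, idle hours, and electricity price below are illustrative assumptions, not ComputerBase's measured figures:

```python
# Back-of-the-envelope: what an ~81% idle-power reduction could add up to over a year.
# NOTE: baseline_w, hours_per_day and price_per_kwh are illustrative assumptions,
# not ComputerBase's measured values.

def annual_savings(baseline_w: float, reduction: float,
                   hours_per_day: float, price_per_kwh: float) -> tuple[float, float]:
    """Return (kWh saved per year, money saved per year)."""
    watts_saved = baseline_w * reduction
    kwh_saved = watts_saved * hours_per_day * 365 / 1000
    return kwh_saved, kwh_saved * price_per_kwh

# Example: ~100 W multi-monitor idle, 81% reduction, 8 idle hours/day, $0.30/kWh.
kwh, cost = annual_savings(baseline_w=100, reduction=0.81,
                           hours_per_day=8, price_per_kwh=0.30)
print(f"~{kwh:.0f} kWh and ~${cost:.0f} saved per year")  # ~237 kWh, ~$71
```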
Source: ComputerBase.de

94 Comments on AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled

#51
TheoneandonlyMrK
TumbleGeorgeInflation Reduction Act.
Please check whether it contains any text that may require citizens to reduce energy consumption, and whether any part of it affects computers.
Maybe make a thread for your off-topic stuff?!
lemonadesodaThere are people posting here with incredibly poor maths and judgement who cannot multiply out 30W extra usage 24/7 or thereabouts. When the consequential cost is significant their defence is "oh, power off". No. We are not talking about the efficient use of the PC, we are talking about the efficiency of a component.

30W wasted is an appalling waste.

My laptop doesn't do that at desktop idle. My processor doesn't do that at idle.
Do you think that, or can you prove it?

30 watts idle while on; most modern monitors have power-saving features, like the PC attached, so after 5-10 idle minutes it's off.

You're being hyperbolic; NVIDIA and Intel are not that much better.
Posted on Reply
#52
Evildead666
AusWolfConsidering how fast high-end systems lose their value, and how much those 100 bucks are worth to people on a low budget, doing that is absolutely stupid.


I'm not encouraging it. All I'm saying is that the average gamer (especially one on a low budget) doesn't need a flagship GPU.


Do you do that as a full-time position for $7,200 a year with no other source of income?
It's not up to you to decide who does and doesn't need a flagship GPU.
Some people buy nice things, and keep them for a few years, to make the most of it.

Would you buy a new car every year?

In the grander scheme of things, reducing idle power for potentially millions of users benefits all of society.
Posted on Reply
#53
R0H1T
30 W isn't a number you can easily scoff at; having said that, you can save more, or a lot more, by setting your display to sleep quickly when idle. At least for those with big/high-refresh screens; outside of the dGPU, I'd argue the display is now the most power-hungry component of a build. Talking about average consumption for a normal home PC, of course, just like your mobile/laptop these days.
Posted on Reply
#54
Dave65
Guwapo77Instead of leaving the computer at idle, maybe one could simply turn the computer off and turn off the power strip.
Everyone says never to shut the PC down, which I think is bullshit; I shut down and turn the power strip off when not in use.
Posted on Reply
#55
AusWolf
Evildead666It's not up to you to decide who does and doesn't need a flagship GPU.
Some people buy nice things, and keep them for a few years, to make the most of it.
Well, if I earned $7,200 a year, I wouldn't be looking at a $1,000 GPU or a 4K monitor. Actually, I earn way more than that, and still don't think a 4K display, or a computer to power it, is in my range.

If you're on a super low budget and looking for a super-high-end PC for the money you saved with years of hard work, by all means, knock yourself out, but let me have my opinion on it.

Edit: All I'm saying is that a high-end PC depreciates in value so fast that a little increase in your power bill is the least of your concerns.
Evildead666Would you buy a new car every year?
No, because it's pointless. Not to mention it's a bad example: a car is still a car even 10 years after you've bought it, but your high-end PC won't play the latest games a few years later.
Posted on Reply
#56
ZoneDymo
Dave65Everyone says never to shut the PC down, which I think is bullshit; I shut down and turn the power strip off when not in use.
Could also always at least use sleep mode: borderline no power consumption, and up and running again before your monitor is.
Posted on Reply
#57
Vayra86
Evildead666I know people who put 100 bucks aside every month, and then buy a kick-ass config every 5 years.
Don't knock people who don't earn much by saying they shouldn't, or wouldn't, buy a high-end config.
That's just plain wrong.
Also, top GPUs haven't always been over a grand; it's a recent thing, and we shouldn't be encouraging it.
Look at the price of flagship phones now!!
You can get a second-hand car for that price...
Yep... you don't need to be rich to be gaming on high-end PCs - or at least, highly effective gaming configs... honestly. I've been doing that math for years: if you play it smart and don't upgrade for every fart, this is a rather cheap hobby. Games included. All it takes is patience. Patience to wait for games to reach the budget bin, patience to jump on a great deal for a part. Patience on the PC pays out big time: not only is it cheap, but your stuff comes bug-fixed and feature-complete too.
Posted on Reply
#58
lemonadesoda
I do encourage everyone to visit the original article at www.computerbase.de/2023-07/grafikkarten-leistungsaufnahme-idle-2023/

If the German is too difficult, use your browser’s inbuilt translator.

Go look at the original graphs. There is a lot of information there.
Go read the comments. They are better informed than many here, because they are using the source article as the basis of their comments.

And just to spell it out, VRR is required on the driver, and required on the monitor too! Don’t forget that. Otherwise the power consumption results don’t change.
Posted on Reply
#59
TheoneandonlyMrK
I just make sure it never idles, present break from F@H due to holiday notwithstanding :)
lemonadesodaI do encourage everyone to visit the original article at www.computerbase.de/2023-07/grafikkarten-leistungsaufnahme-idle-2023/

If the German is too difficult, use your browser’s inbuilt translator.

Go look at the original graphs. There is a lot of information there.
Go read the comments. They are better informed than many here, because they are using the source article as the basis of their comments.

And just to spell it out, VRR is required on the driver, and required on the monitor too! Don’t forget that. Otherwise the power consumption results don’t change.
Some of us have experience to lean on with the 79## XTs, and also read that at the source, really, and yesterday too (from somewhere?!?).

My idle, and most other readings, are significantly lower than that too.

And as I said, the person sat (or not) in front of it can configure it to boil oceans, or NOT.
Posted on Reply
#60
Makaveli
Dave65Everyone says never to shut the PC down, which I think is bullshit; I shut down and turn the power strip off when not in use.
People used to say this a lot when PCs were using hard drives for storage, prior to the SATA SSD era.

The constant shutting down and powering up does put stress on hard drives, and it's why they recommended you leave the system running. However, these days my current machine has no spinning drives; they live in a NAS. So my machine is either on or in sleep mode until I need it.
Posted on Reply
#61
ZoneDymo
Vayra86Yep... you don't need to be rich to be gaming on high-end PCs - or at least, highly effective gaming configs... honestly. I've been doing that math for years: if you play it smart and don't upgrade for every fart, this is a rather cheap hobby. Games included. All it takes is patience. Patience to wait for games to reach the budget bin, patience to jump on a great deal for a part. Patience on the PC pays out big time: not only is it cheap, but your stuff comes bug-fixed and feature-complete too.
Dude, compared to like sports or cars or horses... gaming is DIRT cheap.
Posted on Reply
#62
Makaveli
ZoneDymodude compared to like sports or cars or horses...gaming is DIRT cheap
Yup, even as expensive as a 4090 is. Try buying car parts!!

Computing is a relatively cheap hobby when you start looking at the rest of them.
Posted on Reply
#63
Wirko
bugThis would indicate the GPU cannot decouple itself from the refresh rate of the monitor. If it can force it below 60Hz, it will lower the power draw. If it can't, it will just suck juice.
So it's not a fix, just a circumstance where the problem hides itself. Good find if you happen to own a VRR monitor though.
The GPU can't "decouple itself" from the refresh rate, it has work to do with every pixel, every time it's sent over the video cable. It can just put itself in a lower-power (but still active) state. If these states aren't properly and intelligently managed, then violà, kilowatts.

With that said, the Radeons and the 4080 have large caches and should fit entire frame buffers in them (47.5 MiB is required for two 4k monitors at 8-bit colour depth). The RAM, memory bus, and memory controller could remain in a slow clock, low voltage mode when working in 2D. I'm not sure which cards can do that but the high consumption indicates that they don't do that.
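As a side note, here is a quick sketch of that 47.5 MiB estimate, assuming 3 bytes per pixel (8 bits each for R, G and B, no alpha channel):

```python
# The 47.5 MiB estimate above: two 4K monitors at 8-bit colour depth.
# Assumes 3 bytes per pixel (8 bits each for R, G, B; no alpha channel).

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int, monitors: int) -> float:
    """Frame buffer size in MiB for the given display setup."""
    return width * height * bytes_per_pixel * monitors / (1024 ** 2)

print(f"{framebuffer_mib(3840, 2160, 3, 2):.1f} MiB")  # ~47.5 MiB, matching the figure above
```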
Posted on Reply
#64
lemonadesoda
I think what @bug is referring to is the decoupling of the
GRAPHICS CARD=
Video Memory + GPU + Framebuffer memory + Video Controller I/O + Connectors ---> Display

The decoupling of the Video Controller I/O from the GPU, such that the GPU can idle, whereas the Video Controller I/O is synced to the Display.

Bug's statement is obvious from old-school graphics, where the CPU did the graphics heavy lifting to a shared framebuffer that the Video Controller I/O accessed independently via DMA and offered different display output formats at different framesyncs.

In a modern graphics card these steps are not on discrete silicon or separate chips, so the "GPU" is doing much more than in yesteryear. That's why we require "features" like G-Sync and VRR as driver-driven functionality with display compatibility, whereas in the past the output was more standardised (call it dumb, maybe), and the Video Controller I/O handled that standardised output DECOUPLED from the performance of the CPU/GPU.

Well at least that's what I think bug meant!
Posted on Reply
#65
AusWolf
lemonadesodaI think what @bug is referring to is the decoupling of the
GRAPHICS CARD=
Video Memory + GPU + Framebuffer memory + Video Controller I/O + Connectors ---> Display

The decoupling of the Video Controller I/O from the GPU, such that the GPU can idle, whereas the Video Controller I/O is synced to the Display.

Bug's statement is obvious from old-school graphics, where the CPU did the graphics heavy lifting to a shared framebuffer that the Video Controller I/O accessed independently via DMA and offered different display output formats at different framesyncs.

In a modern graphics card these steps are not on discrete silicon or separate chips, so the "GPU" is doing much more than in yesteryear. That's why we require "features" like G-Sync and VRR as driver-driven functionality with display compatibility, whereas in the past the output was more standardised (call it dumb, maybe), and the Video Controller I/O handled that standardised output DECOUPLED from the performance of the CPU/GPU.

Well at least that's what I think bug meant!
I think so too (@bug correct us if we're wrong).

Not to mention, even the Windows desktop and the web browser use your GPU these days. "Idle" isn't exactly the same "idle" it used to be during the Windows XP times. It's more like a low power state that is ideally just enough to provide 2D acceleration when it works right - and a bit more when it doesn't.
Posted on Reply
#66
Wirko
lemonadesodaI think what @bug is referring to is the decoupling of the
GRAPHICS CARD=
Video Memory + GPU + Framebuffer memory + Video Controller I/O + Connectors ---> Display

The decoupling of the Video Controller I/O from the GPU, such that the GPU can idle, whereas the Video Controller I/O is synced to the Display.

Bug's statement is obvious from old-school graphics, where the CPU did the graphics heavy lifting to a shared framebuffer that the Video Controller I/O accessed independently via DMA and offered different display output formats at different framesyncs.

In a modern graphics card these steps are not on discrete silicon or separate chips, so the "GPU" is doing much more than in yesteryear. That's why we require "features" like G-Sync and VRR as driver-driven functionality with display compatibility, whereas in the past the output was more standardised (call it dumb, maybe), and the Video Controller I/O handled that standardised output DECOUPLED from the performance of the CPU/GPU.

Well at least that's what I think bug meant!
Well, AMD thinks they've optimised everything and more.

HotHardware:
AMD says these architectural improvements are complemented by refinements to the adaptive power management system used in RDNA 2 and a new generation of Infinity Cache. The adaptive power management features tune the GPU's power usage to match the current workload. This helps GPU components avoid drawing power unnecessarily. AMD Infinity Cache is situated between L3 cache and GDDR6 memory which reduces dependence on the latter. This improves bandwidth and further decreases power consumption.
Maybe they haven't optimised everything yet. The single-monitor (1440p60) idle indeed turns off the GPU. The VRAM is running at 13 MHz, which yields sufficient bandwidth. But in multi-monitor idle mode, the problem becomes obvious: the memory clock cannot adapt; it jumps from 13 MHz to 2487 MHz and stays there at all times.

From the TPU 7900 XTX review:


The 7900 XT has exactly the same memory clocks, but its consumption in multi-monitor idle and video playback is 1/6 lower than the 7900 XTX's. It also has 1/6 fewer memory chips and memory controllers. Funny, isn't it? This means that almost all of the 103-105 watts of idle power is funneled into the memory, memory bus, and memory controllers, which could run at 30 MHz or 100 MHz or something - if they were able to scale down. It may well be a hardware limitation, unfixable in software!

Then VRR reduces the required data rate, and the memory clock can fall back to 13 MHz.
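A rough sketch of that bandwidth argument; the 384-bit bus width and the 8x effective-data-rate multiplier below are assumptions based on the 7900 XTX's published memory specs (20 Gbps GDDR6 at roughly 2500 MHz memory clock), not figures from the article:

```python
# Rough comparison: display scanout data rate vs. VRAM bandwidth at a given memory clock.
# ASSUMPTIONS (not from the article): 384-bit memory bus, and an effective GDDR6 data rate
# of 8x the memory clock (the 7900 XTX pairs a ~2500 MHz memory clock with 20 Gbps GDDR6).

def vram_bandwidth_gbs(mem_clock_mhz: float, bus_bits: int = 384, rate_mult: int = 8) -> float:
    """Approximate VRAM bandwidth in GB/s."""
    return mem_clock_mhz * 1e6 * rate_mult * (bus_bits / 8) / 1e9

def scanout_gbs(width: int, height: int, hz: int, bytes_per_pixel: int = 4) -> float:
    """Raw pixel data rate needed to refresh one display, in GB/s."""
    return width * height * bytes_per_pixel * hz / 1e9

print(f"VRAM at 13 MHz:    {vram_bandwidth_gbs(13):.1f} GB/s")            # ~5 GB/s
print(f"VRAM at 2487 MHz:  {vram_bandwidth_gbs(2487):.0f} GB/s")          # ~955 GB/s
print(f"1440p60 scanout:   {scanout_gbs(2560, 1440, 60):.2f} GB/s")       # ~0.88 GB/s
print(f"2x 4K144 scanout:  {2 * scanout_gbs(3840, 2160, 144):.2f} GB/s")  # ~9.56 GB/s
```

Under these assumptions, 13 MHz is plausibly enough for a single low-refresh desktop, while high-refresh multi-monitor scanout is what forces the memory clock up - until VRR lowers the effective refresh rate.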
AusWolfNot to mention, even the Windows desktop and the web browser use your GPU these days. "Idle" isn't exactly the same "idle" it used to be during the Windows XP times. It's more like a low power state that is ideally just enough to provide 2D acceleration when it works right - and a bit more when it doesn't.
Sure, there's GPU rendering of web pages, but it only happens once if the displayed content doesn't change or scroll.
Posted on Reply
#67
mechtech
Well, common sense. Just look at TPU reviews with 60 Hz V-sync vs. without. More FPS = more watts.

It is nice to see this testing and these results, though.
Posted on Reply
#68
ymdhis
AusWolfSure, but if $200 a year is a significant amount, then you won't buy a $1,000 GPU with a $600 monitor, either, will you?
Why not? If you can save $100 a month you can buy a new card in a year. Of course you can only save so much if you minimize your expenditures, and part of that is making sure the power bill is only as high as it has to be.
AusWolfNo because it's pointless. Not to mention it's a bad example, because a car is a car even 10 years after you've bought it, but your high-end PC won't play the latest games a few years later.
Not all games are AAA titles that need a new card, and you don't need everything to run at 4k on ultra so you can play it.
TheinsanegamerNSo, if you live in Hungary and make $600 a month, you need VRR to allow your $1000 GPU to idle 24/7, or the extra $200 in electricity will bankrupt you?

do you understand how utterly ridiculous that sounds?
You are the one being ridiculous here. The problem I was trying to explain is that the increase in the power bill is not linear. If you use more electricity beyond a certain limit, it starts to skyrocket, so for example, twice the power usage doesn't make you jump from $25 to $50, it makes you jump from $25 to $200. So if you economize the best you can, which includes things like making sure your computer uses half the power when idle, you can have more money saved up each month... that can go towards new computer parts.
And no, I'm not saying that lower idle power in your computer is all that saves you money, I'm just saying that it can be a not-so-insignificant part of it. If a GPU uses 40 W at idle and runs 24/7, that's nearly 29 kWh per month, which does put a dent in a 210 kWh monthly cap.
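For reference, the arithmetic behind that figure, using the 210 kWh cap mentioned above:

```python
# The idle-draw arithmetic from the paragraph above: 40 W running 24/7 over a 30-day month.
idle_watts = 40
hours_per_month = 24 * 30                      # 720 h
kwh_per_month = idle_watts * hours_per_month / 1000
monthly_cap_kwh = 210                          # the cheap-tier cap mentioned above

print(f"{kwh_per_month:.1f} kWh/month")                              # 28.8 kWh
print(f"{kwh_per_month / monthly_cap_kwh:.0%} of the 210 kWh cap")   # ~14%
```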
Posted on Reply
#69
Jism
Idle at 6 W here with stock stuff. 6700 XT, and not with this fix.
Posted on Reply
#70
Cheeseball
Not a Potato
Good improvements for the 7900 XT/7900 XTX all around, but AMD still needs to work on optimizing how VCN 4.0 works. They should NOT be using more than 30 W when decoding video compared to NVDEC and the 6900/6950 XT's VCN 3.0.
Posted on Reply
#71
Vayra86
ymdhisYou are the one being ridiculous here. The problem I was trying to explain is that the increase in power bill is not linear. If you use more electricity beyond a certain limit it starts to skyrocket, so for ex. twice the power usage doesn't make you jump from $25 to $50, it makes you jump from $25 to $200. So if you economize the best you can, which includes things like making sure your computer uses half the power when idle, you can have more money saved up each month... that can go towards new computer parts.
And no, I'm not saying that lower power idle in your computer is all that saves you money, I'm just saying that it can be a not so insignificant part of it. If a gpu uses 40W idle and runs 24/7 that's 28kWh per month which does put a dent on a 210kWh monthly cap.
What the hell are you talking about, you have a usage cap on your energy bill? That's ehhh strange.

And even if you do, if a single appliance will put you over it, what the hell kind of usage cap is that, and what else are you doing with it? Is it thát bad in Soviet Russia now, or is this some arcane construction with solar and a battery plus a super expensive backup? :D
Posted on Reply
#72
R0H1T
No, that's right; even in this part of the world bills skyrocket above 200 units (kWh) of consumption! It's like this: up to 200 units you get a state subsidy (yes, this is what wins you elections!), then 200-300 is about 10% more per unit without any subsidy, so the impact is harder, then above 300 (350?) it's 25% more expensive per unit. Kinda like your income tax slabs; I don't remember the exact numbers, but this is the way it's structured.
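Roughly, such a slab structure works like the sketch below; the base rate and slab boundaries are illustrative placeholders, not actual tariff figures:

```python
# Illustrative tiered (slab) electricity bill, mirroring the structure described above:
# a subsidised rate up to 200 units, ~10% more per unit for 200-300, ~25% more above that.
# The base rate of 0.10 per kWh is a made-up placeholder.

def tiered_bill(units_kwh: float, base_rate: float = 0.10) -> float:
    """Sum up a bill where each successive consumption slab costs more per unit."""
    slabs = [(200, base_rate),                    # subsidised slab
             (100, base_rate * 1.10),             # 200-300 units
             (float("inf"), base_rate * 1.25)]    # above 300 units
    bill, remaining = 0.0, units_kwh
    for size, rate in slabs:
        used = min(remaining, size)
        bill += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return bill

# An extra ~29 kWh of idle draw costs more once it lands in a pricier slab.
print(f"{tiered_bill(190):.2f} vs {tiered_bill(190 + 29):.2f}")  # 19.00 vs 22.09
```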
Posted on Reply
#73
bug
WirkoThe GPU can't "decouple itself" from the refresh rate; it has work to do with every pixel, every time it's sent over the video cable. It can just put itself in a lower-power (but still active) state. If these states aren't properly and intelligently managed, then voilà, kilowatts.

With that said, the Radeons and the 4080 have large caches and should fit entire frame buffers in them (47.5 MiB is required for two 4K monitors at 8-bit colour depth). The RAM, memory bus, and memory controller could remain in a slow-clock, low-voltage mode when working in 2D. I'm not sure which cards can do that, but the high consumption indicates that they don't.
I meant they need to be smart enough to sense they don't need to render 60fps from scratch when nothing happens and put the related resources to sleep.
Posted on Reply
#74
AusWolf
ymdhisWhy not? If you can save $100 a month you can buy a new card in a year. Of course you can only save so much if you minimize your expenditures, and part of that is making sure the power bill is only as high as it has to be.
If you have to make a conscious effort to save that money, then you'd much better save it for more useful expenses, like unexpected medical/dentist appointments, car servicing, fixing your house, rising food prices, a holiday, etc.

If $5-10 a month extra on your power bill is too much, then you have better things to spend that $100 saving on. Or if you don't care about that $100, then why do you care about $5?

Also, what about your fully loaded power consumption which is hundreds of Watts higher on a flagship GPU than on a midrange one? How is that suddenly not a problem?
ymdhisNot all games are AAA titles that need a new card, and you don't need everything to run at 4k on ultra so you can play it.
Then you don't need a flagship GPU with a 4K monitor to begin with. Simple. ;)
Posted on Reply
#75
Wirko
bugI meant they need to be smart enough to sense they don't need to render 60fps from scratch when nothing happens and put the related resources to sleep.
Right, and that describes the solution from the topic title exactly: VRR.

Or the thing I mentioned that apparently hasn't been implemented: a variable memory clock, like in the 4080 and 4090. It's not sleeping, just walking instead of running.
Posted on Reply