Monday, April 26th 2021

What AMD Didn't Tell Us: 21.4.1 Drivers Improve Non-Gaming Power Consumption By Up To 72%

AMD's recently released Radeon Software Adrenalin 21.4.1 WHQL drivers lower non-gaming power consumption, our testing finds. AMD did not mention these reductions in the changelog of its new driver release. We did a round of testing comparing the previous 21.3.2 drivers with 21.4.1, using Radeon RX 6000 series SKUs, namely the RX 6700 XT, RX 6800, RX 6800 XT, and RX 6900 XT. Our results show significant power-consumption improvements in certain non-gaming scenarios, such as system idle and media playback.

The Radeon RX 6700 XT shows no idle power-draw reduction, but the RX 6800, RX 6800 XT, and RX 6900 XT posted big drops in idle power consumption at 1440p, going down from 25 W to 7 W (down by about 72%). There are no changes with multi-monitor. Media playback sees up to 30% lower power consumption on the RX 6800, RX 6800 XT, and RX 6900 XT. This is a huge improvement for builders of media PC systems, as it reduces not only power draw but heat and noise, too.
Why AMD didn't mention these huge improvements is anyone's guess, but a closer look at the numbers offers some hints. Even with media playback power draw dropping from roughly 50 W to 35 W, the RX 6800/6900 series chips still use more power than competing NVIDIA GeForce RTX 30-series SKUs: the RTX 3070 pulls 18 W, while the RTX 3080 draws 27 W. We tested the driver on the older-generation RX 5700 XT and saw no changes. The Radeon RX 6700 XT already had very decent power consumption in these states, so our theory is that AMD fixed, for the Navi 22 GPU on the RX 6700 XT, certain power-consumption shortcomings that were found after the RX 6800 release. Once those fixes proved stable, they were backported to the Navi 21-based RX 6800/6900 series, too.
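For readers who want to sanity-check numbers like these on their own systems, below is a minimal power-logging sketch. It assumes a Linux machine with the amdgpu driver (the hwmon path varies per system, and newer kernels may expose power1_input instead of power1_average); note it reads the chip-reported power, which won't exactly match board-level measurements taken with dedicated hardware:

```python
import glob
import time

# The amdgpu driver reports average GPU power draw (in microwatts) via hwmon.
# The exact hwmon index varies per system, so glob for it. (Assumption: card0
# is the Radeon GPU and the kernel exposes power1_average.)
SENSOR = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")[0]

def average_power(duration_s=60, interval_s=1.0):
    """Sample the sensor for duration_s seconds and return the mean in watts."""
    readings = []
    end = time.time() + duration_s
    while time.time() < end:
        with open(SENSOR) as f:
            readings.append(int(f.read()) / 1_000_000)  # microwatts -> watts
        time.sleep(interval_s)
    return sum(readings) / len(readings)

if __name__ == "__main__":
    # Let the desktop settle into idle first, then run once per driver version.
    print(f"Average idle power: {average_power():.1f} W")
```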

63 Comments on What AMD Didn't Tell Us: 21.4.1 Drivers Improve Non-Gaming Power Consumption By Up To 72%

#1
R0H1T
Isn't it obvious? It's because they're the good guys :D
Posted on Reply
#2
ultravy
How about fixing the 165 Hz refresh issue on monitors, where the memory clock stays at full speed? Changing to 144 Hz fixes the problem!
Posted on Reply
#3
napata
Maybe they didn't brag about it because the numbers before the driver were absolutely terrible and they just fixed it. 25+ W at full idle means there's something not working properly, imo.
Posted on Reply
#4
randompeep
R0H1T: Isn't it obvious? It's because they're the good guys :D
Yeah man, they truly care about the consumers! Now these bimmy boys' 850W power supplies make sense /s
Posted on Reply
#5
Space Lynx
Astronaut
AMD does what Nvidia doesn't ~ 7nm nom nom bois!!!
ultravy: How about fixing the 165 Hz refresh issue on monitors, where the memory clock stays at full speed? Changing to 144 Hz fixes the problem!
did you report it through amd software? i did. and i know one other did. the more who report this issue the sooner it gets fixed. you can't complain unless you do the bug report tool in amd software.
Posted on Reply
#6
londiste
What changed to drop the power consumption? Lowered clocks on the GPU/VRAM?
Posted on Reply
#7
rippie
finally a driver that fixes my idle screen artifacts on my Ryzen 4750G.
go AMD (normally you'd expect this to work at launch, but a year later, I'll take it!)
Posted on Reply
#8
RedelZaVedno
There are no changes with multi-monitor.

Why is this still happening? I use a 4-screen setup and measure no difference between 1 and 4 monitors plugged into my good old EVGA 1080 Ti (10 W), but my Gigabyte 5700 XT draws +50 W at idle with the same 4-screen setup. I wanna keep my power draw to a minimum when not gaming so I can have my open-case system completely passively cooled at idle (I hate PC noise). While the 1080 Ti fans never turn at idle with the case open, the 5700 XT fans do spin up from time to time and are quite audible. AMD should have fixed this ever-persisting problem with their GPUs long ago. It shouldn't be all that hard, given that Nvidia nailed it 6 years ago on a much bigger die.
Posted on Reply
#9
DeeJay1001
Haven't updated yet, but hopefully the issue with memory clocks being unable to idle with a 144 Hz display has been resolved. Other than that, my Aorus Master version of the 6800 XT has been one of the most stable, non-issue GPUs I've ever owned. It crushes every game out there and does so quietly with low power consumption.
Posted on Reply
#10
las
lynx29: AMD does what Nvidia doesn't ~ 7nm nom nom bois!!!

did you report it through amd software? i did. and i know one other did. the more who report this issue the sooner it gets fixed. you can't complain unless you do the bug report tool in amd software.
Nvidia has had this stuff working for ages, plus proper 2D clocks at 144+ Hz (Nvidia fixed this years ago).

Simply look up power consumption in earlier reviews to verify; AMD has always used way too much power outside of 3D.

So I'm not sure what you're celebrating.
Posted on Reply
#11
londiste
las: Nvidia has had this stuff working for ages, plus proper 2D clocks at 144+ Hz (Nvidia fixed this years ago).
Both Nvidia and AMD have fixed this several times, and not only for new generations; it seems to break every now and then. Multi-monitor is worse.
Posted on Reply
#12
Space Lynx
Astronaut
las: Nvidia has had this stuff working for ages, plus proper 2D clocks at 144+ Hz (Nvidia fixed this years ago).

Simply look up power consumption in earlier reviews to verify; AMD has always used way too much power outside of 3D.

So I'm not sure what you're celebrating.
so salty bruh
Posted on Reply
#13
ratirt
londiste: What changed to drop the power consumption? Lowered clocks on the GPU/VRAM?
Maybe VRAM, since the GPU will drop to as low as 0 MHz when idle. So I guess the fixes are somewhere else, since the 0 MHz says it all. Maybe optimization of the power delivery or something.
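For anyone who wants to see which clock states their card actually sits in at idle, the amdgpu driver on Linux exposes its DPM tables directly; here's a quick sketch (card0 being the Radeon GPU is an assumption, and on Windows a monitoring tool such as GPU-Z shows the same clocks):

```python
# Print the available core (sclk) and memory (mclk) DPM states for the GPU;
# amdgpu marks the currently active state with a trailing '*'.
for name in ("pp_dpm_sclk", "pp_dpm_mclk"):
    with open(f"/sys/class/drm/card0/device/{name}") as f:
        print(f"{name}:\n{f.read()}")
```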
Posted on Reply
#14
1d10t
TBH 20 W doesn't make any difference, still love 'em. Besides, I already applied an undervolt.
nVidia might be better, because they knew most of their users are sticking with power-hungry CPUs :rolleyes:
Posted on Reply
#15
Vya Domus
RedelZaVedno: There are no changes with multi-monitor.

Why is this still happening? I use a 4-screen setup and measure no difference between 1 and 4 monitors plugged into my good old EVGA 1080 Ti (10 W), but my Gigabyte 5700 XT draws +50 W at idle with the same 4-screen setup. I wanna keep my power draw to a minimum when not gaming so I can have my open-case system completely passively cooled at idle (I hate PC noise). While the 1080 Ti fans never turn at idle with the case open, the 5700 XT fans do spin up from time to time and are quite audible. AMD should have fixed this ever-persisting problem with their GPUs long ago. It shouldn't be all that hard, given that Nvidia nailed it 6 years ago on a much bigger die.
You have to understand that this can't always be fixed. The cards ramp up clocks based on certain triggers and workloads; my 1080 ramps up memory clocks for no obvious reason all the time.
Posted on Reply
#16
birdie
lynx29: AMD does what Nvidia doesn't ~ 7nm nom nom bois!!!

did you report it through amd software? i did. and i know one other did. the more who report this issue the sooner it gets fixed. you can't complain unless you do the bug report tool in amd software.
AMD normally addresses bugs only when they are inundated by bug reports not only from users but also from prominent tech figures, e.g. media outlets or YouTubers.

E.g. they removed gamma/proper color configuration from Radeon drivers over five years ago. The topic has been raised numerous times. Have they done anything? F no.

Don't overestimate their eagerness to fix anything unless it's burning under them. Oh, that's funny, I'm actually ignoring you.

Oh, and NVIDIA and Intel are no different unfortunately.
Posted on Reply
#17
Steevo
birdie: AMD normally addresses bugs only when they are inundated by bug reports not only from users but also from prominent tech figures, e.g. media outlets or YouTubers.

E.g. they removed gamma/proper color configuration from Radeon drivers over five years ago. The topic has been raised numerous times. Have they done anything? F no.

Don't overestimate their eagerness to fix anything unless it's burning under them. Oh, that's funny, I'm actually ignoring you.

Oh, and NVIDIA and Intel are no different unfortunately.
Exactly, where are the video acceleration controls at? Now we get a couple of choices that feel like they're made for kindergartners.

I used to get into the registry and XML files to change it, but recently they get overwritten on reboot.
Posted on Reply
#18
londiste
Vya Domus: You have to understand that this can't always be fixed. The cards ramp up clocks based on certain triggers and workloads; my 1080 ramps up memory clocks for no obvious reason all the time.
There probably is an obvious reason, just maybe not too straightforward to notice. As long as you have some software running that uses the GPU, it can and probably will regularly wake it up and run at a higher clock bin. If you happen to have monitoring software running or a browser open (browsers all use HW acceleration these days, including for video decoding), ramping up clocks is expected.
RedelZaVedno: Why is this still happening? I use a 4-screen setup and measure no difference between 1 and 4 monitors plugged into my good old EVGA 1080 Ti (10 W), but my Gigabyte 5700 XT draws +50 W at idle with the same 4-screen setup.
Multi-monitor is not a trivial use case. The display controller and memory need higher clocks when pushing more pixels to the monitors. This also means the higher the resolution and refresh rate(s) of your monitor(s), the more complex it becomes. Plus, mismatched resolutions and refresh rates seem to be a bit of a problem of their own.

For example, in theory a 2160p monitor should need the same clocks as 4x1080p monitors; in reality there is obviously some overhead in running more monitors.
I have two monitors, 1440p@165Hz and 1440p@60Hz, and GPU drivers never really seem to figure out nice low clocks for this combination. I have had the same monitor setup across the last 5-6 GPUs, and all of them have been running at increased clocks at one point or another. I'm currently using an Nvidia GPU, and the last fix for this landed somewhere around the middle of last year, if I remember correctly.
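To put rough numbers on that 2160p-vs-4x1080p point: counting only active pixels, the two configurations are identical, so the extra cost of multiple monitors has to come from per-display overhead such as blanking intervals and extra display pipes. A back-of-envelope sketch:

```python
# Compare the raw pixel throughput the display engine must sustain,
# ignoring blanking. Timings are the standard resolutions discussed above.
def active_rate(width, height, refresh_hz):
    return width * height * refresh_hz  # active pixels per second

one_4k   = active_rate(3840, 2160, 60)
four_fhd = 4 * active_rate(1920, 1080, 60)
print(f"1x 2160p60: {one_4k / 1e6:.0f} Mpx/s")
print(f"4x 1080p60: {four_fhd / 1e6:.0f} Mpx/s")
# Both come out to ~498 Mpx/s, yet real timings (e.g. CVT) add horizontal and
# vertical blanking per display, so four monitors carry roughly 4x that overhead.
```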
Posted on Reply
#19
1d10t
birdie: E.g. they removed gamma/proper color configuration from Radeon drivers over five years ago. The topic has been raised numerous times. Have they done anything? F no.
I think this is a good decision, because currently there are no monitors that support DDC/CI anymore; most of them support VRR with their own MPRT, including noise/blur reduction, all of which makes overriding at the GPU level meaningless. In the future DSC will become increasingly popular, with different techniques for each manufacturer; whether chroma subsampling, primary/secondary separation, or any other type of compression, it will render this feature completely useless.
Posted on Reply
#20
birdie
1d10t: I think this is a good decision, because currently there are no monitors that support DDC/CI anymore; most of them support VRR with their own MPRT, including noise/blur reduction, all of which makes overriding at the GPU level meaningless. In the future DSC will become increasingly popular, with different techniques for each manufacturer; whether chroma subsampling, primary/secondary separation, or any other type of compression, it will render this feature completely useless.
I'm talking about something completely different:

[screenshots]

This is how it looked in old Radeon Drivers:

Amd/comments/etfm7a (that's for video)

Or this:

[screenshot]

And now we have just this: www.amd.com/en/support/kb/faq/dh3-021
Posted on Reply
#21
RedelZaVedno
londiste: There probably is an obvious reason, just maybe not too straightforward to notice. As long as you have some software running that uses the GPU, it can and probably will regularly wake it up and run at a higher clock bin. If you happen to have monitoring software running or a browser open (browsers all use HW acceleration these days, including for video decoding), ramping up clocks is expected.

Multi-monitor is not a trivial use case. The display controller and memory need higher clocks when pushing more pixels to the monitors. This also means the higher the resolution and refresh rate(s) of your monitor(s), the more complex it becomes. Plus, mismatched resolutions and refresh rates seem to be a bit of a problem of their own.

For example, in theory a 2160p monitor should need the same clocks as 4x1080p monitors; in reality there is obviously some overhead in running more monitors.
I have two monitors, 1440p@165Hz and 1440p@60Hz, and GPU drivers never really seem to figure out nice low clocks for this combination. I have had the same monitor setup across the last 5-6 GPUs, and all of them have been running at increased clocks at one point or another. I'm currently using an Nvidia GPU, and the last fix for this landed somewhere around the middle of last year, if I remember correctly.
The problem is I'm getting around 50 W with the 5700 XT at idle (not running anything except some background apps like Kaspersky and Steam/Epic). There really is no logical reason for such consumption.

A mismatch in resolutions and frequencies between monitors could well be the problem, but I use four identical 1920x1200 60 Hz IPS monitors, so a resolution/frequency mismatch should be ruled out, at least in my case.
I still believe this could probably be fixed at the driver level, but it might be some architectural under-optimization, since Polaris, RDNA1, and RDNA2 all suffer from this overconsumption problem while the Vega-based GPUs (56/64/Radeon VII) do not, and Nvidia's GPUs haven't seemed to suffer from it since the Kepler architecture.
Posted on Reply
#22
1d10t
birdie: I'm talking about something completely different:

[screenshots]

This is how it looked in old Radeon Drivers:

Amd/comments/etfm7a (that's for video)

Or this:

[screenshot]

And now we have just this: www.amd.com/en/support/kb/faq/dh3-021
And I firmly believe that I responded accordingly: Display Data Channel (Wikipedia).
Suppose you want a manual setup: do you know your monitor supports DDC/CI, or do you just blindly trust nVidia? Just like G-Sync Ultimate: the graphics card supports it, but the monitor is only Compatible; are you sure that works just because the control panel says so? As I said, manual settings are useless, as most of them do not conform to ICC profiles, which come from the monitor driver, are constructed through manual calibration with an X-Rite/Datacolor SpyderX, or come with a factory pre-calibrated monitor. The idea of a manual override is good, but there's no guarantee its value won't change in games or apps with built-in ICC toggles (e.g. Adobe, DaVinci).
Posted on Reply
#23
Space Lynx
Astronaut
birdie: I'm talking about something completely different:

[screenshots]

This is how it looked in old Radeon Drivers:

Amd/comments/etfm7a (that's for video)

Or this:

[screenshot]

And now we have just this: www.amd.com/en/support/kb/faq/dh3-021
fyi I have used those Nvidia gamma settings a lot on my gaming laptop (cause it was the only way to adjust colors on a laptop screen) and it would only enforce the colors in half the games... nvidia is just as broken as AMD here. its just easier to use an ICC profile and a color profile enforcer. nvidia doesn't work either on this, at least not on laptops. fullscreen Witcher 3 reset the nvidia color gamut every time... annoyed the crap out of me... so eh, your point is moot.
Posted on Reply
#24
Xuper
TPU, can you check CRU in multi-monitor, at least for the 6800 series? Some claim they get a lower memory clock by using CRU.
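For context, the usual explanation for why CRU can help is that VRAM reclocking is done during the vertical blanking interval, so custom timings that lengthen vblank give the driver room to drop memory clocks without visible corruption. A hypothetical illustration (the line counts below are made up for the example, not real EDID values):

```python
# Estimate how long the vertical blanking window is for a given timing;
# a longer window makes it easier for the driver to reclock VRAM mid-frame.
def vblank_time_us(total_lines, active_lines, refresh_hz):
    frame_time_us = 1e6 / refresh_hz
    return frame_time_us * (total_lines - active_lines) / total_lines

# Hypothetical 1440p144 timings: a tight total vs. a CRU-relaxed total.
print(f"tight (1500 total lines):   {vblank_time_us(1500, 1440, 144):.0f} us")
print(f"relaxed (1600 total lines): {vblank_time_us(1600, 1440, 144):.0f} us")
```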
Posted on Reply
#25
shadow3401
Here we go again with the AMD = bad, nVidia = sun shines out of its ass every day. Now that this MINOR problem of increased power consumption has been identified, I'm sure it will be resolved in the 21.4.2 or 21.5.1 driver update, so no need to have a mental breakdown over it. Move on.
Posted on Reply