
AMD Radeon RX 6000/7000 GPUs Reduce Idle Power Consumption by 81% with VRR Enabled

Joined
Aug 30, 2006
Messages
7,221 (1.08/day)
System Name ICE-QUAD // ICE-CRUNCH
Processor Q6600 // 2x Xeon 5472
Memory 2GB DDR // 8GB FB-DIMM
Video Card(s) HD3850-AGP // FireGL 3400
Display(s) 2 x Samsung 204Ts = 3200x1200
Audio Device(s) Audigy 2
Software Windows Server 2003 R2 as a Workstation now migrated to W10 with regrets.
There are people posting here with incredibly poor maths and judgement who cannot multiply out 30W of extra usage running 24/7 or thereabouts. When the consequential cost is significant, their defence is "oh, power off". No. We are not talking about the efficient use of the PC, we are talking about the efficiency of a component.

30W wasted is an appalling waste.

My laptop doesn't do that at desktop idle. My processor doesn't do that at idle.
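For anyone who doesn't want to do the multiplication in their head, it works out like this (the tariff below is an assumed example rate, not anyone's actual bill):

```python
# Annual cost of a constant 30 W draw running 24/7.
# TARIFF_PER_KWH is an assumed example; substitute your own rate.
IDLE_DRAW_W = 30
HOURS_PER_YEAR = 24 * 365
TARIFF_PER_KWH = 0.30  # assumed price per kWh in local currency

energy_kwh = IDLE_DRAW_W * HOURS_PER_YEAR / 1000  # 262.8 kWh/year
annual_cost = energy_kwh * TARIFF_PER_KWH         # ~78.84/year

print(f"{energy_kwh:.1f} kWh/year -> {annual_cost:.2f} per year")
```

At that assumed rate, the "just 30W" adds up to a noticeable line item on a yearly bill.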
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Inflation Reduction Act.
Please check whether it contains any text that would require citizens to reduce energy consumption, and whether any part of it affects computers.
Maybe make a thread for your off-topic stuff?!
There are people posting here with incredibly poor maths and judgement who cannot multiply out 30W of extra usage running 24/7 or thereabouts. When the consequential cost is significant, their defence is "oh, power off". No. We are not talking about the efficient use of the PC, we are talking about the efficiency of a component.

30W wasted is an appalling waste.

My laptop doesn't do that at desktop idle. My processor doesn't do that at idle.
You think, or can you prove it?

30 watts at idle while on; most modern monitors have power-saving features, like the PC attached to them, so after 5-10 idle minutes it's off.

You're being hyperbolic. Nvidia and Intel are not that much better.
 
Joined
Jun 22, 2015
Messages
76 (0.02/day)
Processor AMD R7 3800X EKWB
Motherboard Asus Tuf B450M-Pro µATX +MosfetWB (x2)
Cooling EKWB on CPU + GPU / Heatkiller 60/80 on Mosfets / Black Ice SR-1 240mm
Memory 2x8GB G.Skill DDR4 3200C14 @ ----
Video Card(s) Vega64 EKWB
Storage Samsung 512GB NVMe 3.0 x4 / Crucial P1 1TB NVMe 3.0 x2
Display(s) Asus ProArt 23" 1080p / Acer 27" 144Hz FreeSync IPS
Case Fractal Design Arc Mini R2
Power Supply SeaSonic 850W
Keyboard Ducky One TKL / MX Brown
Considering how fast high-end systems lose their value, and how much that 100 bucks is worth to people on a low budget, doing that is absolutely stupid.


I'm not encouraging it. All I'm saying is that the average gamer (especially one on a low budget) doesn't need a flagship GPU.


Do you do that as a full-time position for $7,200 a year with no other source of income?
It's not up to you to decide who does and doesn't need a flagship GPU.
Some people buy nice things and keep them for a few years, to make the most of them.

Would you buy a new car every year?

In the grander scheme of things, reducing idle power for potentially millions of users benefits all of society.
 
Joined
Apr 12, 2013
Messages
7,536 (1.77/day)
30W isn't a number you can easily scoff at; having said that, you can save more, or a lot more, by setting your display to sleep quickly when idle. At least for those with big/high-refresh screens; outside of the dGPU, I'd argue the display is now the most power-hungry component of a build. Talking about average consumption for a normal home PC of course, just like your mobile/laptop these days.
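To put rough numbers on that, a sketch; the display wattage and idle hours here are assumed round figures for illustration, not measurements from the article:

```python
# Rough comparison: energy saved by a 10-minute display-sleep timeout
# over an idle evening, versus the 30 W GPU idle draw being discussed.
# All figures except GPU_IDLE_W are assumed round numbers.
DISPLAY_W = 50        # assumed draw of a large/high-refresh monitor
GPU_IDLE_W = 30       # the figure from the thread
IDLE_HOURS = 4        # assumed hours the PC sits idle per day

display_always_on = DISPLAY_W * IDLE_HOURS   # 200 Wh/day
display_sleeps = DISPLAY_W * (10 / 60)       # ~8.3 Wh (10 min, then off)
gpu_idle = GPU_IDLE_W * IDLE_HOURS           # 120 Wh/day

print(f"display sleep saves: {display_always_on - display_sleeps:.0f} Wh/day")
print(f"GPU idle costs:      {gpu_idle} Wh/day")
```

Under these assumptions the display-sleep timeout saves more per day than the GPU's idle draw costs, which is the point being made.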
 
Joined
Mar 7, 2010
Messages
989 (0.18/day)
Location
Michigan
System Name Daves
Processor AMD Ryzen 3900x
Motherboard AsRock X570 Taichi
Cooling Enermax LIQMAX III 360
Memory 32 GiG Team Group B Die 3600
Video Card(s) Powercolor 5700 xt Red Devil
Storage Crucial MX 500 SSD and Intel P660 NVME 2TB for games
Display(s) Acer 144htz 27in. 2560x1440
Case Phanteks P600S
Audio Device(s) N/A
Power Supply Corsair RM 750
Mouse EVGA
Keyboard Corsair Strafe
Software Windows 10 Pro
Instead of leaving the computer at idle, maybe one could simply turn the computer off and turn off the power strip.
Everyone says never to shut a PC down, which I think is bullshit; I shut down and turn the power strip off when not in use.
 
Joined
Jan 14, 2019
Messages
12,344 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It's not up to you to decide who does and doesn't need a flagship GPU.
Some people buy nice things and keep them for a few years, to make the most of them.
Well, if I earned $7,200 a year, I wouldn't be looking at a $1,000 GPU, or a 4K monitor. Actually, I earn way more than that, and still don't think a 4K display, or a computer to power it, is in my range.

If you're on a super low budget and looking for a super-high-end PC for the money you saved with years of hard work, by all means, knock yourself out, but let me have my opinion on it.

Edit: All I'm saying is that a high-end PC depreciates in value so fast that a little increase in your power bill is the least of your concerns.

Would you buy a new car every year?
No because it's pointless. Not to mention it's a bad example, because a car is a car even 10 years after you've bought it, but your high-end PC won't play the latest games a few years later.
 
Joined
Feb 11, 2009
Messages
5,556 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Everyone says never to shut a PC down, which I think is bullshit; I shut down and turn the power strip off when not in use.

You could also always at least use sleep mode: borderline no power consumption, and it's up and running again before your monitor is.
 
Joined
Sep 17, 2014
Messages
22,475 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I know people who put 100 bucks aside every month, and then buy a kick-ass config every 5 years.
Don't knock people who don't earn much by saying they shouldn't, or wouldn't, buy a high-end config.
That's just plain wrong.
Also, top GPUs haven't always been over a grand; it's a recent thing, and we shouldn't be encouraging it.
Look at the price of flagship phones now!!
You can get a 2nd-hand car for that price...
Yep... you don't need to be rich to game on high-end PCs, or at least highly effective gaming configs... honestly. I've been doing that math for years; if you play it smart and don't upgrade for every fart, this is a rather cheap hobby. Games included. All it takes is patience. Patience to wait for games to reach the budget bin, patience to jump on a great deal for a part. Patience on the PC pays out big time: not only is it cheap, but your stuff comes patched and feature-complete too.
 
Joined
Aug 30, 2006
Messages
7,221 (1.08/day)
System Name ICE-QUAD // ICE-CRUNCH
Processor Q6600 // 2x Xeon 5472
Memory 2GB DDR // 8GB FB-DIMM
Video Card(s) HD3850-AGP // FireGL 3400
Display(s) 2 x Samsung 204Ts = 3200x1200
Audio Device(s) Audigy 2
Software Windows Server 2003 R2 as a Workstation now migrated to W10 with regrets.
I do encourage everyone to visit the original article at https://www.computerbase.de/2023-07/grafikkarten-leistungsaufnahme-idle-2023/

If the German is too difficult, use your browser’s inbuilt translator.

Go look at the original graphs. There is a lot of information there.
Go read the comments. They are better informed than many here, because they are using the source article as the basis of their comments.

And just to spell it out, VRR is required in the driver, and required on the monitor too! Don't forget that. Otherwise the power consumption results don't change.
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I just make sure it never idles, my present break from F@H due to the holiday notwithstanding :)
I do encourage everyone to visit the original article at https://www.computerbase.de/2023-07/grafikkarten-leistungsaufnahme-idle-2023/

If the German is too difficult, use your browser’s inbuilt translator.

Go look at the original graphs. There is a lot of information there.
Go read the comments. They are better informed than many here, because they are using the source article as the basis of their comments.

And just to spell it out, VRR is required in the driver, and required on the monitor too! Don't forget that. Otherwise the power consumption results don't change.
Some of us have first-hand experience with the 7900 XT to lean on. And we did read it at the source, really, yesterday in fact.

My idle draw, and most of those figures, are significantly lower than that too.

And as I said, the person sitting in front of it (or not) can configure it to boil the oceans, or not.
 
Joined
Feb 21, 2006
Messages
2,225 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
Everyone says never shut PC down which I think is bullshit, I shut down and turn power strip off when not in use.
People used to say this a lot when PCs used hard drives for storage, prior to the SATA SSD era.

Constantly shutting down and powering up does put stress on hard drives, which is why they recommended leaving the system running. These days, however, my current machine has no spinning drives; they live in a NAS. So my machine is either on or in sleep mode until I need it.
 
Joined
Feb 11, 2009
Messages
5,556 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Yep... you don't need to be rich to game on high-end PCs, or at least highly effective gaming configs... honestly. I've been doing that math for years; if you play it smart and don't upgrade for every fart, this is a rather cheap hobby. Games included. All it takes is patience. Patience to wait for games to reach the budget bin, patience to jump on a great deal for a part. Patience on the PC pays out big time: not only is it cheap, but your stuff comes patched and feature-complete too.

dude compared to like sports or cars or horses...gaming is DIRT cheap
 
Joined
Feb 21, 2006
Messages
2,225 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
dude compared to like sports or cars or horses...gaming is DIRT cheap
Yup, even as expensive as a 4090 is. Try buying car parts!!

Computing is a relatively cheap hobby when you start looking at the rest of them.
 
Joined
Jan 3, 2021
Messages
3,506 (2.46/day)
Location
Slovenia
Processor i5-6600K
Motherboard Asus Z170A
Cooling some cheap Cooler Master Hyper 103 or similar
Memory 16GB DDR4-2400
Video Card(s) IGP
Storage Samsung 850 EVO 250GB
Display(s) 2x Oldell 24" 1920x1200
Case Bitfenix Nova white windowless non-mesh
Audio Device(s) E-mu 1212m PCI
Power Supply Seasonic G-360
Mouse Logitech Marble trackball, never had a mouse
Keyboard Key Tronic KT2000, no Win key because 1994
Software Oldwin
This would indicate the GPU cannot decouple itself from the refresh rate of the monitor. If it can force it below 60Hz, it will lower the power draw. If it can't, it will just suck juice.
So it's not a fix, just a circumstance where the problem hides itself. Good find if you happen to own a VRR monitor though.
The GPU can't "decouple itself" from the refresh rate; it has work to do with every pixel, every time it's sent over the video cable. It can just put itself in a lower-power (but still active) state. If these states aren't properly and intelligently managed, then voilà, kilowatts.

With that said, the Radeons and the 4080 have large caches and should fit entire frame buffers in them (47.5 MiB is required for two 4K monitors at 8-bit colour depth). The RAM, memory bus, and memory controller could remain in a slow-clock, low-voltage mode when working in 2D. I'm not sure which cards can do that, but the high consumption indicates that they don't.
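The 47.5 MiB figure checks out; here's the arithmetic, assuming 3 bytes per pixel (8-bit R, G, B, no alpha channel) for two 3840×2160 front buffers:

```python
# Sanity check of the 47.5 MiB figure: two 4K frame buffers at
# 8 bits per colour channel (3 bytes per pixel, no alpha assumed).
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 3   # 8-bit R, G, B
MONITORS = 2

total_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL * MONITORS
total_mib = total_bytes / 2**20
print(f"{total_mib:.1f} MiB")   # 47.5 MiB
```

Note that with a 32-bit (RGBA or 10-bit HDR) pixel format the requirement grows to about 63 MiB, so the "fits in cache" argument depends on the exact scan-out format.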
 
Joined
Aug 30, 2006
Messages
7,221 (1.08/day)
System Name ICE-QUAD // ICE-CRUNCH
Processor Q6600 // 2x Xeon 5472
Memory 2GB DDR // 8GB FB-DIMM
Video Card(s) HD3850-AGP // FireGL 3400
Display(s) 2 x Samsung 204Ts = 3200x1200
Audio Device(s) Audigy 2
Software Windows Server 2003 R2 as a Workstation now migrated to W10 with regrets.
I think what @bug is referring to is the decoupling of the
GRAPHICS CARD =
Video Memory + GPU + Framebuffer memory + Video Controller I/O + Connectors ---> Display

That is, decoupling the Video Controller I/O from the GPU, such that the GPU can idle while the Video Controller I/O stays synced to the Display.

Bug's statement is obvious from old-school graphics, where the CPU did the graphics heavy lifting into a shared framebuffer that the Video Controller I/O accessed independently via DMA, offering different display output formats at different frame syncs.

In a modern graphics card these steps are not on discrete silicon (separate chips), so the "GPU" is doing much more than in yesteryear. That is why we require "features" like G-Sync and VRR as driver-driven functionality with Display compatibility, whereas in the past the output was more standardised (call it dumb, maybe), and the Video Controller I/O handled that standardised output DECOUPLED from the performance of the CPU/GPU.

Well, at least that's what I think bug meant!
 
Joined
Jan 14, 2019
Messages
12,344 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I think what @bug is referring to is the decoupling of the
GRAPHICS CARD =
Video Memory + GPU + Framebuffer memory + Video Controller I/O + Connectors ---> Display

That is, decoupling the Video Controller I/O from the GPU, such that the GPU can idle while the Video Controller I/O stays synced to the Display.

Bug's statement is obvious from old-school graphics, where the CPU did the graphics heavy lifting into a shared framebuffer that the Video Controller I/O accessed independently via DMA, offering different display output formats at different frame syncs.

In a modern graphics card these steps are not on discrete silicon (separate chips), so the "GPU" is doing much more than in yesteryear. That is why we require "features" like G-Sync and VRR as driver-driven functionality with Display compatibility, whereas in the past the output was more standardised (call it dumb, maybe), and the Video Controller I/O handled that standardised output DECOUPLED from the performance of the CPU/GPU.

Well, at least that's what I think bug meant!
I think so too (@bug correct us if we're wrong).

Not to mention, even the Windows desktop and the web browser use your GPU these days. "Idle" isn't exactly the same "idle" it used to be back in the Windows XP days. It's more like a low-power state that is ideally just enough to provide 2D acceleration when it works right, and a bit more when it doesn't.
 
Joined
Jan 3, 2021
Messages
3,506 (2.46/day)
Location
Slovenia
Processor i5-6600K
Motherboard Asus Z170A
Cooling some cheap Cooler Master Hyper 103 or similar
Memory 16GB DDR4-2400
Video Card(s) IGP
Storage Samsung 850 EVO 250GB
Display(s) 2x Oldell 24" 1920x1200
Case Bitfenix Nova white windowless non-mesh
Audio Device(s) E-mu 1212m PCI
Power Supply Seasonic G-360
Mouse Logitech Marble trackball, never had a mouse
Keyboard Key Tronic KT2000, no Win key because 1994
Software Oldwin
I think what @bug is referring to is the decoupling of the
GRAPHICS CARD =
Video Memory + GPU + Framebuffer memory + Video Controller I/O + Connectors ---> Display

That is, decoupling the Video Controller I/O from the GPU, such that the GPU can idle while the Video Controller I/O stays synced to the Display.

Bug's statement is obvious from old-school graphics, where the CPU did the graphics heavy lifting into a shared framebuffer that the Video Controller I/O accessed independently via DMA, offering different display output formats at different frame syncs.

In a modern graphics card these steps are not on discrete silicon (separate chips), so the "GPU" is doing much more than in yesteryear. That is why we require "features" like G-Sync and VRR as driver-driven functionality with Display compatibility, whereas in the past the output was more standardised (call it dumb, maybe), and the Video Controller I/O handled that standardised output DECOUPLED from the performance of the CPU/GPU.

Well, at least that's what I think bug meant!
Well, AMD thinks they've optimised everything and more.

HotHardware:
AMD says these architectural improvements are complemented by refinements to the adaptive power management system used in RDNA 2 and a new generation of Infinity Cache. The adaptive power management features tune the GPU's power usage to match the current workload. This helps GPU components avoid drawing power unnecessarily. AMD Infinity Cache is situated between the L3 cache and GDDR6 memory, which reduces dependence on the latter. This improves bandwidth and further decreases power consumption.

Maybe they haven't optimised everything yet. The single-monitor (1440p60) idle does indeed power the GPU down: the VRAM runs at 13 MHz, which yields sufficient bandwidth. But in multi-monitor idle the problem becomes obvious: the memory clock cannot adapt; it jumps from 13 MHz to 2,487 MHz and stays there at all times.

From the TPU 7900 XTX review:
[power consumption chart]

The 7900 XT has exactly the same memory clocks, but its consumption in multi-monitor idle and video playback is 1/6 lower than the 7900 XTX's. It also has 1/6 fewer memory chips and memory controllers. Funny, isn't it? This means that almost all of the 103-105 watts of idle power is funnelled into the memory, memory bus, and memory controllers, which could run at 30 MHz or 100 MHz or something, if only they were able to scale down. It may well be a hardware limitation, unfixable in software!

Then VRR reduces the required data rate, and memory clock can fall back to 13 MHz.

Not to mention, even the Windows desktop and the web browser use your GPU these days. "Idle" isn't exactly the same "idle" it used to be during the Windows XP times. It's more like a low power state that is ideally just enough to provide 2D acceleration when it works right - and a bit more when it doesn't.
Sure, there's GPU rendering of web pages, but it only happens once if the displayed content doesn't change or scroll.
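That proportionality claim can be sanity-checked against the cards' bus widths: a 384-bit card carries 12 GDDR6 chips, a 320-bit card 10. A quick sketch, treating the ~103 W multi-monitor idle figure as an approximate chart reading and assuming idle power scales with the number of memory channels:

```python
# If nearly all of the multi-monitor idle power goes into the memory
# subsystem, idle power should scale with memory channel/chip count.
# Chip counts follow from bus width (384-bit vs 320-bit -> 12 vs 10
# GDDR6 chips); the 103 W figure is an approximate chart reading.
XTX_CHIPS, XT_CHIPS = 12, 10
XTX_IDLE_W = 103

predicted_xt_idle = XTX_IDLE_W * XT_CHIPS / XTX_CHIPS
reduction = 1 - XT_CHIPS / XTX_CHIPS
print(f"predicted 7900 XT idle: {predicted_xt_idle:.0f} W "
      f"({reduction:.0%} lower)")  # ~86 W, 17% (i.e. 1/6) lower
```

The predicted 1/6 reduction matching the measured gap is what supports the "it's almost all memory-subsystem power" reading of the chart.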
 
Joined
Dec 26, 2006
Messages
3,837 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Well, common sense. Just look at TPU reviews with 60 Hz vsync vs. without: more fps = more watts.

It is nice to see this testing and these results, though.
 
Joined
Jul 29, 2022
Messages
514 (0.60/day)
Sure, but if $200 a year is a significant amount, then you won't buy a $1,000 GPU with a $600 monitor, either, will you?
Why not? If you can save $100 a month you can buy a new card in a year. Of course you can only save so much if you minimize your expenditures, and part of that is making sure the power bill is only as high as it has to be.

No because it's pointless. Not to mention it's a bad example, because a car is a car even 10 years after you've bought it, but your high-end PC won't play the latest games a few years later.
Not all games are AAA titles that need a new card, and you don't need everything to run at 4k on ultra so you can play it.

So, if you live in Hungary and make $600 a month, you need VRR to allow your $1000 GPU to idle 24/7, or the extra $200 in electricity will bankrupt you?

do you understand how utterly ridiculous that sounds?
You are the one being ridiculous here. The problem I was trying to explain is that the increase in the power bill is not linear. If you use more electricity beyond a certain limit, the price starts to skyrocket; so, for example, twice the power usage doesn't make you jump from $25 to $50, it makes you jump from $25 to $200. So if you economise the best you can, which includes things like making sure your computer uses half the power when idle, you can have more money saved up each month... that can go towards new computer parts.
And no, I'm not saying that lower idle power in your computer is all that saves you money; I'm just saying that it can be a not-so-insignificant part of it. If a GPU uses 40W at idle and runs 24/7, that's roughly 29 kWh per month, which does put a dent in a 210 kWh monthly cap.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,997 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) / NVIDIA RTX 4090 Founder's Edition
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case HYTE Hakos Baelz Y60
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Wooting 60HE+ / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Occulus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.4317
Good improvements for the 7900 XT/7900 XTX all around, but AMD still needs to work on optimizing how VCN 4.0 works. They should NOT be using more than 30W decoding video compared to NVDEC and the 6900/6950 XT's VCN 3.0.
 
Joined
Sep 17, 2014
Messages
22,475 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
You are the one being ridiculous here. The problem I was trying to explain is that the increase in the power bill is not linear. If you use more electricity beyond a certain limit, the price starts to skyrocket; so, for example, twice the power usage doesn't make you jump from $25 to $50, it makes you jump from $25 to $200. So if you economise the best you can, which includes things like making sure your computer uses half the power when idle, you can have more money saved up each month... that can go towards new computer parts.
And no, I'm not saying that lower idle power in your computer is all that saves you money; I'm just saying that it can be a not-so-insignificant part of it. If a GPU uses 40W at idle and runs 24/7, that's roughly 29 kWh per month, which does put a dent in a 210 kWh monthly cap.
What the hell are you talking about? You have a usage cap on your energy bill? That's, ehhh, strange.

And even if you do, if a single appliance will put you over it, what the hell kind of usage cap is that, and what else are you doing with it? Is it thát bad in Soviet Russia now, or is this some arcane construction with solar and a battery plus a super expensive backup? :D
 
Joined
Apr 12, 2013
Messages
7,536 (1.77/day)
No, that's right; even in this part of the world, bills skyrocket above 200 units (kWh) of consumption! It works like this: up to 200 units you get a state subsidy (yes, this is what wins you elections!), then 200-300 units cost about 10% more per unit without any subsidy, so the impact is harder, and above 300 (350?) it's 25% more expensive per unit. Kind of like your income tax slabs; I don't remember the exact numbers, but this is how it's structured.
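That slab structure can be sketched like this (the base rate and exact surcharges are assumptions for illustration, since the exact numbers aren't remembered):

```python
# Sketch of the slab tariff described above: a subsidised base rate up
# to 200 units, ~10% more for 200-300, ~25% more above 300.
# BASE_RATE and the exact thresholds/surcharges are assumed values.
BASE_RATE = 8.0  # assumed price per unit (kWh) in local currency

def monthly_bill(units):
    slabs = [(200, BASE_RATE), (100, BASE_RATE * 1.10)]  # first 200, next 100
    bill, remaining = 0.0, units
    for size, rate in slabs:
        used = min(remaining, size)
        bill += used * rate
        remaining -= used
    bill += remaining * BASE_RATE * 1.25  # everything above 300 units
    return bill

# The same 29 idle-kWh costs more the higher the slab it lands in:
print(monthly_bill(229) - monthly_bill(200))  # 29 units in the mid slab
print(monthly_bill(329) - monthly_bill(300))  # 29 units in the top slab
```

This is why a fixed chunk of idle consumption hits harder for a household already near the top of a slab: the marginal unit price, not the average one, is what the extra kWh pays.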
 

bug

Joined
May 22, 2015
Messages
13,786 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
The GPU can't "decouple itself" from the refresh rate; it has work to do with every pixel, every time it's sent over the video cable. It can just put itself in a lower-power (but still active) state. If these states aren't properly and intelligently managed, then voilà, kilowatts.

With that said, the Radeons and the 4080 have large caches and should fit entire frame buffers in them (47.5 MiB is required for two 4K monitors at 8-bit colour depth). The RAM, memory bus, and memory controller could remain in a slow-clock, low-voltage mode when working in 2D. I'm not sure which cards can do that, but the high consumption indicates that they don't.
I meant they need to be smart enough to sense they don't need to render 60fps from scratch when nothing happens and put the related resources to sleep.
 
Joined
Jan 14, 2019
Messages
12,344 (5.75/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Why not? If you can save $100 a month you can buy a new card in a year. Of course you can only save so much if you minimize your expenditures, and part of that is making sure the power bill is only as high as it has to be.
If you have to make a conscious effort to save that money, then you'd be much better off saving it for more useful expenses, like unexpected medical/dentist appointments, car servicing, fixing your house, rising food prices, a holiday, etc.

If $5-10 a month extra on your power bill is too much, then you have better things to spend that $100 saving on. Or if you don't care about that $100, then why do you care about $5?

Also, what about your fully loaded power consumption, which is hundreds of watts higher on a flagship GPU than on a midrange one? How is that suddenly not a problem?

Not all games are AAA titles that need a new card, and you don't need everything to run at 4k on ultra so you can play it.
Then you don't need a flagship GPU with a 4K monitor to begin with. Simple. ;)
 