
4080 vs 7900XTX power consumption - Optimum Tech

Agreed, from what I can see the majority of single-monitor setups are fixed. The issue remains for certain dual-monitor configurations.


I also tried swapping cables. I bought brand new ones from Amazon and they made no difference prior to this driver update; I still saw 50 watts at idle with them.

These are the cables I picked up here.

View attachment 305723
I am not disputing that this driver update has drastically improved idle power draw, but I am thinking about people who trust the cable that comes with the monitor.
 
60 Hz on a 144 Hz monitor is really not a great way to roll. If I ever cap framerates, I do it at half the refresh rate; 60 isn't a fraction of 144 that's beneficial to sync to. Alternatively, you could dial the monitor itself down to 120 Hz.

Most if not all 144 Hz monitors probably have VRR, so you can just cap the framerate to 60 or any other arbitrary number and it will be fine; you don't have to use rates that evenly divide the max refresh rate.
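To put rough numbers on that, here's a quick Python sketch (illustrative values only, not measurements from any particular setup) of what happens when a 60 FPS cap meets a fixed 144 Hz scanout grid without VRR:

```python
import math

# Rough sketch (illustrative, not anyone's benchmark): why a 60 FPS cap
# paces unevenly on a fixed 144 Hz refresh, but evenly once VRR matches it.

REFRESH_HZ = 144                 # fixed refresh rate of the monitor
CAP_FPS = 60                     # arbitrary frame cap

refresh_ms = 1000 / REFRESH_HZ   # ~6.94 ms per scanout cycle
frame_ms = 1000 / CAP_FPS        # ~16.67 ms per rendered frame

for n in range(1, 5):
    ready = n * frame_ms
    # Without VRR, the finished frame waits for the next fixed cycle boundary:
    shown = math.ceil(ready / refresh_ms) * refresh_ms
    print(f"frame {n}: ready {ready:6.2f} ms, shown {shown:6.2f} ms, "
          f"wait {shown - ready:4.2f} ms")

# The wait varies frame to frame (that variation is the judder). With VRR,
# the panel refreshes when the frame is ready, so the wait disappears and
# any cap inside the range paces evenly.
```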
 
I am not disputing that this driver update has drastically improved idle power draw, but I am thinking about people who trust the cable that comes with the monitor.
You are not wrong; people should definitely test cables while troubleshooting and not just rely on what comes in the box.
 
That strongly suggests that it's a monitor-specific issue.
It's not. When a video card cannot determine that it's safe to throttle back the GPU or VRAM, it keeps them running faster, just to be on the safe side. Most times, it's the VRAM that runs at full speed.
It's not an issue with the monitor (unless the monitor somehow communicates faulty data - but then again, if it does, how did it get its "FreeSync" logo?), it's an issue with the driver not dealing properly with all the data available.
 
It's not. When a video card cannot determine that it's safe to throttle back the GPU or VRAM, it keeps them running faster, just to be on the safe side. Most times, it's the VRAM that runs at full speed.
It's not an issue with the monitor (unless the monitor somehow communicates faulty data - but then again, if it does, how did it get its "FreeSync" logo?), it's an issue with the driver not dealing properly with all the data available.
It's mostly a matter of power states, in my experience: how finely these are tuned and how they relate to the current load at the desktop refresh rate/resolution. The GPU will sit on the safe side to avoid any latency issues.

This is why I suggest dropping the monitor refresh rate to a lower level too; you might just be able to nudge the GPU into a saner power state (= VRAM frequency) at, say, 120 Hz on that mentioned 144 Hz 4K monitor.
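One commonly cited reason the driver plays it safe: VRAM clock switches are generally done during vertical blanking, and the higher the refresh rate, the less blanking time there is per frame. A back-of-the-envelope Python sketch, with assumed (not measured) line counts:

```python
# Back-of-the-envelope sketch of the VBLANK angle. Assumption: the driver
# can only reclock VRAM during vertical blanking. The 56-line blanking
# interval below is illustrative, not taken from any real monitor's timings.

def vblank_window_ms(v_active, v_blank, refresh_hz):
    """Time per frame spent in vertical blanking."""
    v_total = v_active + v_blank
    line_time_ms = 1000 / (refresh_hz * v_total)
    return v_blank * line_time_ms

for hz in (60, 120, 144):
    # 2160 active lines (4K) plus an assumed ~56-line blanking interval
    window = vblank_window_ms(2160, 56, hz)
    print(f"{hz:3d} Hz: ~{window:.3f} ms of VBLANK per frame")

# If the window gets too short for a stable reclock, the driver plays it
# safe and leaves VRAM at full speed - the behaviour described above.
```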

Most if not all 144 Hz monitors probably have VRR, so you can just cap the framerate to 60 or any other arbitrary number and it will be fine; you don't have to use rates that evenly divide the max refresh rate.
FreeSync on isn't always a preferable setup, especially on the desktop. If the monitor is midrange rather than top-end, you haven't always got access to BFI or the desired brightness setting on top of FreeSync; my UW is an example of that. On top of that, variable refresh also results in variable input latency. My experience, especially in gaming, is that a steady refresh rate + FPS yields the best results, even if VRR is a great tool to have.
 
It's mostly a matter of power states, in my experience: how finely these are tuned and how they relate to the current load at the desktop refresh rate/resolution. The GPU will sit on the safe side to avoid any latency issues.

This is why I suggest dropping the monitor refresh rate to a lower level too; you might just be able to nudge the GPU into a saner power state (= VRAM frequency) at, say, 120 Hz on that mentioned 144 Hz 4K monitor.
I have usually heard about this issue in conjunction with a dual or multi-monitor setup. Though one monitor having a variable refresh rate might present the same challenges, I guess.

In this particular case, Nvidia and their more tightly specced G-Sync do seem to have the upper hand.
 
I have usually heard about this issue in conjunction with a dual or multi-monitor setup. Though one monitor having a variable refresh rate might present the same challenges, I guess.

In this particular case, Nvidia and their more tightly specced G-Sync do seem to have the upper hand.
Yeah, the high idle power consumption happens more often with multi-monitor setups. For example:

View attachment 305794


AW3423DWF - 3440x1440 @ 165 Hz
OLED42C2PUA - 3840x2160 @ 120 Hz

If I run either one of them at 60 Hz while keeping the other at its maximum refresh rate, it's perfectly fine:

View attachment 305808


Both are with Firefox and Edge running (no videos loaded in any tab) and Discord minimized (in an active VC) in the background.

AMD is getting there. We just need sub-40 W idle (or sub-60 W with 3+ displays) with all monitors at their proper refresh rates, and we'll be good to go.

EDIT: I removed a duplicate image.
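For a rough sense of the scanout load behind those numbers, here's some simple arithmetic (active pixels only; blanking overhead is ignored, so real link rates are higher):

```python
# Illustrative arithmetic only: the active pixel rate the card scans out
# for the two monitors listed above.

setups = [
    ("AW3423DWF", 3440, 1440, 165),
    ("OLED42C2PUA", 3840, 2160, 120),
]

total = 0.0
for name, w, h, hz in setups:
    mpix_s = w * h * hz / 1e6   # megapixels per second of active scanout
    total += mpix_s
    print(f"{name}: {mpix_s:,.0f} Mpix/s at {hz} Hz")
print(f"combined: {total:,.0f} Mpix/s")

# Dropping either panel to 60 Hz removes a few hundred Mpix/s of steady
# scanout load, which lines up with the idle-power drop in the screenshots.
```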
 
Low quality post by Assimilator
Yeah, the high idle power consumption happens more often with multi-monitor setups. For example:

View attachment 305794

AW3423DWF - 3440x1440 @ 165 Hz
OLED42C2PUA - 3840x2160 @ 120 Hz

If I run either one of them at 60 Hz while keeping the other at its maximum refresh rate, it's perfectly fine:

View attachment 305808

Both are with Firefox and Edge running (no videos loaded in any tab) and Discord minimized (in an active VC) in the background.

AMD is getting there. We just need sub-40 W idle (or sub-60 W with 3+ displays) with all monitors at their proper refresh rates, and we'll be good to go.

EDIT: I removed a duplicate image.
Thank you for doing the basic modicum of testing that AMD is apparently completely incapable of.
 
They were lower wattage, but also lower performance, so not really.

From the latest GPU review, tested with the 2023.2 bench.

View attachment 305435

The 6800 is in a good spot since it's lower clocked, but the rest are the same or worse than the equivalent NVIDIA series.

It's more impressive when you consider that RDNA2 was on TSMC 7 nm and Ampere was on Samsung 8 (10 nm).
It's so stupid, because the stock voltages are insanely high.

I can lose 50 MHz and shave 150 W off my GPU, or drop ~110 W at the same boost clock. They pushed these cards insanely hard, to stupid levels.

Most if not all 144 Hz monitors probably have VRR, so you can just cap the framerate to 60 or any other arbitrary number and it will be fine; you don't have to use rates that evenly divide the max refresh rate.
This is correct. If you cap within the VRR range you're good to go, thanks to the large vertical blanking total being used: 60 FPS on a 144 Hz monitor still runs with the timings of 144 Hz, the panel just extends the blanking interval during the spare time, resulting in a fairly good image that's still displayed faster.

I don't have the exact wording at the top of my head, but you can think of it as getting the display latency of your max refresh rate anywhere within the VRR range (so at 48 FPS on a display that does 48-144 Hz, you're getting the display latency of 144 Hz).

It gets weird below that range: 47 FPS doubles to 94 Hz, but the duplication adds a frame of latency, so 48 FPS/Hz (20.83 ms) displays faster than 47 FPS/94 Hz.

94 Hz = 10.64 ms per cycle, times two for 21.28 ms *(see below)
48 FPS is ready in 20.83 ms

144 Hz is 6.94 ms per cycle

6.94 - 13.89 - 20.83 - 27.78 ms

Both would end up being displayed at the same time here, in that fourth refresh cycle. But as far as system responsiveness goes, the CPU and GPU were asked to render the next frame later in that fourth cycle (or in the 2nd/3rd with pre-rendered frames), so the next frame is 20.83 ms/13.89 ms etc. newer than under a traditional vsync cycle, and it feels a lot less laggy.

*Again here, because of the way the adaptive sync tech works, the CPU and GPU are asked to render a new frame based on the second cycle's timing - so you're getting a duplicated frame on the display, but the CPU and GPU aren't waiting that extra time; they act on the second cycle. This is why VRR duplicate frames don't add latency the way DLSS 3 or the motion-smoothing modes on a TV do.
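Here's the arithmetic above in one small Python sketch (simple division only; the LFC-doubling interpretation is this post's reading, not an official spec):

```python
# Reproducing the numbers above. All values are plain arithmetic; the
# 47 FPS -> 94 Hz doubling is the LFC behaviour as described in this post.

def ms(hz):
    """Duration of one cycle at the given rate, in milliseconds."""
    return 1000 / hz

print(f"144 Hz cycle:             {ms(144):.2f} ms")                      # 6.94
print(f"94 Hz cycle (47 FPS x2):  {ms(94):.2f} ms, two cycles = {2 * ms(94):.2f} ms")
print(f"48 FPS frame time:        {ms(48):.2f} ms")                       # 20.83

# The fixed 144 Hz vsync boundaries listed above:
grid = [round(n * ms(144), 2) for n in range(1, 5)]
print("144 Hz cycle boundaries:", grid)   # [6.94, 13.89, 20.83, 27.78]
```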


It also gets weird because many displays have motion smearing and overshoot at lower refresh rates, so 47 FPS at 94 Hz could be picture-perfect while 48 FPS, though technically faster, brings in the overshoot issues. My monitor has this; reviews point out that its best overdrive mode causes artifacts below 90 Hz.
 