I did some more testing using this video:
The native resolution of my monitor is 1440p@144Hz.
Results:
| Video Mode | SDR (HDR disabled in Windows) | HDR |
|---|---|---|
| 4kp60 | 56W | 59W |
| 1440p60 | 42W | 57W |
| 1080p60 | 41W | 44W - 57W |
| 720p60 | 41W | 43W |
HDR 1080p60 is somewhat strange: the power consumption jumps between 44W (which would be in line with the other results) and 57W. This matches the memory clock, which also jumps between 200 MHz and 900 MHz. It looks like the card cannot decide which power state to use for the VRAM in that mode.
I think this is at least somewhat related to high-tier cards having more VRAM. Decoding video can be quite memory-bandwidth intensive: just the raw RGB output data of the decoder at 4kp60 10-bit is 3840 * 2160 * 3 * 10 bit * 60 Hz / 8 bit/byte / 2^30 byte/GB ≈ 1.74 GB/s.
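As a quick sanity check on that arithmetic, here is a small Python sketch. It only computes the raw decoder output rate for full RGB output; real decoders usually write 4:2:0 planar frames, which would be roughly half of this, so treat it as an upper-bound estimate.

```python
# Rough estimate of the raw output bandwidth of a video decoder.
# Assumes 3 full-resolution color channels (no chroma subsampling),
# so this is a worst-case figure.

def decoder_output_bandwidth(width, height, channels, bits_per_channel, fps):
    """Return the raw output data rate in GB/s (using 2^30 bytes per GB)."""
    bits_per_frame = width * height * channels * bits_per_channel
    bytes_per_second = bits_per_frame * fps / 8
    return bytes_per_second / 2**30

# The 4kp60 10-bit case from above
print(f"{decoder_output_bandwidth(3840, 2160, 3, 10, 60):.2f} GB/s")  # ~1.74
```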
The problem is that you cannot disable parts of the VRAM: all of it has to be powered, even if you only use 10% of it. You can clock it down, but it will still draw some power, and usually the same clock applies to all DRAM chips. The same goes for the memory controllers and caches connected to those chips; they also have to be powered all the time. The VRAM clock required to decode a video (or rather, the memory bandwidth) is the same regardless of how much VRAM you have. For example, if you need 900 MHz to decode 4K video, you have to clock all of your VRAM at 900 MHz whether the card has 4 or 24 GB, and the 24 GB card will draw a lot more power than the 4 GB card.
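To illustrate that scaling with a toy model (the per-chip power and chip capacity below are made-up placeholder numbers, not measurements): the required clock is fixed by the bandwidth, so the only thing that changes between cards is how many chips have to run at that clock.

```python
# Toy model, NOT real numbers: assumes every DRAM chip on the card runs at
# the same clock and draws some fixed power at that clock, regardless of
# how much of its capacity is actually in use.

WATTS_PER_CHIP_AT_900MHZ = 1.5   # hypothetical per-chip power at 900 MHz
GB_PER_CHIP = 2                  # hypothetical chip capacity

def vram_power(total_vram_gb):
    chips = total_vram_gb / GB_PER_CHIP
    return chips * WATTS_PER_CHIP_AT_900MHZ

for vram in (4, 24):
    print(f"{vram} GB card: ~{vram_power(vram):.0f} W for VRAM at 900 MHz")

# Both cards need the same clock for the same decode workload,
# but the 24 GB card pays for six times as many chips.
```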
Nvidia overall does better in that regard, but you can still see that their high-end cards (with more VRAM) also need more power.
But this is just my guess anyway.