Monday, March 6th 2023
Intel Quietly Fixes High Multi-monitor Power Draw of Arc GPUs
Without mentioning it in its driver change-log, Intel Graphics has quietly addressed the unusually high power draw of its Arc A-series GPUs in multi-monitor setups. The older 101.4091 drivers had a typical single-monitor idle power draw of around 11 W, which shot up to around 40 W at idle in multi-monitor setups. In our own launch-day review of the Arc A770, we logged a 44 W multi-monitor power draw. Intel now claims that the multi-monitor idle power draw has been tamed, with the 101.4146 drivers the company released last week lowering it to "8 to 9 W" for multi-monitor setups and "7 to 8 W" for single-monitor ones.
Sources:
Intel GPU Issue Tracker, VideoCardz
48 Comments on Intel Quietly Fixes High Multi-monitor Power Draw of Arc GPUs
They don't give up, and it's far from perfect... but who knows?
I was gently poking fun at them when they started with their Arc GPUs, and now it seems they're doing a serious job with it.
In short, no: Intel did not fix high multi-monitor power draw for most multi-monitor users.
It was confirmed by people who use 2 screens... yeah, it is still not fixed for 3- and 4-screen builds, but small steps like that are better than nothing.
It is nice to see that, with its current performance (and price), the A750 is literally a better pick for 1440p than the RX 6650 XT in a "cost per frame" chart.
But good it's been fixed now. Shameful it wasn't mentioned, though, as without a public record of the problem, unaware users won't realise the impact on their bills.
If you used your computer at idle for 8 hours a day, 365 days a year, the roughly 30 W saved by the new driver works out to 87,600 Wh. That is 87.6 kWh. At current German electricity prices of $0.40 per kWh, that is about $35 in electricity per year.
A new A750 is $250.
I shouldn't have to do the math every single time the use of electricity comes up. Can people please learn basic multiplication and division already? If electric bills are a major concern for you, you cannot afford the hardware in question. PERIOD. The math DOES. NOT. WORK. OUT.
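For anyone who wants to re-run the numbers, here is the same back-of-the-envelope estimate as a short script; the 40 W and ~10 W idle figures, 8 hours per day, and the $0.40 per kWh German rate are simply the assumptions quoted in the posts above:

```python
# Back-of-the-envelope annual cost of the old multi-monitor idle power draw
# versus the new driver. All inputs are assumptions from the discussion above.
old_idle_w = 40.0        # old driver, multi-monitor idle (W)
new_idle_w = 10.0        # new driver, roughly (W)
hours_per_day = 8
days_per_year = 365
price_per_kwh = 0.40     # approximate German residential rate, in USD

extra_kwh = (old_idle_w - new_idle_w) * hours_per_day * days_per_year / 1000
print(f"Extra energy per year: {extra_kwh:.1f} kWh")               # 87.6 kWh
print(f"Extra cost per year:   ${extra_kwh * price_per_kwh:.2f}")  # $35.04
```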
My 3080 pulls 15-20 W idle on a single 4K screen. But my 6700 XT pulls about 7-9 W idle driving 2x 4K screens.
Radeon 7000 still has stupidly high consumption on dual displays, doesn't it? They did say a year's worth, not 8 hours a day for a year. But still nowhere close.
If it wasn't a problem they wouldn't have fixed it :laugh:
There may very well be good reasons why some of these issues are never fixed: hardware or software changes that would be too costly or complicated no matter how large the software development teams are. There are plenty of things that have never been addressed by all of these companies, like how Nvidia still doesn't support 8-bit color dithering on their GPUs to this very date. I was shocked when I switched to AMD at how much less noticeable the color banding was. Why is that? Supposedly Nvidia has many more resources, yet this has never been changed. You buy a $2,000 card that has worse color reproduction than a $200 one; what excuse is there for that?
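For readers unfamiliar with the technique: dithering adds a tiny amount of noise before an image is quantized to 8 bits per channel, which breaks the wide, flat steps of a smooth gradient into fine grain the eye barely notices. A minimal sketch of the idea in NumPy, purely illustrative and not how any actual GPU driver implements it:

```python
import numpy as np

# A smooth horizontal ramp, 3840 pixels wide, values in 0..1.
gradient = np.linspace(0.0, 1.0, 3840)

# Hard quantization to 8 bits: each level covers a wide run of pixels (a band).
plain = np.round(gradient * 255) / 255

# Add +/- half an 8-bit step of random noise before quantizing (dithering).
rng = np.random.default_rng(0)
noise = (rng.random(gradient.size) - 0.5) / 255
dithered = np.round((gradient + noise) * 255) / 255

def longest_run(values):
    """Length of the longest run of identical consecutive values."""
    longest = run = 1
    for prev, cur in zip(values[:-1], values[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest

print("widest band without dithering:", longest_run(plain), "pixels")
print("widest band with dithering:   ", longest_run(dithered), "pixels")
```

On the plain gradient each 8-bit level spans roughly 15 identical pixels, which reads as a visible band; with the dither the runs become much shorter and the banding dissolves into noise.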
40 W on the RX 480
www.techpowerup.com/review/amd-rx-480/22.html
34 W on the RX 5600 XT
www.techpowerup.com/review/powercolor-radeon-rx-5600-xt-itx/31.html
85 W on the RX 7900 XT
www.techpowerup.com/review/amd-radeon-rx-7900-xtx/37.html
AMD still releases the latest drivers for the RX 480 despite its age, whereas Intel Gen11 graphics are years newer but aren't supported anymore. But if Intel does this much better than AMD at graphics fixes, maybe a short support life from Intel is better than a long one from AMD.