Monday, March 6th 2023

Intel Quietly Fixes High Multi-monitor Power Draw of Arc GPUs

Without mentioning it in its driver change-log, Intel Graphics has quietly addressed the issue of unusually high power draw for its Arc A-series GPUs in multi-monitor setups. The older 101.4091 drivers had a typical single-monitor idle power draw of around 11 W, which would shoot up to 40 W at idle in multi-monitor setups. In our own launch-day review of the Arc A770, we logged a 44 W multi-monitor power draw. Intel now claims that the multi-monitor idle power draw has been tamed, with the latest 101.4146 drivers the company released last week lowering it to "8 to 9 W" for multi-monitor setups, and "7 to 8 W" for single-monitor ones.
Sources: Intel GPU Issue Tracker, VideoCardz

48 Comments on Intel Quietly Fixes High Multi-monitor Power Draw of Arc GPUs

#1
Hyderz
This is awesome news as Intel continues to improve its discrete GPUs.
Posted on Reply
#2
TumbleGeorge
Mmm, "multi"... for some combinations of just two monitors, that's not exactly multi. Intel has a lot of work to do before the drivers are good enough for normal power consumption in a real multi-monitor setup at "idle".
Posted on Reply
#3
ZoneDymo
This is just good stuff, and it's really smart IMO to keep this up, as it will make the next generation more and more of a viable choice, with the added incentive of actually wanting to support a third player in the space.
Posted on Reply
#4
KrazyT
Good to see Intel still improving drivers!
They don't give up, and it's far from perfect... but who knows?
I was gently poking fun at them when they started with their Arc GPUs, and now it seems they're doing a serious job with it.
Posted on Reply
#5
konga
I recommend that everyone reading this click on the VideoCardz link and actually read the table in that article.

In short: no, Intel did not fix high multi-monitor power draw for most multi-monitor users.
Posted on Reply
#6
Karti
konga: I recommend that everyone reading this click on the VideoCardz link and actually read the table in that article.

In short: no, Intel did not fix high multi-monitor power draw for most multi-monitor users.
Emm... are you sure about that?

It was confirmed by people who use two screens... yeah, it's still not fixed for three- and four-screen builds, but small steps like that are better than nothing.
Posted on Reply
#7
Karti
KrazyT: Good to see Intel still improving drivers!
They don't give up, and it's far from perfect... but who knows?
I was gently poking fun at them when they started with their Arc GPUs, and now it seems they're doing a serious job with it.
After the last HWU video about Intel Arc (with the latest updates), these cards really went up in people's eyes.

It is nice to see that with its current performance (and price), the A750 is literally a better pick for 1440p than the RX 6650 XT in a "cost per frame" chart.
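For context, "cost per frame" is simply the card's price divided by its average FPS across a benchmark suite. A minimal sketch of the calculation, using purely hypothetical placeholder prices and FPS figures rather than HWU's actual data:

# Cost per frame = price / average FPS; lower is better.
# The prices and FPS values below are hypothetical placeholders, not HWU's numbers.

def cost_per_frame(price_usd, avg_fps):
    return price_usd / avg_fps

hypothetical_cards = {
    "Card A": (250.0, 80.0),  # assumed price in USD, assumed average 1440p FPS
    "Card B": (300.0, 85.0),
}

for name, (price, fps) in hypothetical_cards.items():
    print(f"{name}: ${cost_per_frame(price, fps):.2f} per frame")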
Posted on Reply
#8
TumbleGeorge
So it's time to totally and officially reduce the prices of the entire RDNA 2 generation of video cards, once again! It's also time for the mid-range RDNA 3 graphics cards to hit the market.
Posted on Reply
#9
Bomby569
They fix their driver issues faster than AMD, like a decade faster :D
Posted on Reply
#10
dgianstefani
TPU Proofreader
Bomby569: They fix their driver issues faster than AMD, like a decade faster :D
Yeah, well, Intel, like NVIDIA, actually has a large software team.
Posted on Reply
#11
ZoneDymo
dgianstefani: Yeah, well, Intel, like NVIDIA, actually has a large software team.
I get what you are saying, but can we please stop acting like AMD is some mom-and-pop shop?
Posted on Reply
#12
dgianstefani
TPU Proofreader
ZoneDymo: I get what you are saying, but can we please stop acting like AMD is some mom-and-pop shop?
They're not, which is why there's no excuse.
Posted on Reply
#13
docnorth
TumbleGeorge: Mmm, "multi"... for some combinations of just two monitors, that's not exactly multi. Intel has a lot of work to do before the drivers are good enough for normal power consumption in a real multi-monitor setup at "idle".
Indeed not totally fixed, but still a positive reaction.
Posted on Reply
#14
chrcoluk
Was this issue mentioned in the review of the product? Because to me that's quite nasty in the era of the energy crisis.

But it's good it's been fixed now. It's a shame it wasn't mentioned, though, as without a public record of the problem the unaware won't realise the impact on their bills.
Posted on Reply
#15
Wirko
Higher power draw doesn't occur only on Intel cards, so what's the general reason? Frame buffer bandwidth too high for the VRAM and IMC to enter a lower power state? It's possible that Intel can't fix everything in software, or even with a hardware revision.
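As a rough illustration of that idea, here's a back-of-envelope scanout-bandwidth estimate; the resolutions, refresh rate and uncompressed 32-bit framebuffer are assumptions for the sketch, not measurements of any particular card:

# Display scanout needs width * height * bytes_per_pixel * refresh_rate of
# continuous reads. Assumes uncompressed 4-byte pixels (real GPUs may compress).

def scanout_gbps(width, height, hz, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * hz / 1e9

single_4k60 = scanout_gbps(3840, 2160, 60)
dual_4k60 = 2 * single_4k60

print(f"Single 4K60 monitor: ~{single_4k60:.1f} GB/s of constant reads")
print(f"Dual 4K60 monitors:  ~{dual_4k60:.1f} GB/s of constant reads")

# If that sustained traffic (plus vblank timing constraints) exceeds what the
# lowest VRAM/IMC power state can service, the memory clock stays high and
# "idle" power climbs -- something a driver alone may not be able to fix.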
Posted on Reply
#16
TheinsanegamerN
chrcoluk: Was this issue mentioned in the review of the product? Because to me that's quite nasty in the era of the energy crisis.

But it's good it's been fixed now. It's a shame it wasn't mentioned, though, as without a public record of the problem the unaware won't realise the impact on their bills.
If 30 W impacts you, you don't have the budget for a $350 GPU in the first place, KTHNX.
Posted on Reply
#17
chrcoluk
TheinsanegamerN: If 30 W impacts you, you don't have the budget for a $350 GPU in the first place, KTHNX.
For reference, a year's worth of 30 watts pays for an Arc A750 in a year.
Posted on Reply
#18
Vayra86
dgianstefani: They're not, which is why there's no excuse.
Amen, I just cannot understand why they drop the ball every odd gen. It's so, so stupid; they might as well start tossing bricks at their own offices instead.
Posted on Reply
#19
TheinsanegamerN
chrcoluk: For reference, a year's worth of 30 watts pays for an Arc A750 in a year.
*sigh* Nope. Not even close. People are REALLY bad at math.

If you used your computer at idle for 8 hours a day, 365 days a year, that 30 W difference versus the new driver would be 87,600 Wh, i.e. 87.6 kWh. At current German electricity prices of $0.40 per kWh, that is about $35 in electricity per year.

A new A750 is $250.

I shouldn't have to do the math every single time the use of electricity comes up. Can people please learn basic multiplication and division already? If electric bills are a major concern for you, you cannot afford the hardware in question. PERIOD. The math DOES. NOT. WORK. OUT.
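That arithmetic, spelled out (same assumptions as above: a 30 W delta, 8 hours a day, 365 days, and roughly $0.40 per kWh):

# Annual cost of an extra 30 W of idle draw, using the assumptions above.

def annual_cost_usd(extra_watts, hours_per_day, price_per_kwh):
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

print(f"8 h/day:  ${annual_cost_usd(30, 8, 0.40):.2f} per year")   # ~$35
print(f"24 h/day: ${annual_cost_usd(30, 24, 0.40):.2f} per year")  # ~$105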
Posted on Reply
#20
Arkz
I wish everyone would sort out their idle consumption. My 5600X draws about 40 W at idle for some reason, and letting it drop the clocks low or idle at 4.5 GHz seems to make little difference. The 3600 is about 30 W at idle.

My 3080 pulls 15-20 W at idle on a single 4K screen, but my 6700 XT pulls about 7-9 W at idle driving two 4K screens.

Radeon 7000 still has stupidly high consumption on dual displays, doesn't it?
TheinsanegamerN: *sigh* Nope. Not even close. People are REALLY bad at math.

If you used your computer at idle for 8 hours a day, 365 days a year, that 30 W difference versus the new driver would be 87,600 Wh, i.e. 87.6 kWh. At current German electricity prices of $0.40 per kWh, that is about $35 in electricity per year.

A new A750 is $250.

I shouldn't have to do the math every single time the use of electricity comes up. Can people please learn basic multiplication and division already? If electric bills are a major concern for you, you cannot afford the hardware in question. PERIOD. The math DOES. NOT. WORK. OUT.
They did say a year's worth, not 8 h a day for a year. But still nowhere close.
Posted on Reply
#21
TheinsanegamerN
Arkz: They did say a year's worth, not 8 h a day for a year. But still nowhere close.
Then it's still wrong. At 24 hrs a day it's still only about $105 in a year. And that's idling 24 hrs a day. If you are buying a machine, letting it idle 24 hrs a day, and then whining about power use, there is something seriously wrong with you.
Posted on Reply
#22
ThrashZone
Hi,
If it wasn't a problem, they wouldn't have fixed it :laugh:
Posted on Reply
#23
Vya Domus
dgianstefani: which is why there's no excuse.
How do you know that? My God, so many software engineers on here who know better than the ones the multi-billion-dollar companies hire.

There may very well be good reasons why some of these issues are never fixed: hardware or software changes that would be too costly or complicated no matter how large their software development teams are. There are plenty of things which have never been addressed by all of these companies, like how Nvidia still doesn't support 8-bit color dithering to this day on their GPUs. I was shocked when I switched to AMD at how much less noticeable the color banding was. Why is that? Supposedly Nvidia has many more resources, yet this has never been changed. You buy a $2,000 card that has worse color reproduction than a $200 one; what excuse is there for that?
Posted on Reply
#24
ThomasK
Karti: Emm... are you sure about that?

It was confirmed by people who use two screens... yeah, it's still not fixed for three- and four-screen builds, but small steps like that are better than nothing.
This table explains why this otherwise relevant improvement was made "quietly".
Posted on Reply
#25
Squared
AMD has had an issue with multi-monitor power draw being very high, much higher than single-monitor "idle", for generations. (But who buys an expensive graphics card and only connects one monitor? I think baseline "idle" should include two monitors.)

40 W on the RX 480
www.techpowerup.com/review/amd-rx-480/22.html

34 W on the RX 5600 XT
www.techpowerup.com/review/powercolor-radeon-rx-5600-xt-itx/31.html

85 W on the RX 7900 XT
www.techpowerup.com/review/amd-radeon-rx-7900-xtx/37.html

AMD still releases the latest drivers for the RX 480 despite its age, whereas Intel Gen11 graphics are years newer but aren't supported anymore. But if Intel does graphics fixes this much better than AMD, maybe a short support life from Intel is better than a long one from AMD.
Posted on Reply