Monday, March 6th 2023

Intel Quietly Fixes High Multi-monitor Power Draw of Arc GPUs

Without mentioning it in its driver change-log, Intel Graphics has quietly addressed the unusually high power draw of its Arc A-series GPUs in multi-monitor setups. The older 101.4091 drivers had a typical single-monitor idle power draw of around 11 W, which would shoot up to 40 W at idle in multi-monitor setups. In our own launch-day review of the Arc A770, we logged a 44 W multi-monitor power draw. Intel now claims that multi-monitor idle power draw has been tamed, with the latest 101.4146 drivers the company released last week lowering it to "8 to 9 W" for multi-monitor setups, and "7 to 8 W" for single-monitor ones.
Sources: Intel GPU Issue Tracker, VideoCardz

48 Comments on Intel Quietly Fixes High Multi-monitor Power Draw of Arc GPUs

#26
Vya Domus
SquaredBut who buys an expensive graphics card and only connects one monitor?
I do.

Well, not for long, I ordered another monitor, so I'll see how much of an issue this really is. If I am to take every comment on here about how bad the idle power consumption is (typically from non-AMD users) and how bad AMD is as a result, I am expecting all hell to break loose at the very least, or I am going to be very disappointed.
Posted on Reply
#27
AnotherReader
SquaredAMD has had an issue with multi-monitor power draw being very high, much higher than "idle" for generations. (But who buys an expensive graphics card and only connects one monitor? I think baseline "idle" should include two monitors.)

85W on the RX 7900 XT
www.techpowerup.com/review/amd-radeon-rx-7900-xtx/37.html
The multi-monitor power draw is more nuanced than a single test would tell. The 7000 series, while still having relatively high multi-monitor power consumption, is now lower than at launch: 40 W for the 7900 XT. They also don't have high power draw if the monitors have matching refresh rates and resolutions. This holds true for multiple 4K monitors too. In that situation, Intel has the higher power draw, according to their own notes.

Posted on Reply
#28
playerlorenzo
Man, these Arc cards are looking more promising day by day with all these driver improvements. They could shape up to be a third challenger to the AMD/Nvidia duopoly.
Posted on Reply
#29
Assimilator
TheinsanegamerNIf 30w impacts you you dont have the budget for a $350 GPU in the first place KTHNX.
What a childish, selfish statement to make. Multiply those thirty watts by the number of GPUs sold and all of a sudden it makes an enormous difference. Wasting energy is never acceptable, especially if the competition has figured out how not to.
Vya DomusHow do you know that? My God, so many software engineers on here who know better than the ones the multi-billion-dollar companies hire.

There may very well be good reasons why some of these issues are never fixed, hardware or software changes that would be too costly or complicated no matter how large their software development teams are. There are plenty of things which have never been addressed by all of these companies, like how Nvidia still doesn't support 8-bit color dithering to this very date on their GPUs. I was shocked when I switched to AMD at how much less noticeable the color banding was. Why is that? Supposedly Nvidia has many more resources, yet this has never been changed; you buy a $2,000 card that has worse color reproduction than a $200 one, what excuse is there for that?
There is no good reason for the same, incredibly obvious, defect to persist through AT LEAST THREE GENERATIONS of product. Zero. None. Stop making excuses for AMD, and stop using whataboutism to try to deflect attention away from their persistent failure.
Posted on Reply
#30
Arkz
SquaredAMD has had an issue with multi-monitor power draw being very high, much higher than "idle" for generations. (But who buys an expensive graphics card and only connects one monitor? I think baseline "idle" should include two monitors.)

40W on the RX 480
www.techpowerup.com/review/amd-rx-480/22.html

34W on the RX 5600 XT
www.techpowerup.com/review/powercolor-radeon-rx-5600-xt-itx/31.html

85W on the RX 7900 XT
www.techpowerup.com/review/amd-radeon-rx-7900-xtx/37.html

AMD still releases the latest drivers for the RX 480 despite its age, whereas Intel Gen11 graphics are years newer but aren't supported anymore. But if Intel does this much better than AMD at graphics fixes, maybe a short support life from Intel is better than a long one from AMD.
They obviously fix some of it, given that the other post shows 19 W idle on a 7900 XT driving two 4K screens. My 6700 XT idles at 9 W with two 4K screens.
Posted on Reply
#31
chrcoluk
TheinsanegamerN*sigh* nope. Not even close. People are REALLY bad at math.

If you used your computer at idle for 8 hours a day, 365 days a year, that would be 87,600 Wh vs. the new driver. That is 87.6 kWh. At current German electricity prices of $0.40 per kWh, that is $35 in electricity per year.

A new A750 is $250.

I shouldn't have to do the math every single time the use of electricity comes up. Can people please learn basic multiplication and division already? If electric bills are a major concern for you, you cannot afford the hardware in question. PERIOD. The math DOES. NOT. WORK. OUT.
Sure, I can do the maths.

PC on for 16 hours a day at 30 watts, at 55p a unit. That's 27p a day; multiplied by 365 days that's £99, and adding VAT it's just shy of £120. But I was thinking of the £150 card, so my mistake there (not a full mistake if the £150 one is also affected).
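The arithmetic being argued over here is just watts → kWh → currency. A minimal sketch, using the illustrative figures cited in this thread (a 30 W difference in idle draw, the quoted German and UK tariffs) rather than any measured values:

```python
# Annual cost of extra GPU idle power draw.
# All inputs are illustrative numbers taken from this thread, not measurements.

def annual_cost(watts, hours_per_day, price_per_kwh):
    """Convert a wattage delta into a yearly energy cost."""
    kwh_per_day = watts * hours_per_day / 1000  # W * h -> kWh
    return kwh_per_day * 365 * price_per_kwh

# 30 W extra, 8 h/day, $0.40/kWh (the German tariff quoted above)
print(round(annual_cost(30, 8, 0.40), 2))   # -> 35.04

# 30 W extra, 16 h/day, 55p/kWh (the UK tariff quoted above)
print(round(annual_cost(30, 16, 0.55), 2))  # -> 96.36
```

Both posters' numbers check out to within rounding (the £99 figure rounds 26.4p/day up to 27p); the disagreement is about whether ~$35-£120 a year matters, not about the arithmetic.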

Curious, though, why does wasting power being brought up offend you so much?
Posted on Reply
#32
TumbleGeorge
chrcolukCurious, though, why does wasting power being brought up offend you so much?
Because he thinks it is his human right to be able to waste as much energy as he wants, as long as he can afford to pay for it. F*ck the natural environment?
Posted on Reply
#33
TheinsanegamerN
AssimilatorWhat a childish, selfish statement to make.
What a great start we are off to.
AssimilatorMultiply those thirty watts by the number of GPUs sold and all of a sudden it makes an enormous difference.
Bud, that's still nothing. We are heading into a world where using upwards of 10 kWh per day just to drive to work is perfectly acceptable for hundreds of millions of people, and you're worried about the 30 W idle consumption of a card that will sell, at best, 2-3 million units? Even if you want to limit the scope to "waste", the 30 W from these GPUs is such a nothingburger in terms of lost power. A single school building leaving its lights on overnight wastes more than that in a few minutes!

There's mountains, there's molehills, and then there's the grain of sand you are trying to turn into a molehill.
AssimilatorThere is no good reason for the same, incredibly obvious, defect to persist through AT LEAST THREE GENERATIONS of product. Zero. None. Stop making excuses for AMD, and stop using whataboutism to try to deflect attention away from their persistent failure. Wasting energy is never acceptable, especially if the competition has figured out how not to.
The competition has had similar issues for years; AMD only recently started fixing it (and it isn't totally fixed). Almost like drivers are hard. But I'm sure you could do a better job.
chrcolukSure, I can do the maths.

PC on for 16 hours a day at 30 watts, at 55p a unit. That's 27p a day; multiplied by 365 days that's £99, and adding VAT it's just shy of £120. But I was thinking of the £150 card, so my mistake there (not a full mistake if the £150 one is also affected).
So, yeah, like $105. If that's a lot of money to you, the first question is why are you buying a $350 card, and second, why on earth is it on, at IDLE, for 16 hours a day? Not being used, just at IDLE. Seems like you have a bigger issue with waste from improper use of a PC at that point. If you have a card like this in a work PC, then ideally it should be DOING something for that timeframe. If it's just doing web browsing or other non-heavy work, then having said card is pointless and a far more efficient APU should be in your PC instead.
chrcolukCurious, though, why does wasting power being brought up offend you so much?
The problem is people bring up idle power like it is going to bankrupt the world over a few watts. It's not, and claiming things like "the idle power from a year will cost as much as an A770" is totally incorrect. People are blowing this "idle power" issue WAY out of proportion, acting like it will take an entire paycheck and nobody but the rich can afford to run a PC, as if these cards were gas-guzzling hypercars. They're not.

People will then take that incorrect information and use it to make ignorant decisions about hardware: "oh, why would I buy an AMD card, look how much higher its idle usage is, I'd better get an Nvidia card instead for $200 more", in the process wasting nearly $200 out of ignorance of both math and money. They'll prattle on about how much more efficient their GPU is while ignoring the massive carbon ramifications of building a PC to play some bing bing wahoo. Gaming, at its core, is a massively wasteful hobby, and the moronic grandstanding about power use is supremely annoying.

Financial illiteracy is a major trigger. Like the imbeciles who bought Toyota Priuses on a 5-year loan at 8% interest because gas went to $5, instead of keeping their perfectly serviceable, paid-off vehicles, and then prance around talking about how much money they are "saving".
TumbleGeorgeBecause he thinks it is his human right to be able to waste as much energy as he wants, as long as he can afford to pay for it. F*ck the natural environment?
You have no idea what I think, and your uninformed rambling has no relation to what I think. I have on many occasions advocated for the use of nuclear power to eliminate fossil fuel usage and lamented the waste of resources in a variety of manners. If you think a couple dozen watts of electricity from cards for a niche hobby is a major issue, oh baby, just WAIT until I introduce you to the environmental catastrophe that is silicon chip production!
Posted on Reply
#34
TumbleGeorge
TheinsanegamerNjust WAIT until I introduce you to the environmental catastrophe that is silicon chip production!
I can't wait for such a thread in the forum. I don't know if it would be allowed at all, except in the Science & Technology section.
Posted on Reply
#35
trsttte
TheinsanegamerNWe are heading into a world where using upwards of 10 kWh per day just to drive to work is perfectly acceptable for hundreds of millions of people, and you're worried about the 30 W idle consumption of a card that will sell, at best, 2-3 million units
I'm very sorry you had to go there, because that's a very dumb statement: electric vehicles are insanely more efficient than gas-powered ones, and that's only accounting for the vehicle itself; the supply chain for the fuel source is also much better for electric than gas. When we all finally move to EVs, global energy needs will decrease substantially.

I'm with you that for a regular user the difference between 30 W and 10 W idle doesn't really move the needle enough to matter, but we should still criticize and applaud when companies fail at or fix these kinds of things. Like Assimilator said, it adds up; waste just for waste's sake is stupid.
TheinsanegamerNA single school building leaving its lights on overnight wastes more than that in a few minutes!
And we should fix that too!
TheinsanegamerNThe problem is people bring up idle power like it is going to bankrupt the world over a few watts. It's not, and claiming things like "the idle power from a year will cost as much as an A770" is totally incorrect. People are blowing this "idle power" issue WAY out of proportion, acting like it will take an entire paycheck and nobody but the rich can afford to run a PC, as if these cards were gas-guzzling hypercars. They're not.

People will then take that incorrect information and use it to make ignorant decisions about hardware: "oh, why would I buy an AMD card, look how much higher its idle usage is, I'd better get an Nvidia card instead for $200 more", in the process wasting nearly $200 out of ignorance of both math and money.

Financial illiteracy is a major trigger. Like the imbeciles who bought Toyota Priuses on a 5-year loan at 8% interest because gas went to $5, instead of keeping their perfectly serviceable, paid-off vehicles, and then prance around talking about how much money they are "saving".
"Show me on this diagram here where power efficiency hurt you" :D I think you're much more triggered than anyone else but whatever. People make stupid decisions everyday. Like we're talking about idle power but ignoring the part where the cards are not that efficient fps/W when running vs the competition, just like intel CPUs which are pretty terrible.

The cards had a problem that could be fixed, and it was fixed, at least partially, and that's great! Wow, so much drama over something so simple.
TheinsanegamerNIf you think a couple dozen watts of electricity from cards for a niche hobby is a major issue, oh baby, just WAIT until I introduce you to the environmental catastrophe that is silicon chip production!
Effort vs. impact: fixing the idle power of consumer products might not, generally speaking, have the greatest impact compared with industrial activities, but it's still something very much achievable with low effort.
Posted on Reply
#36
Assimilator
TheinsanegamerNI have on many occasions advocated for the use of nuclear power to eliminate fossil fuel usage and lamented the waste of resources in a variety of manners.
And yet you have a nuclear meltdown when someone else laments a waste of resources. So I'm gonna go ahead and say you're full of s**t.
Posted on Reply
#37
Vya Domus
AssimilatorThere is no good reason for the same
How do you know that? Like, for a fact I mean, not in a "I'm a fanboy who doesn't like AMD, so I am going to complain about something that doesn't even impact me in any way" kind of way. Do you think there is someone at AMD who, every time they get a bug report about the idle power thing, goes "Hmm, should we fix this? Nah."? They obviously work on fixing many other things, so they're not lazy; if there isn't a "good reason" not to fix it, why hasn't it been resolved?

By the way, do you even own an AMD card? Why do you even care?
Posted on Reply
#38
Mussels
Freshwater Moderator
Oh good
Intel Nvidia and AMD all have issues with this


After spending a lot of time with CRU on their forums (and Blur Busters) overclocking various displays, it seems to come down to how various monitors implement their timings - there are standards for LCDs, reduced blanking, and a second generation of reduced blanking

As you reduce the blanking more and more, you can get more and more bandwidth to the displays for higher refresh rates, but you also begin to run into these issues where GPUs can't identify how much performance they need to run these non-standard setups (hence many displays saying things like 144 Hz with "160 Hz OC")

As an example, my 4K 60 Hz displays support DP 1.2 and 1.4 respectively, but internally they both have the same limit of around 570 MHz at stock or 590 MHz overclocked - with reduced blanking that lets me get 65 Hz with 10-bit colour out of them, but they lose support for HDR and FreeSync by doing so, and some older GPUs here (like my 10-series GPUs) ramp up their idle clocks
Despite using the same panel, the same HDMI standard and the same bandwidth limits (despite different DP standards), they both implement different timings and blanking values.
  • Vertical Blanking Interval -- Known as VBI. The entire interval between refresh cycles. The sum of (Vertical Back Porch + Vertical Sync + Vertical Front Porch).
I understand the words, but the image apparently contains the visual explanation even if I can't really grasp its meaning - shrinking the 'porch' gives more bandwidth to the display, but less time for the GPU to do its thing, so they clock up to compensate




Ironically, these overclocking methods to get higher refresh rates from the factory can result in more blurring and smearing along with the higher GPU idle clocks, or compression artifacts like the Samsung panels are prone to on their high-refresh-rate displays (the 240 Hz displays have artifacting, but a custom refresh of 165 Hz is problem-free)


TL;DR: More idle time lets the pixels revert back to black/neutral before changing to the next colour, even at the same refresh rate, and GPUs can also benefit from that idle time to clock lower.
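The blanking trade-off above can be sketched with the standard pixel-clock formula: the clock a link must sustain is horizontal total × vertical total × refresh rate, where the "totals" include the front porch, sync, and back porch. The timing numbers below are hypothetical CVT-reduced-blanking-style values, not any particular monitor's EDID:

```python
# Pixel clock from display timings: shrinking the blanking (the "porch")
# lowers the clock needed for a given refresh rate, freeing bandwidth for
# higher refresh - at the cost of the idle time the TL;DR above describes.
# Timing values are illustrative, not from a real EDID.

def pixel_clock_mhz(h_active, h_blank, v_active, v_blank, refresh_hz):
    """Pixel clock = horizontal total * vertical total * refresh rate."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

# 4K 60 Hz with reduced-blanking-style totals (80 px horizontal, 62 lines vertical)
print(pixel_clock_mhz(3840, 80, 2160, 62, 60))   # roughly 522.6 MHz

# The same active area with a fatter, CRT-era blanking costs far more clock
print(pixel_clock_mhz(3840, 560, 2160, 90, 60))  # roughly 594 MHz
```

This is consistent with the ~570-590 MHz internal limits mentioned above: trimming blanking is exactly how a 65 Hz "overclock" fits inside the same clock budget.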
Posted on Reply
#39
Assimilator
Vya DomusHow do you know that ?
If NVIDIA can consistently get this right, then logically it's not impossible.
If AMD can fix it with subsequent driver updates, then logically it's not impossible.
If AMD can fix it with subsequent driver updates, then logically they can avoid it being a problem on release day.
Vya DomusLike, for a fact I mean, not in a "I'm a fanboy who doesn't like AMD so I am going to complain about something that doesn't even impact me in any way" kind of way.
Yes, I'm such a fanboy who doesn't like AMD that my current CPU and motherboard are from them. Seriously, grow up and learn how to make a proper argument.
Vya DomusDo you think there is someone at AMD where every time they get a bug report about the idle power thing they go "Hmm, should we fix this ? Nah." They obviously work on fixing many other things, so they're not lazy, if there isn't a "good reason" to not fix it, why hasn't been resolved ?
If they made sure it wasn't a problem on launch day, they wouldn't get bug reports about it and they wouldn't have to spend time and effort triaging those reports and fixing the issue. In other words they'd save time by just taking the time before launch to make this work.
Vya DomusBy the way, do you even own an AMD card ? Why do you even care ?
For the same reason that I care about climate change. For the same reason that your "argument" isn't an argument, just another attempt to deflect from the topic at hand. I will grant you that it's an even more pathetic attempt at deflection than your previous one, which is impressive in its own right considering how incredibly sad the previous attempt was.
Posted on Reply
#40
Vya Domus
AssimilatorIf NVIDIA can consistently get this right, then logically it's not impossible.
I don't think you understand the question: do you have any technical knowledge or proof that this could be easily fixed and AMD chose not to because they're lazy or whatever? I work in software development; some things aren't fixable, as easy as they may seem to fix from the perspective of a consumer.

"If X company did it it must meant that Y company can do it" is a catastrophically unintelligent point you tried to make.
AssimilatorIf AMD can fix it with subsequent driver updates, then logically they can avoid it being a problem on release day.
Yeah, because the time it takes to fix something is always zero. You have no knowledge whatsoever of matters that have to do with software development and it shows; you really should stop talking about this, you're clearly out of your element. This was an incredibly dumb statement.
Assimilatorlearn how to make a proper argument.
And have you made one? I must have missed it.
AssimilatorFor the same reason that I care about climate change.
So you don't. I just find it interesting that the only people who are very vocal about this are the ones who don't own an AMD card.
Posted on Reply
#41
konga
KartiEmm.. Are you sure about that?



It was confirmed by people who use 2 screens... yeah, it is still not done for 3- and 4-screen builds, but small steps like that are better than nothing
Yes, I'm sure about that. Did you actually read the chart? Idle power draw for multi-monitor setups is only fixed for people using 1080p 60 Hz displays. That may as well be not fixed at all for most people with multi-monitor setups. Small steps are not better than nothing if they don't actually fix anything for you. They are literally the same as nothing.
Posted on Reply
#42
Assimilator
Vya DomusI don't think you understand the question: do you have any technical knowledge or proof that this could be easily fixed and AMD chose not to because they're lazy or whatever? I work in software development; some things aren't fixable, as easy as they may seem to fix from the perspective of a consumer.
Yes. The fact that they generally fix most of it shortly after release date.
Vya Domus"If X company did it it must meant that Y company can do it" is a catastrophically unintelligent point you tried to make.
No, it's not. It proves that fixing it (or just getting it right from the outset) isn't impossible, which refutes one of the claims you've been trying to make.
Vya DomusYeah, because the time it takes to fix something is always zero. You have no knowledge whatsoever of matters that have to do with software development and it shows; you really should stop talking about this, you're clearly out of your element. This was an incredibly dumb statement.
It's precisely because I've been working in the software industry for nearly two decades that I have so little patience for incompetence. If I can fix a bug and it stays fixed the next time we release a major version, why can't AMD?
Vya DomusI just find it interesting that the only people who are very vocal about this are the ones who don't own an AMD card.
Have you maybe considered that I don't own an AMD card precisely because I have no interest in supporting their GPU division when they can't get the basics right consistently?
Posted on Reply
#43
Vya Domus
AssimilatorHave you maybe considered that I don't own an AMD card precisely because I have no interest in supporting their GPU division when they can't get the basics right consistently?
I am considering that you are bashing a product that you don't own and don't know anything about; as I am about to show you, you're just regurgitating some boilerplate "AMD bad" rhetoric.

Just got another monitor; this is what my idle power consumption looks like with a 1440p 165 Hz monitor and a 4K 60 Hz one:



22 W, clearly unacceptable. Since this upsets you so much, can you get mad on my behalf and write them an angry email? Just copy-paste all your comments from this thread, I think that will suffice.
Posted on Reply
#44
chrcoluk
OK, I have been told. :) I need to use my PC either for gaming or work, not both, so buy a second PC with an APU for work and non-gaming tasks, and then just use the dGPU for gaming; better than the manufacturer making power-efficient drivers, lol.
Posted on Reply
#45
Mussels
Freshwater Moderator
chrcolukOK, I have been told. :) I need to use my PC either for gaming or work, not both, so buy a second PC with an APU for work and non-gaming tasks, and then just use the dGPU for gaming; better than the manufacturer making power-efficient drivers, lol.
Those low-power APUs won't be able to drive high-res, high-refresh displays

Blur Busters and the forums related to CRU cover it really well, since monitor overclocking (and underclocking) can trigger (or fix) the same problems. Because monitors use non-standard timings to push higher resolutions and refresh rates without using the latest HDMI or DP standards that support them, they just make custom settings to fit the available bandwidth, and the drivers can't identify WTF they need, so they run at full speed to make sure you get an image at all.
Posted on Reply
#46
chrcoluk
MusselsThose low-power APUs won't be able to drive high-res, high-refresh displays

Blur Busters and the forums related to CRU cover it really well, since monitor overclocking (and underclocking) can trigger (or fix) the same problems. Because monitors use non-standard timings to push higher resolutions and refresh rates without using the latest HDMI or DP standards that support them, they just make custom settings to fit the available bandwidth, and the drivers can't identify WTF they need, so they run at full speed to make sure you get an image at all.
Yeah, I have heard of this issue; the LG that I own has this exact problem. They under-spec'd the DP port for the required bandwidth and instead overclocked the timings to make it work. I tried to get reviewers to cover the story after I found out about it, but as usual none were bothered. However, it isn't affecting my GPU power state, so maybe Nvidia doesn't have the issue or my timings are not extreme enough.

Will have a read of the blurbuster forums, thanks.

-- I run my desktop only at 60 Hz, so that might be the actual reason I am not affected; it was also fine when I tried 120 Hz. But from my tests the timings only go out of spec above 120 Hz (at 144 Hz), and I didn't check the GPU power state when I tested 144 Hz.
Posted on Reply
#47
Mussels
Freshwater Moderator
chrcolukYeah, I have heard of this issue; the LG that I own has this exact problem. They under-spec'd the DP port for the required bandwidth and instead overclocked the timings to make it work. I tried to get reviewers to cover the story after I found out about it, but as usual none were bothered. However, it isn't affecting my GPU power state, so maybe Nvidia doesn't have the issue or my timings are not extreme enough.

Will have a read of the Blur Busters forums, thanks.

-- I run my desktop only at 60 Hz, so that might be the actual reason I am not affected; it was also fine when I tried 120 Hz. But from my tests the timings only go out of spec above 120 Hz (at 144 Hz), and I didn't check the GPU power state when I tested 144 Hz.
4K displays above 60 Hz use compression to achieve those refresh rates, or they drop from RGB/4:4:4 to 4:2:0 (HDR does this too)

On older GPUs there was a 150 MHz? limit on some standards, then 300 MHz for single-link DVI (and HDMI), with dual-link being 600 MHz (the current common maximum, without compression)

Almost all GPUs can't run all their ports at the same time (30-series cards were limited to 4 outputs at a time, despite some having 5 ports - and you can't run 4K 144 Hz on all 4 at once, either)

Then it comes down to the VRAM needing enough performance to refresh faster than the monitor's blanking interval, and that's the easiest thing to lower to get more bandwidth out of a display - it's how those 144 Hz displays have a 165 Hz "OC" mode.

These driver fixes would be adding in "common" situations they've tested and verified (4K 120 Hz with compression on 2x displays, for example) and enabling some default safeties, like forcing 8-bit colour depth or locking to 60 Hz with 3? 4? displays - but all it takes is a user adding another display, enabling HDR, using some weird DP/HDMI converter, or changing from 8- to 10- or 12-bit colour depth, and suddenly the bandwidth requirements are all over the place.
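As a rough sanity check on why 4K beyond 60 Hz needs compression or chroma subsampling, the back-of-envelope bandwidth math looks like this. The 25% overhead factor is an assumption modelling 8b/10b-style line coding on older DP/HDMI links, not a spec value, and real links differ in encoding and blanking:

```python
# Approximate uncompressed video data rate for a display mode:
# pixels * bits-per-pixel * refresh, plus an assumed ~25% coding overhead.

def data_rate_gbps(h, v, refresh_hz, bits_per_component=8, overhead=1.25):
    """Rough link rate in Gbit/s for an RGB / 4:4:4 signal."""
    bpp = bits_per_component * 3  # three components per pixel
    return h * v * refresh_hz * bpp * overhead / 1e9

# 4K 60 Hz, 8-bit: comfortably within an 18 Gbps HDMI 2.0-class link
print(round(data_rate_gbps(3840, 2160, 60), 2))      # -> 14.93

# 4K 144 Hz, 10-bit: far beyond it, hence DSC or dropping to 4:2:0
print(round(data_rate_gbps(3840, 2160, 144, 10), 2)) # -> 44.79
```

Doubling the refresh rate or stepping up the colour depth scales the requirement linearly, which is why a single extra display, HDR toggle, or bit-depth change can push a setup over a link's limit as described above.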
Posted on Reply
#48
trsttte
Mussels4K displays above 60 Hz use compression to achieve those refresh rates, or they drop from RGB/4:4:4 to 4:2:0 (HDR does this too)
If only we had new standards that allowed this... They've been out for only about 5 years; maybe in another 5 this stops being a problem :shadedshu:

HDMI is the worse of the two, with the crap they pulled calling anything "2.1"; not that it changed much, but just allowing anything other than FRL6 was already completely stupid, and TMDS just added insult to injury
Posted on Reply