
Intel Quietly Fixes High Multi-monitor Power Draw of Arc GPUs

Joined
Aug 12, 2022
Messages
250 (0.29/day)
For generations, AMD has had an issue with multi-monitor power draw being very high, much higher than single-monitor "idle". (But who buys an expensive graphics card and only connects one monitor? I think baseline "idle" should include two monitors.)

40W on the RX 480
https://www.techpowerup.com/review/amd-rx-480/22.html

34W on the RX 5600 XT
https://www.techpowerup.com/review/powercolor-radeon-rx-5600-xt-itx/31.html

85W on the RX 7900 XT
https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/37.html

AMD still releases the latest drivers for the RX 480 despite its age, whereas Intel Gen11 graphics are years newer but aren't supported anymore. But if Intel does this much better than AMD at graphics fixes, maybe a short support life from Intel is better than a long one from AMD.
 
Joined
Jan 8, 2017
Messages
9,520 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
But who buys an expensive graphics card and only connects one monitor?
I do.

Well, not for long; I ordered another monitor, so I'll see how much of an issue this really is. If I'm to take every comment on here about how bad the idle power consumption is (typically from non-AMD users), and how bad AMD is as a result, I'm expecting all hell to break loose at the very least, or I'm going to be very disappointed.
 
Joined
Nov 26, 2021
Messages
1,705 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
For generations, AMD has had an issue with multi-monitor power draw being very high, much higher than single-monitor "idle". (But who buys an expensive graphics card and only connects one monitor? I think baseline "idle" should include two monitors.)

85W on the RX 7900 XT
https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/37.html
The multi-monitor power draw is more nuanced than a single test can show. The 7000 series, while still having relatively high multi-monitor power consumption, is now lower than at launch: 40 W for the 7900 XT. They also don't have high power draw if the monitors have matching refresh rates and resolutions. This holds true for multiple 4K monitors too. In that situation, Intel has the higher power draw, according to their own notes.

[attached image]
 
Joined
Jan 17, 2023
Messages
28 (0.04/day)
Processor Ryzen 5 4600G
Motherboard Gigabyte B450M DS3H
Cooling AMD Stock Cooler
Memory 3200Mhz CL16-20-20-40 memory
Video Card(s) ASRock Challenger RX 6600
Storage 1TB WD SN570 NVME SSD and 2TB Seagate 7200RPM HDD
Software Windows 11 Pro
Man, these Arc cards are looking more promising day by day with all these driver improvements. They could shape up to be a real third challenger to the AMD/Nvidia duopoly.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
If 30 W impacts you, you don't have the budget for a $350 GPU in the first place, KTHNX.
What a childish, selfish statement to make. Multiply those thirty watts by the number of GPUs sold and all of a sudden it makes an enormous difference. Wasting energy is never acceptable, especially if the competition has figured out how not to.

How do you know that? My God, so many software engineers on here who know better than the ones the multi-billion-dollar companies hire.

There may very well be good reasons why some of these issues are never fixed: hardware or software changes that would be too costly or complicated no matter how large their software development teams are. There are plenty of things which have never been addressed by all of these companies, like how Nvidia still doesn't support 8-bit color dithering to this very day on their GPUs. I was shocked when I switched to AMD at how much less noticeable the color banding was. Why is that? Supposedly Nvidia has many more resources, yet this has never been changed. You buy a $2,000 card that has worse color reproduction than a $200 one; what excuse is there for that?
There is no good reason for the same, incredibly obvious, defect to persist through AT LEAST THREE GENERATIONS of product. Zero. None. Stop making excuses for AMD, and stop using whataboutism to try to deflect attention away from their persistent failure.
 
Joined
Jun 16, 2019
Messages
387 (0.19/day)
System Name Cyberdyne Systems Core
Processor AMD Sceptre 9 3950x Quantum neural processor (384 nodes)
Motherboard Cyberdyne X1470
Memory 128TB QRAM
Video Card(s) CDS Render Accelerator 4TB
Storage SK 16EB NVMe PCI-E 9.0 x8
Display(s) LG C19 3D Environment Projection System
Power Supply Compact Fusion Cell
Software Skysoft Skynet
For generations, AMD has had an issue with multi-monitor power draw being very high, much higher than single-monitor "idle". (But who buys an expensive graphics card and only connects one monitor? I think baseline "idle" should include two monitors.)

40W on the RX 480
https://www.techpowerup.com/review/amd-rx-480/22.html

34W on the RX 5600 XT
https://www.techpowerup.com/review/powercolor-radeon-rx-5600-xt-itx/31.html

85W on the RX 7900 XT
https://www.techpowerup.com/review/amd-radeon-rx-7900-xtx/37.html

AMD still releases the latest drivers for the RX 480 despite its age, whereas Intel Gen11 graphics are years newer but aren't supported anymore. But if Intel does this much better than AMD at graphics fixes, maybe a short support life from Intel is better than a long one from AMD.
They obviously fix some of it, given that the other post shows 19 W idle on a 7900 XT driving two 4K screens. My 6700 XT idles at 9 W with two 4K screens.
 
Joined
Feb 1, 2019
Messages
3,669 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
*sigh* nope. Not even close. People are REALLY bad at math.

If you used your computer at idle for 8 hours a day, 365 days a year, that would be 87,600 Wh more than with the new driver. That is 87.6 kWh. At current German electricity prices of $0.40 per kWh, that is $35 in electricity per year.

A new A750 is $250.

I shouldn't have to do the math every single time the use of electricity comes up. Can people please learn basic multiplication and division already? If electric bills are a major concern for you, you cannot afford the hardware in question. PERIOD. The math DOES. NOT. WORK. OUT.
Sure I can do the maths.

PC on for 16 hours a day at 30 watts, at 55p a unit. That's 27p a day; multiply by 365 days and it's £99. Add VAT and that's just shy of £120. But I was thinking of the £150 card, so my mistake there (not a full mistake if the £150 card is also affected).
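
If anyone wants to redo the sum with their own tariff, here's the whole thing as a few lines of Python - a minimal sketch assuming my numbers (30 W of extra draw, 16 hours a day, 55p a unit, VAT at 20%):

```python
# Annual cost of an idle-power delta. All inputs are assumptions
# you can swap out: extra draw in watts, hours on per day,
# tariff per kWh, and VAT rate.
def annual_cost(extra_watts, hours_per_day, price_per_kwh, vat=0.0):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh * (1 + vat)

print(annual_cost(30, 16, 0.55, vat=0.20))  # ~116/yr in GBP (my rounding above gives ~120)
print(annual_cost(30, 8, 0.40))             # ~35/yr in USD, the German-price example quoted above
```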

Curious though, why does wasting power being brought up offend you so much?
 
Joined
Sep 1, 2020
Messages
2,408 (1.53/day)
Location
Bulgaria
Curious though, why does wasting power being brought up offend you so much?
Because he thinks it is his human right to be able to waste as much energy as he wants, as long as he can afford to pay for it. F*ck the natural environment?
 
Joined
Dec 28, 2012
Messages
3,969 (0.91/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
What a childish, selfish statement to make.
What a great start we are off to.
Multiply those thirty watts by the number of GPUs sold and all of a sudden it makes an enormous difference.
Bud, that's still nothing. We are heading into a world where using upwards of 10 kWh per day just to drive to work is considered perfectly acceptable for hundreds of millions of people, and you're worried about the 30 W idle consumption of a card that will sell, at best, 2-3 million units? Even if you want to limit the scope to "waste", the 30 W from these GPUs is such a nothingburger in terms of lost power. A single school building leaving its lights on overnight wastes more than that in a few minutes!

There's mountains, there's molehills, and then there's the grain of sand you are trying to turn into a molehill.
There is no good reason for the same, incredibly obvious, defect to persist through AT LEAST THREE GENERATIONS of product. Zero. None. Stop making excuses for AMD, and stop using whataboutism to try to deflect attention away from their persistent failure. Wasting energy is never acceptable, especially if the competition has figured out how not to.
The competition has had similar issues for years; AMD only recently started fixing it (and it isn't totally fixed). Almost like drivers are hard. But I'm sure you could do a better job.

Sure I can do the maths.

PC on for 16 hours a day 30 watts at 55p unit. Thats 27p day, multiple by 365 days. ÂŁ99, add VAT thats just shy of ÂŁ120, but I was thinking of the ÂŁ150 card, so my mistake there. (not a full mistake if the ÂŁ150 also affected)
So, yeah, like $105. If that's a lot of money to you, the first question is why you are buying a $350 card, and the second is why on earth is it on, at IDLE, for 16 hours a day? Not using it, at IDLE. Seems like you have a bigger issue with waste from improper use of a PC at that point. If you have a card like this in a work PC, then ideally it should be DOING something for that timeframe. If it's just doing web browsing or other non-heavy work, then having said card is pointless and a far more efficient APU should be in your PC instead.
Curious though, why does wasting power being brought up offend you so much?
The problem is people bring up idle power like it is going to bankrupt the world because of a few watts of power. It's not, and claiming things like "the idle power from a year will cost as much as an A770" is totally incorrect. People are blowing this "idle power" issue WAY out of proportion, acting like it will take an entire paycheck and nobody but the rich can afford to run a PC, as if it were some gas-guzzling hypercar. It's not.

People will then take that incorrect information and use it to make ignorant decisions about hardware: "Oh, why would I buy an AMD card, look how much higher its idle usage is, I'd better get an Nvidia card instead for $200 more," in the process wasting nearly $200 through ignorance of both math and money. They'll prattle on about how much more efficient their GPU is while ignoring the massive carbon ramifications of building a PC to play some bing bing wahoo. Gaming, at its core, is a massively wasteful hobby, and the moronic grandstanding about power use is supremely annoying.

Financial illiteracy is a major trigger. Like the imbeciles who bought Toyota Priuses on a 5-year loan at 8% interest because gas went to $5, instead of keeping their perfectly serviceable paid-off vehicle, then prance around talking about how much money they are "saving".

Because he thinks it is his human right to be able to waste as much energy as he wants, as long as he can afford to pay for it. F*ck the natural environment?
You have no idea what I think, and your uninformed rambling has no relation to what I think. I have on many occasions advocated for the use of nuclear power to eliminate fossil fuel usage and lamented the waste of resources in a variety of ways. If you think a couple dozen watts of electricity from cards for a niche hobby is a major issue, oh baby, just WAIT until I introduce you to the environmental catastrophe that is silicon chip production!
 
Joined
Sep 1, 2020
Messages
2,408 (1.53/day)
Location
Bulgaria
just WAIT until I introduce you to the environmental catastrophe that is silicon chip production!
I can't wait for such a thread on the forum. I don't know if it would be allowed at all, except in the Science & Technology section.
 
Joined
Jun 18, 2021
Messages
2,582 (2.00/day)
We are heading into a world where using upwards of 10 kWh per day just to drive to work is considered perfectly acceptable for hundreds of millions of people, and you're worried about the 30 W idle consumption of a card that will sell, at best, 2-3 million units

I'm very sorry you had to go there, because that's a very dumb statement. Electric vehicles are insanely more efficient than gas-powered ones, and that's only accounting for the vehicle itself; the supply chain for the fuel source is also much better for electric than for gas. When we all finally move to EVs, global energy needs will decrease substantially.

I'm with you that for a regular user the difference between 30 W and 10 W idle doesn't really move the needle enough to matter, but we should still criticize and applaud when companies fail at or fix these kinds of things. Like Assimilator said, it adds up; waste just for waste's sake is stupid.

A single school building leaving its lights on overnight wastes more than that in a few minutes!

And we should fix that too!

The problem is people bring up idle power like it is going to bankrupt the world because of a few watts of power. It's not, and claiming things like "the idle power from a year will cost as much as an A770" is totally incorrect. People are blowing this "idle power" issue WAY out of proportion, acting like it will take an entire paycheck and nobody but the rich can afford to run a PC, as if it were some gas-guzzling hypercar. It's not.

People will then take that incorrect information and use it to make ignorant decisions about hardware: "Oh, why would I buy an AMD card, look how much higher its idle usage is, I'd better get an Nvidia card instead for $200 more," in the process wasting nearly $200 through ignorance of both math and money.

Financial illiteracy is a major trigger. Like the imbeciles who bought Toyota Priuses on a 5-year loan at 8% interest because gas went to $5, instead of keeping their perfectly serviceable paid-off vehicle, then prance around talking about how much money they are "saving".

"Show me on this diagram here where power efficiency hurt you" :D I think you're much more triggered than anyone else but whatever. People make stupid decisions everyday. Like we're talking about idle power but ignoring the part where the cards are not that efficient fps/W when running vs the competition, just like intel CPUs which are pretty terrible.

The cards had a problem that could be fixed, and it was fixed, at least partially, and that's great! Wow, so much drama over something so simple.

If you think a couple dozen watts of electricity from cards for a niche hobby is a major issue, oh baby, just WAIT until I introduce you to the environmental catastrophe that is silicon chip production!

Effort vs. impact: fixing the idle power of consumer products might not, generally speaking, have the greatest impact compared with industrial activities, but it's still very much achievable with low effort.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
I have on many occasions advocated for the use of nuclear power to eliminate fossil fuel usage and lamented the waste of resources in a variety of ways.
And yet you have a nuclear meltdown when someone else laments a waste of resources. So I'm gonna go ahead and say you're full of s**t.
 
Joined
Jan 8, 2017
Messages
9,520 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
There is no good reason for the same

How do you know that? Like, for a fact I mean, not in a "I'm a fanboy who doesn't like AMD so I am going to complain about something that doesn't even impact me in any way" kind of way. Do you think there is someone at AMD who, every time they get a bug report about the idle power thing, goes "Hmm, should we fix this? Nah."? They obviously work on fixing many other things, so they're not lazy; if there isn't a "good reason" not to fix it, why hasn't it been resolved?

By the way, do you even own an AMD card? Why do you even care?
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Oh good
Intel, Nvidia, and AMD all have issues with this


After spending a lot of time with CRU on their forums (and Blur Busters) overclocking various displays, it seems to come down to how various monitors implement their timings - there are standards for LCDs, reduced blanking, and a second generation of reduced blanking.

As you reduce the blanking more and more, you can get more and more bandwidth to the displays for higher refresh rates, but you also begin to run into these issues where GPUs can't identify how much performance they need to run these non-standard setups (hence many displays saying things like 144 Hz with a "160Hz OC" mode).

As an example, my 4K 60 Hz displays support DP 1.2 and 1.4 respectively, but internally they both have the same limit of around 570 MHz at stock or 590 MHz overclocked. With reduced blanking that lets me get 65 Hz with 10-bit colour out of them, but they lose support for HDR and FreeSync by doing so, and some older GPUs here (like my 10-series cards) ramp up their idle clocks.
Despite using the same panel, the same HDMI standard, and the same bandwidth limits (despite different DP standards), they both implement different timings and blanking values.

  • Vertical Blanking Interval -- Known as VBI. The entire interval between refresh cycles. The sum of (Vertical Back Porch + Vertical Sync + Vertical Front Porch).

I understand the words, but this image apparently contains the visual explanation, even if I can't really grasp its meaning - shrinking the 'porch' gives more bandwidth to the display, but less time for the GPU to do its thing, so they clock up to compensate.
[attached image: display timing diagram]

Ironically, these overclocking methods to get higher refresh rates from the factory can result in more blurring and smearing along with the higher GPU idle clocks, or compression artifacts like the Samsung panels are prone to on their high-refresh-rate displays (the 240 Hz displays have artifacting, but a custom refresh of 165 Hz is problem-free).


TL;DR: More idle time lets the pixels revert back to black/neutral before changing to the next colour, even at the same refresh rate, and GPUs can also benefit from that idle time to clock lower.
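
If you want to see why the porch matters in numbers, here's a rough sketch of the pixel-clock arithmetic - the blanking values are illustrative CVT-style guesses, not any specific monitor's EDID:

```python
# Required pixel clock: total pixels scanned per frame (active + blanking)
# times refresh rate. The blanking values below are illustrative guesses.
def pixel_clock_mhz(h_active, v_active, h_blank, v_blank, refresh_hz):
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

# 2560x1440 @ 144 Hz, standard-ish blanking vs. aggressively reduced:
print(pixel_clock_mhz(2560, 1440, 560, 60, 144))  # ~674 MHz
print(pixel_clock_mhz(2560, 1440, 160, 40, 144))  # ~580 MHz - fits under a ~590 MHz cap
```

Same resolution and refresh rate, but the shrunken porches cut the required pixel clock - and also shrink the idle window the GPU has left to reclock in.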
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
How do you know that?
If NVIDIA can consistently get this right, then logically it's not impossible.
If AMD can fix it with subsequent driver updates, then logically it's not impossible.
If AMD can fix it with subsequent driver updates, then logically they can avoid it being a problem on release day.

Like, for a fact I mean, not in a "I'm a fanboy who doesn't like AMD so I am going to complain about something that doesn't even impact me in any way" kind of way.
Yes, I'm such a fanboy who doesn't like AMD that my current CPU and motherboard are from them. Seriously, grow up and learn how to make a proper argument.

Do you think there is someone at AMD who, every time they get a bug report about the idle power thing, goes "Hmm, should we fix this? Nah."? They obviously work on fixing many other things, so they're not lazy; if there isn't a "good reason" not to fix it, why hasn't it been resolved?
If they made sure it wasn't a problem on launch day, they wouldn't get bug reports about it and they wouldn't have to spend time and effort triaging those reports and fixing the issue. In other words they'd save time by just taking the time before launch to make this work.

By the way, do you even own an AMD card ? Why do you even care ?
For the same reason that I care about climate change. For the same reason that your "argument" isn't an argument, just another attempt to deflect from the topic at hand. I will grant you that it's an even more pathetic attempt at deflection than your previous one, which is impressive in its own right considering how incredibly sad the previous attempt was.
 
Joined
Jan 8, 2017
Messages
9,520 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
If NVIDIA can consistently get this right, then logically it's not impossible.
I don't think you understand the question: do you have any technical knowledge or proof that this could be easily fixed and AMD chose not to because they're lazy or whatever? I work in software development; some things aren't fixable, as easy as they may seem to fix from the perspective of a consumer.

"If X company did it it must meant that Y company can do it" is a catastrophically unintelligent point you tried to make.

If AMD can fix it with subsequent driver updates, then logically they can avoid it being a problem on release day.
Yeah, because the time it takes to fix something is always zero. You have no knowledge whatsoever of matters that have to do with software development and it shows. You really should stop talking about this; you're clearly out of your element. This was an incredibly dumb statement.

learn how to make a proper argument.
And have you made one? I must have missed it.

For the same reason that I care about climate change.
So you don't. I just find it interesting that the only people who are very vocal about this are the ones who don't own an AMD card.
 
Joined
Dec 30, 2021
Messages
394 (0.36/day)
Emm.. Are you sure about that?

[attachment 286626: chart]

It was confirmed by people that use 2 screens... yeah, it is still not done for 3- and 4-screen builds - but small steps like that are better than nothing
Yes, I'm sure about that. Did you actually read the chart? Idle power draw for multi-monitor setups is only fixed for people using 1080p 60 Hz displays. That may as well be not fixed at all for most people with multi-monitor setups. Small steps are not better than nothing if they don't actually fix anything for you. They are literally just the same as nothing.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
I don't think you understand the question: do you have any technical knowledge or proof that this could be easily fixed and AMD chose not to because they're lazy or whatever? I work in software development; some things aren't fixable, as easy as they may seem to fix from the perspective of a consumer.
Yes: the fact that they generally do fix most of it shortly after release date.

"If X company did it it must meant that Y company can do it" is a catastrophically unintelligent point you tried to make.
No, it's not. It proves that fixing it (or just getting it right from the outset) isn't impossible, which refutes one of the claims you've been trying to make.

Yeah, because the time it takes to fix something is always zero. You have no knowledge whatsoever of matters that have to do with software development and it shows. You really should stop talking about this; you're clearly out of your element. This was an incredibly dumb statement.
It's precisely because I've been working in the software industry for nearly two decades that I have so little patience for incompetence. If I can fix a bug and it stays fixed the next time we release a major version, why can't AMD?

I just find it interesting that the only people who are very vocal about this are the ones who don't own an AMD card.
Have you maybe considered that I don't own an AMD card precisely because I have no interest in supporting their GPU division when they can't get the basics right consistently?
 
Joined
Jan 8, 2017
Messages
9,520 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Have you maybe considered that I don't own an AMD card precisely because I have no interest in supporting their GPU division when they can't get the basics right consistently?

I am considering that you are bashing a product that you don't own and don't know anything about; as I am about to show you, you're just regurgitating boilerplate "AMD bad" rhetoric.

Just got another monitor. This is what my idle power consumption looks like with a 1440p 165 Hz monitor and a 4K 60 Hz one:

[attached screenshot: idle power reading]


22 W. Clearly unacceptable. Since this upsets you so much, can you get mad on my behalf and write them an angry email? Just copy-paste all your comments from this thread; I think that will suffice.
 
Joined
Feb 1, 2019
Messages
3,669 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
OK, I have been told. :) I need to use my PC either for gaming or for work, not both, so I should buy a second PC with an APU for work and non-gaming tasks, and then just use the dGPU for gaming. Better than the manufacturer making power-efficient drivers, lol.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
OK, I have been told. :) I need to use my PC either for gaming or for work, not both, so I should buy a second PC with an APU for work and non-gaming tasks, and then just use the dGPU for gaming. Better than the manufacturer making power-efficient drivers, lol.
Those low-power APUs won't be able to drive high-res, high-refresh displays.

Blur Busters and the forums related to CRU cover it really well, since monitor overclocking (and underclocking) can trigger (or fix) the same problems. Monitors are using non-standard timings to push higher resolutions and refresh rates without using the latest HDMI or DP standards that support them; they just make custom settings to fit the available bandwidth, and the drivers can't identify WTF the display needs, so they run at full speed to make sure you get an image at all.
 
Joined
Feb 1, 2019
Messages
3,669 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Those low-power APUs won't be able to drive high-res, high-refresh displays.

Blur Busters and the forums related to CRU cover it really well, since monitor overclocking (and underclocking) can trigger (or fix) the same problems. Monitors are using non-standard timings to push higher resolutions and refresh rates without using the latest HDMI or DP standards that support them; they just make custom settings to fit the available bandwidth, and the drivers can't identify WTF the display needs, so they run at full speed to make sure you get an image at all.
Yeah, I have heard of this issue; the LG that I own has this exact problem. They under-spec'd the DP port for the required bandwidth and instead overclocked the timings to make it work. I tried to get reviewers to cover the story after I found out about it, but as usual none were bothered. However, it isn't affecting my GPU power state, so maybe Nvidia doesn't have the issue, or my timings are not extreme enough.

Will have a read of the Blur Busters forums, thanks.

-- I run my desktop only at 60 Hz, so that might be the actual reason I am not affected. When I tried 120 Hz it was also unaffected, but from my tests the timings only go out of spec above 120 Hz (at 144 Hz), and I didn't check the GPU power state when I tested 144 Hz.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Yeah, I have heard of this issue; the LG that I own has this exact problem. They under-spec'd the DP port for the required bandwidth and instead overclocked the timings to make it work. I tried to get reviewers to cover the story after I found out about it, but as usual none were bothered. However, it isn't affecting my GPU power state, so maybe Nvidia doesn't have the issue, or my timings are not extreme enough.

Will have a read of the Blur Busters forums, thanks.

-- I run my desktop only at 60 Hz, so that might be the actual reason I am not affected. When I tried 120 Hz it was also unaffected, but from my tests the timings only go out of spec above 120 Hz (at 144 Hz), and I didn't check the GPU power state when I tested 144 Hz.
4K displays above 60 Hz use compression to achieve those refresh rates, or they drop from RGB/4:4:4 to 4:2:0 (HDR does this too).

On older GPUs there was a 150 MHz? limit on some standards, then 300 MHz for single-link DVI (and HDMI), with dual link being 600 MHz (the current common maximum, without compression).

Almost all GPUs can't run all the ports at the same time (30-series cards were limited to 4 outputs at a time, despite some having 5 ports - and you can't run 4K 144 Hz on all 4 at once, either).

Then it comes down to the VRAM needing enough performance to refresh faster than the monitor's blanking interval, and that's the easiest thing to lower to get more bandwidth out of a display - it's how those 144 Hz displays have a 165 Hz "OC" mode.

These driver fixes would be adding in "common" situations they've tested and verified (4K 120 Hz with compression on two displays, for example) and enabling some default safeties, like forcing 8-bit colour depth or locking to 60 Hz with 3? 4? displays - but all it takes is a user adding another display, enabling HDR, using some weird DP/HDMI converter, or changing from 8- to 10- or 12-bit colour depth, and suddenly the bandwidth requirements are all over the place.
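
To put rough numbers on how fast those requirements move, here's a back-of-envelope sketch - the timing totals are CVT-RB-style approximations for 3840x2160 that I'm assuming for illustration, not values from any real EDID:

```python
# Back-of-envelope uncompressed link bandwidth: total pixels scanned
# (active + blanking) x refresh rate x bits per pixel.
def data_rate_gbps(h_total, v_total, refresh_hz, bpc=8, subsampling=1.0):
    bits_per_pixel = 3 * bpc * subsampling  # RGB/4:4:4 -> 1.0, 4:2:0 -> 0.5
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

print(data_rate_gbps(4000, 2222, 60))                            # 4K60 8-bit RGB: ~12.8 Gbps
print(data_rate_gbps(4000, 2222, 120, bpc=10))                   # 4K120 10-bit HDR: ~32.0 Gbps
print(data_rate_gbps(4000, 2222, 120, bpc=10, subsampling=0.5))  # same mode at 4:2:0: ~16.0 Gbps
```

One bump in refresh rate or bit depth more than doubles the requirement, and every extra display stacks its own total on top, so anything outside the driver's tested list tends to fall back to full clocks.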
 
Joined
Jun 18, 2021
Messages
2,582 (2.00/day)
4K displays above 60 Hz use compression to achieve those refresh rates, or they drop from RGB/4:4:4 to 4:2:0 (HDR does this too).

If only we had new standards that allowed this... They've been out for only about 5 years; maybe in another 5 this stops being a problem :shadedshu:

HDMI is the worse of the two, with the crap they pulled calling anything "2.1"; not that it changed much - just allowing for anything other than FRL6 was already completely stupid, and TMDS just added insult to injury.
 