
ASRock Arc A580 Challenger

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
26,425 (3.80/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling Noctua NH-U14S DX-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) Odyssey OLED G9 (G95SC)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on schitt Modi+ & Valhalla 2
Power Supply Seasonic Prime TX-1600
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Akko Crystal Blues
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.

susie1818

New Member
Joined
Oct 11, 2023
Messages
3 (0.01/day)
To TechPowerUp staff:

Do you guys know that Intel Arc GPUs support the ASPM L1 idle power-saving mode? Your idle power consumption tests apparently didn't take advantage of this feature.
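For readers who want to check this on their own machine, here is a minimal Linux-side sketch (assuming a kernel that exposes the standard pcie_aspm sysfs node and pciutils installed; `lspci -vv` needs root to show the link-control lines):

```python
#!/usr/bin/env python3
# Minimal sketch: report the system-wide ASPM policy and per-device link state.
# Assumes Linux with the standard pcie_aspm sysfs node and pciutils installed.
import subprocess
from pathlib import Path

policy = Path("/sys/module/pcie_aspm/parameters/policy")
if policy.exists():
    # The active policy is shown in brackets, e.g. "default [powersave] performance"
    print("ASPM policy:", policy.read_text().strip())

# 'lspci -vv' prints a "LnkCtl:" line per device showing whether ASPM
# (L0s/L1) is enabled on that link; run as root for complete output.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "LnkCtl:" in line and "ASPM" in line:
        print(line.strip())
```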
 

susie1818

New Member
Joined
Oct 11, 2023
Messages
3 (0.01/day)
A lot of motherboards don't expose those settings
It would have been better if they had mentioned this feature in their articles and given test results for configurations both with and without it enabled.

Otherwise I would assume they simply didn't know about it and weren't professional enough.

Not all of them do
At least Intel's own reference cards, the A770 LE and A750 LE, do, but this has not been shown in the comparison chart.

If TPU had tested this aspect, it would have helped people know whether this ASRock variant supports the feature or not.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,453 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
In which universe does the writer of this review find a 1080 Ti at 170 dollars???

So now we compare prices with the second-hand market. Alrighty...
Correct. That's what I do for old SKUs, because they are end-of-life and not produced anymore. They are still relevant for the considerations in this review, at least in my opinion. If you disagree and only want to buy new, ignore everything relating to these cards.

At least intel's own reference card A770 LE and A750 LE do, but it has not been shown in the comparison chart.

If TPU did test this aspect, it would have helped people know whether this ASRock varient support this feature or not.
I would assume they do; it's a GPU hardware/driver feature. 99.999% of people have no way to control this or even know about it, so it's similar to "why don't you test more undervolting".
 
Joined
Dec 1, 2020
Messages
399 (0.29/day)
Processor Ryzen 5 7600X
Motherboard ASRock B650M PG Riptide
Cooling Noctua NH-D15
Memory DDR5 6000Mhz CL28 32GB
Video Card(s) Nvidia Geforce RTX 3070 Palit GamingPro OC
Storage Corsair MP600 Force Series Gen.4 1TB
In which universe does the writer of this review find a 1080 Ti at 170 dollars???
The 3070 is 50% faster, supports DLSS and RT, is 4 years newer, and costs around $270, so why not? $170 for a 7-year-old, well-used card is a terrible idea; just pay $60 more and get a brand-new 7600, 20% faster with half the power consumption.
 
Joined
Sep 1, 2020
Messages
2,153 (1.48/day)
Location
Bulgaria
I would assume they do; it's a GPU hardware/driver feature. 99.999% of people have no way to control this or even know about it, so it's similar to "why don't you test more undervolting".
People can learn and are curious. You could write about this feature, and then the percentage of people who don't know about it or don't use it would decrease. ;)
 
Joined
Apr 29, 2020
Messages
137 (0.09/day)
Intel must be sinking so much cost into this for the card OEMs. Using a 396 mm² die fabbed at 6 nm (even with harvesting) and a 256-bit memory bus to compete with a ~$190 6600 that uses a 237 mm² die at 7 nm and a 128-bit memory bus seems like a good recipe for losing a lot of money per unit, which I imagine Intel is absorbing (no OEM would manufacture the cards if they lost money on them).
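To put rough numbers on that die-size gap, here is a back-of-the-envelope sketch using the classic first-order dies-per-wafer estimate (geometry only; it ignores yield and actual wafer pricing, which nobody outside the foundries knows):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order estimate: gross wafer area over die area, minus edge loss."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Die areas as quoted above; both dies come off 300 mm wafers.
for name, area in (("ACM-G10, 396 mm2", 396.0), ("Navi 23, 237 mm2", 237.0)):
    print(f"{name}: ~{dies_per_wafer(area)} die candidates per wafer")
```

That works out to roughly 145 vs. 255 candidates per wafer before yield, so Intel gets barely over half as many salable chips per wafer while selling into the same price bracket.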

It does make me wonder how long Intel will be willing to keep doing that, when they seem intent on cutting every other non-core division.

On the performance-per-dollar graphs, why are you also showing the 580 at $110, $130 and $150? The narrative makes no mention of these, and I'm not sure what they are intended to show, other than the obvious 'if the card was cheaper than it is then it would offer better value'.
 

Lynxx83

New Member
Joined
Aug 9, 2023
Messages
6 (0.02/day)
To TechPowerUp staff:

Do you guys know that Intel Arc GPUs support the ASPM L1 idle power-saving mode? Your idle power consumption tests apparently didn't take advantage of this feature.
It's interesting to see that almost every review across the net tests Arc mainly/only on boards with rBAR on, obviously, because you really need it for max performance; it's 'mandatory' so to speak,

but at the same time they always test power draw / idle consumption with ASPM / L1 disabled.

Very weird:

[Attachment: pcgh arc idle.JPG]


on my A770 16GB LE, it looks like this:

[Attachment: arc a770 idle verbrauch gpu-z aspm on.gif]

I have to say though, the above ~5 W with ASPM L1, vs. the usually reported/measured ~40 W without, is in my case only for 100% idle, meaning not moving any window, not scrolling anything, yes, not even moving your mouse pointer at all.

Power consumption in "light load" scenarios especially is still far from optimal, and it's weird to see that scrolling some text, or moving your Firefox / MS Edge window with the above link to PCGH's* own A580 review open, costs almost as much power as watching/decoding a 4K/UHD 60 fps YouTube video :)

*(It's in German, and besides TechPowerUp one of the first sites to have published a review of Intel's new card so far.)

But it's also not always as bad as almost every review makes it seem, depending on the situation, and more importantly, it's not that hard to flip a single switch or two in your UEFI BIOS & in Windows.
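For the Windows half of that, a minimal sketch using powercfg's documented aliases for the "Link State Power Management" setting (run from an elevated prompt; the UEFI side still has to be enabled in your BIOS first):

```python
#!/usr/bin/env python3
# Sketch: set "PCI Express -> Link State Power Management" on the active
# Windows power plan via powercfg's documented aliases.
# 0 = Off, 1 = Moderate power savings (L0s), 2 = Maximum power savings (L1).
import subprocess

for args in (
    ["powercfg", "/setacvalueindex", "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM", "2"],
    ["powercfg", "/setdcvalueindex", "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM", "2"],
    ["powercfg", "/setactive", "SCHEME_CURRENT"],  # re-apply so the change sticks
):
    subprocess.run(args, check=True)
```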

BTW: yes, it's valid to criticise Intel for its bad power-saving optimizations, and there are people who don't feel confident or just don't want to 'mess' with their BIOS, even if it's not really rocket science, not at all...

Then again, on the other hand, ASPM is nothing new or exotic; it has existed for a very long time now and can have benefits / positive effects for some of the rest of your hardware too, if you want to save maximum power. Only AMD didn't officially have it for their AM4 platform at first...

And once it was patched in / unlocked in one of the AGESA updates way back, few vendors ever cared to implement/unlock it. MSI especially was reluctant, ignoring customer feedback for a long time and incapable of implementing it until recently, this past summer, when even MSI finally started to roll out ASPM on some AM4 boards. Only Gigabyte had it right from the start AFAIK, on most of its lineup, e.g. on X570-based boards etc.

So, regarding power consumption and driver stability for Arc dGPUs, not only on Windows but even on Linux, IMHO it could be way worse for their first try (in decades)...

And have you ever heard of AMD's power consumption problems with certain high-res/high-refresh-rate monitors on its 7000 series?

Or does anybody remember how bad ATI/AMD Radeon drivers were for Linux, a long time ago? ;)

PS: I also had an RX 6700 XT and an RTX 3060 Ti FE, and I DO NOT have more or fewer crashes with the Arc than I had before, I'd say.

Not that I care too much, but I do like the Arc's visual design; it's a fresh and welcome contrast to the usual gam0r LED-bling, rough-edged designs,

and unlike the RX 6700 / RTX 3060 Ti I had, I've noticed zero coil whine so far on the Arc.

But TBH, I simply bought the Arc because I was bored & curious enough, and I love new/exotic stuff; I didn't expect to experience zero "deal-breaking" issues or "game-breaking" problems.

(Of those I fortunately had only a few, like broken FSR in FH5, Starfield not launching / performing badly, and more recently The Crew Motorfest always crashing at the same spot in-game... I'm pretty sure not all of these were even caused by Arc/drivers; sometimes it might have been game developers' oversight as well, but most of the time it got sorted after a few weeks/months with new driver updates.)

So far my Arc experience has been a pretty pleasant & exciting one.
 
Joined
Dec 15, 2022
Messages
26 (0.04/day)
on my A770 16GB LE, it looks like this:

View attachment 317190

I have to say though, the above ~5 W with ASPM L1,
vs. the usually reported/measured ~40 W without,
From power testing at the wall, with ASPM on and the GPU reporting ~5-10 W, the card is still eating ~40 W.

The system without the A770, using the iGPU, pulls ~35 W at idle.
Add the Arc with ASPM working and 5-10 W reported in Windows, and it's idling at 70-80 W.

The metric the driver shows is power for the GPU chip only, not the whole card, meaning that Arc reports power like older AMD GPUs. The easiest way to get close to a fully instrumented card's power readings is to add roughly 30 W to whatever the software monitor shows you. The other hint that this is the case is the stock power limit: 190 W in software on a card officially specced at 225 W.
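Put as code, the rule of thumb above is just a fixed offset (the 30 W is this post's estimate for VRAM/VRM/misc rails, not a measured constant):

```python
def estimated_board_power(chip_power_w: float, offset_w: float = 30.0) -> float:
    """Approximate total board power from Arc's software sensor, which
    (per the observation above) reports GPU-chip power only. The 30 W
    offset for VRAM/VRM/misc rails is a rough estimate, not measured."""
    return chip_power_w + offset_w

# The figures from this post: ~5-10 W idle readings and the 190 W power limit.
for reading_w in (5, 10, 190):
    print(f"{reading_w:>3} W reported -> ~{estimated_board_power(reading_w):.0f} W at the card")
```

The 190 W limit maps to ~220 W by this estimate, which lines up with the official 225 W board spec.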
 

susie1818

New Member
Joined
Oct 11, 2023
Messages
3 (0.01/day)
From power testing at the wall, with ASPM on and the GPU reporting ~5-10 W, the card is still eating ~40 W.

The system without the A770, using the iGPU, pulls ~35 W at idle.
Add the Arc with ASPM working and 5-10 W reported in Windows, and it's idling at 70-80 W.

The metric the driver shows is power for the GPU chip only, not the whole card, meaning that Arc reports power like older AMD GPUs. The easiest way to get close to a fully instrumented card's power readings is to add roughly 30 W to whatever the software monitor shows you. The other hint that this is the case is the stock power limit: 190 W in software on a card officially specced at 225 W.
Doesn't sound logical to me. If this logic were correct, idle power consumption would have been 70 W with ASPM L1 not engaged, but the fact is that TechPowerUp used lab equipment, not software readings, to physically measure the total power consumption through the PCIe power connectors plus the PCIe slot's power lanes, and they actually got a figure of 37 W idle.

The 35 W (225-190) that should be added to the "GPU chip power draw" readout applies under full load. When idle, the VRAM and VRM are not supposed to dissipate that much power.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,453 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
On the performance-per-dollar graphs, why are you also showing the 580 at $110, $130 and $150? The narrative makes no mention of these, and I'm not sure what they are intended to show, other than the obvious 'if the card was cheaper than it is then it would offer better value'.
"if the card was cheaper than it is, how much better value would it offer?"
 
Joined
Oct 6, 2021
Messages
1,604 (1.52/day)
"if the card was cheaper than it is, how much better value would it offer?"
"If the card were cheaper than it is, how much better value would it offer?"

You're all wrong, folks. :p

The question I would ask would be this: if Intel is supposedly selling this at a loss now, how would it overcome the unbeatable price/performance of a product being sold at a loss in the next generation? It can't; it would be stupid. They need to enhance the GPU design and achieve better performance per area. And get rid of the AI-dedicated junk in gaming GPUs, especially in entry-level models :rolleyes:
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,453 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
If Intel is supposedly selling this at a loss now
Which may or may not be true. Nobody outside of Intel really knows, and operating at a loss to achieve something else is a reasonable business strategy for a lot of companies.

They need to enhance [...] design and achieve better performance per area
I think that is universally true for microprocessor design and fabrication.
 
Joined
Dec 15, 2022
Messages
26 (0.04/day)
Doesn't sound logical to me. If this logic were correct, idle power consumption would have been 70 W with ASPM L1 not engaged, but the fact is that TechPowerUp used lab equipment, not software readings, to physically measure the total power consumption through the PCIe power connectors plus the PCIe slot's power lanes, and they actually got a figure of 37 W idle.

The 35 W (225-190) that should be added to the "GPU chip power draw" readout applies under full load. When idle, the VRAM and VRM are not supposed to dissipate that much power.
We have seen ~70 W idle with ASPM off (100-110 W at the wall on the 35 W baseline system), so it's quite likely that in the review, power management was working as intended.

It's not just the VRM/VRAM; AFAIK there is also a misc voltage rail for some of the I/O on the chip as well.
 
Joined
Jan 27, 2015
Messages
1,688 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
It's kind of astonishing that they can sell a card with a 406 mm2 die and a 256-bit bus for less than $200 and still make a profit, even if on an older process. NVIDIA and AMD must be swimming in margins.

This is actually a very good card; they have improved the drivers so much. The power draw is horrible, though: half the performance of a 4070 while drawing more power. It's still a tolerable amount of heat output, but just barely.

It seems that the architecture is just extremely inefficient. The RX 7600 has the same performance as an A770 at half the die size on the same node. That's insane.

What could they achieve with Battlemage? To compete with the upper mid-range they would probably have to get close to 300 W, and for me that's unacceptable no matter how good the price is.

I do hope they stick with it, but I fear they will only be able to compete below my preferred tier of performance.

Arc has always been like that. If you look purely at hardware specs, they should be batting much higher than they are. The A770 should have been a 3090 competitor, and this A580 should have been a 3060 Ti competitor.

Honestly, it is very clear to me that Arc is a good piece of hardware; that's not their issue. I see that because, in many instances, it rips AMD/Nvidia up at its price point. Then, of course, it will fail at something else.

Many call it 'inconsistent', which is accurate, but I primarily see 'immature/inefficient drivers'. As long as Intel keeps chugging away at improving those drivers, and assuming they can keep up on the hardware side at the same time, that will change. I'd give it about two more years before they are serious competition for Nvidia and AMD.
 
Joined
Jul 5, 2013
Messages
26,245 (6.45/day)
Considering the attractive price of just $180, performance is good, too.
Another good review. Your attention to detail on this one was excellent! I also agree with the statement that it's a good value at its current price point. As far as the cons go, I'm one of those people who prefer the fans to be on all the time, even if at minimal speeds. It's a cooling peace-of-mind thing.

So now we compare prices with second hand market. Alrighty...
When has that not been an option?

I mean, where are you going to get a 1080 Ti new?
Right?
 
Joined
Jan 15, 2021
Messages
337 (0.26/day)
Another good review. Your attention to detail on this one was excellent! I also agree with the statement that it's a good value at its current price point. As far as the cons go, I'm one of those people who prefer the fans to be on all the time, even if at minimal speeds. It's a cooling peace-of-mind thing.


When has that not been an option?


Right?
Considering the card is not sold anymore, I realized my comment was super dumb.
 
Joined
Jul 5, 2013
Messages
26,245 (6.45/day)
Considering the card is not sold anymore I realized my comment was super dumb.
I wouldn't say "super dumb". It's just that the used market is always an option and part of the buying pool for most people. Comparing only to the new-card market ignores a large portion of people's options.
 
Joined
Jan 20, 2019
Messages
1,414 (0.69/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 10 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
@W1zzard, the last 5 GPU reviews (or more) are missing the 7700 XT in the performance charts. Can this be amended?
 
Joined
Jan 20, 2019
Messages
1,414 (0.69/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 10 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
The point of those omissions, I suspect, is to both motivate the user to read other reviews and to save some time for the reviewer.

I still fancy seeing all the current-gen GPUs from NV, AMD and Intel reflected in each review's benchmark performance charts, which I believe has been the customary approach with TPU reviews.
 