
ASRock Arc A580 Challenger

To TechPowerUp staff:

Do you guys know that Intel Arc GPUs support the ASPM L1 idle power-saving mode? Your idle power consumption tests apparently didn't take advantage of this feature.
 
A lot of motherboards don't expose those settings
It would have been better if they had mentioned this feature in the article and given test results from the respective configurations with and without the feature enabled.

Otherwise I would assume that they didn't know about it and weren't professional enough.

Not all of them do
At least Intel's own reference cards, the A770 LE and A750 LE, do, but this has not been shown in the comparison chart.

If TPU had tested this aspect, it would have helped people know whether this ASRock variant supports the feature or not.
 
In which universe does the writer of this review find a 1080 Ti at 170 dollars???

So now we compare prices with the second-hand market. Alrighty...
Correct. That's what I do for old SKUs, because they are end-of-life and not produced anymore. They are still relevant for the considerations in this review, at least in my opinion. If you disagree and only want to buy new, ignore everything relating to these cards.

At least Intel's own reference cards, the A770 LE and A750 LE, do, but this has not been shown in the comparison chart.

If TPU had tested this aspect, it would have helped people know whether this ASRock variant supports the feature or not.
I would assume they do, it's a GPU hardware/driver feature. 99.999% of people have no way to control this or know about it, so it's similar to "why don't you test more undervolting".
 
In which universe does the writer of this review find a 1080 Ti at 170 dollars???
A 3070 is 50% faster, supports DLSS and RT, is 4 years newer, and costs around $270, so why not? $170 for a 7-year-old, well-used card is a terrible idea; just pay $60 more and get a brand-new 7600, 20% faster with half the power consumption.
 
I would assume they do, it's a GPU hardware/driver feature. 99.999% of people have no way to control this or know about it, so it's similar to "why don't you test more undervolting".
People can learn and are curious. You could write about this feature and then the percentage of people who don't know about it or don't use it will decrease. ;)
 
Intel must be sinking so much cost for the card OEMs. Using a 396 mm² die fabbed at 6 nm (even with harvesting) and a 256-bit memory bus to compete with a ~$190 6600 that uses a 237 mm² die at 7 nm and a 128-bit memory bus seems like a good recipe for losing a lot of money per unit, which I imagine Intel is absorbing (no OEM would manufacture the cards if they lost money on them).
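To put rough numbers on that die-size gap, here's a back-of-the-envelope sketch (wafer prices and yields aren't public, so the die areas quoted above are the only real inputs; the rest is just the standard gross-die estimate):

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Classic gross-die estimate: wafer area / die area, minus an edge-loss term.
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r ** 2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# ACM-G10 (A580/A750/A770) vs Navi 23 (RX 6600), die areas as quoted above
for name, area_mm2 in [("ACM-G10, ~396 mm²", 396), ("Navi 23, ~237 mm²", 237)]:
    print(f"{name}: ~{gross_dies_per_wafer(area_mm2)} gross dies per 300 mm wafer")

# Intel gets roughly 40% fewer candidate chips out of every wafer before
# yield or the pricier N6 wafers are even taken into account.
```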

It does make me wonder how long Intel will be willing to continue doing that, when they seem intent on cutting every other non-core division.

On the performance per $ graphs, why are you also showing the 580 at $110, $130 and $150? The narrative makes no mention of these, and I'm not sure what they are intended to show, other than the obvious "if the card was cheaper than it is, then it would offer better value".
 
To TechPowerUp staff:

Do you guys know that Intel Arc GPUs support the ASPM L1 idle power-saving mode? Your idle power consumption tests apparently didn't take advantage of this feature.
it's interesting to see that almost every review across the net tests Arc mainly/only on boards with rBAR on, obviously, because you really need it
for max performance, it's 'mandatory' so to speak,

but at the same time they always test power draw / idle consumption with ASPM / L1 disabled.

very weird:

(attachment: PCGH Arc idle power chart)


on my A770 16GB LE, it looks like this:

(attachment: GPU-Z screenshot of the A770's idle power draw with ASPM on)

I have to say though, the above ~5 W with ASPM L1,
vs. the usually reported/measured ~40 W without,

applies in my case only at 100% idle, meaning not moving any window, not scrolling anything, yes, not even moving your mouse pointer at all.

especially power consumption in "light load" scenarios is still far from optimal,

and it's weird to see that scrolling some text, or moving your Firefox / MS Edge window
with the above link to PCGH's* own review of the A580 open,

costs almost as much power as watching/decoding a 4K/UHD 60 fps YouTube video :)

*(it's in German and, besides TechPowerUp, one of the first sites to have published a review of Intel's new card so far)

but it's also not always as bad as it seems to be in almost every review, depending on the situation,

and more importantly, it's also not that hard to flip a single switch or two in your UEFI BIOS & in Windows.
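For the Windows half, here's a minimal sketch (assuming the UEFI side, i.e. enabling ASPM / native ASPM for the PCIe slot, is already done in the BIOS; menu names vary by vendor):

```python
import subprocess

# Set "PCI Express > Link State Power Management" to "Maximum power savings"
# (ASPM L1) on the active Windows power plan. SUB_PCIEXPRESS and ASPM are
# powercfg's built-in aliases for this setting; 2 = L1, 1 = L0s only, 0 = Off.
for flag in ("/setacvalueindex", "/setdcvalueindex"):
    subprocess.run(["powercfg", flag, "SCHEME_CURRENT",
                    "SUB_PCIEXPRESS", "ASPM", "2"], check=True)

# Re-apply the current scheme so the change takes effect immediately.
subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)
```

(the same setting can of course be flipped by hand under Control Panel > Power Options > Change advanced power settings)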

btw: yes, it's valid to criticise Intel for its bad power-saving optimizations, and there are people who don't feel up to it or just don't want to 'mess' with their BIOS, even if it's not really rocket science, not at all..

then again, on the other hand, ASPM is no new or exotic thing; it has existed for a very long time now and can have benefits / positive effects for some of your other hardware too, if you want to save maximum power. It's been there for quite a while,
only AMD didn't officially have it on their AM4 platform at first..

and once it was patched in / unlocked in one of the AGESA updates way back, only a few vendors ever cared to implement/unlock it.
MSI especially was reluctant, ignoring customer feedback for a long time and not managing to implement it until recently, this past summer, when even MSI finally started to roll out ASPM on some AM4 boards.
Only Gigabyte had it right from the start AFAIK, on most of its lineup, e.g. on X570-based boards etc.

so, regarding power consumption and driver stability for Arc dGPUs, not only on Windows but even on Linux,
imho it could be way worse for their first try (in decades)..

and have you ever heard of AMD's power consumption problems with certain high-res / high-refresh-rate monitors on its 7000 series?

or does anybody remember how bad ATI/AMD Radeon drivers were on Linux, a long time ago? ;)

PS: I also had an RX 6700 XT and an RTX 3060 Ti FE, and I DO NOT have more or fewer crashes with the Arc than I had before, I'd say,

not that I care too much, but I do like the Arc's visual design; it's a fresh and welcome contrast to the usual gam0r LED-bling, rough-edged designs,

and unlike the RX 6700 / RTX 3060 Ti I had,
I've noticed zero coil whine so far on the Arc.

but tbh, I simply bought the Arc because I was bored & curious enough and I love new / exotic stuff; I didn't expect to experience zero "deal breaking" issues or "game breaking" problems.

(of those, I fortunately had only a few, like broken FSR in FH5, Starfield not launching / bad performance, and more recently The Crew Motorfest always crashing at the same spot in-game.. pretty sure not all problems are even caused by Arc/drivers, some might have been the game developers' oversight as well, but most of the time it got sorted/solved after a few weeks / months with new driver updates)

so far my Arc experience has been a pretty pleasant & exciting one.
 
on my A770 16GB LE, it looks like this:


I have to say though, the above ~5 W with ASPM L1,
vs. the usually reported/measured ~40 W without,
From power testing at the wall, with ASPM on and the GPU reporting ~5-10 W, the card is still eating ~40 W.

The system without the A770, using the iGPU, pulls ~35 W at idle.
Add the Arc with ASPM working and 5-10 W reported in Windows, and it's idling at 70-80 W.

The metric the driver is showing is power for the GPU chip only, not the whole card, meaning that Arc is reporting power like older AMD GPUs. The easiest way to get close to a fully instrumented card for power readings is to add roughly 30 W to whatever the software monitor shows you. The other hint that this is the case is the stock power limit: 190 W in software on a card officially specced at 225 W.
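In numbers, taking midpoints of the wall readings above (purely illustrative):

```python
baseline_idle_w   = 35.0   # system idle at the wall, iGPU only, no A770
with_arc_idle_w   = 75.0   # same system with the A770 installed, ASPM L1 working (70-80 W)
software_report_w = 7.5    # what the driver reports for the GPU chip (5-10 W)

card_draw_w  = with_arc_idle_w - baseline_idle_w   # ~40 W actually drawn by the card
unreported_w = card_draw_w - software_report_w     # roughly 30 W the software metric never sees
print(f"card draws ~{card_draw_w:.0f} W, software misses ~{unreported_w:.0f} W of it")
```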
 
From power testing at the wall, with ASPM on and the GPU reporting ~5-10 W, the card is still eating ~40 W.

The system without the A770, using the iGPU, pulls ~35 W at idle.
Add the Arc with ASPM working and 5-10 W reported in Windows, and it's idling at 70-80 W.

The metric the driver is showing is power for the GPU chip only, not the whole card, meaning that Arc is reporting power like older AMD GPUs. The easiest way to get close to a fully instrumented card for power readings is to add roughly 30 W to whatever the software monitor shows you. The other hint that this is the case is the stock power limit: 190 W in software on a card officially specced at 225 W.
Doesn't sound logical to me. If this logic were correct, idle power consumption would have been ~70 W when ASPM L1 is not engaged, but the fact is that TechPowerUp used lab equipment, not software readings, to physically measure the total power consumption through the PCIe power connectors plus the PCIe bus power lanes, and they actually got a figure of 37 watts at idle.

The 35 W (225-190) that should be added to the "GPU chip power draw" readout applies under full load. When idle, the VRAM and VRM are not supposed to dissipate that much power.
 
On the performance per $ graphs, why are you also showing the 580 at $110, $130 and $150? The narrative makes no mention of these, and I'm not sure what they are intended to show, other than the obvious "if the card was cheaper than it is, then it would offer better value".
"if the card was cheaper than it is, how much better value would it offer?"
 
"if the card was cheaper than it is, how much better value would it offer?"
"If the card were cheaper than it is, how much better value would it offer?"

You're all wrong, folks. :p

The question I would ask would be this: If Intel supposedly sold this at a loss now, how would it be able to overcome the unbeatable cost x performance of a product being sold at a loss in the next generation? They can't, it would be stupid. They need to enhance GPU design and achieve a better performance/area. And get rid of AI-dedicated junk in gaming GPUs, especially entry-level models :rolleyes:
 
If Intel supposedly sold this at a loss now
Which may or may not be true. Nobody outside of Intel really knows, and operating at a loss to achieve something else is a reasonable business strategy for a lot of companies.

They need to enhance [...] design and achieve a better performance/area
I think that is universally true for microprocessor design and fabrication
 
Doesn't sound logical to me. If this logic were correct, idle power consumption would have been ~70 W when ASPM L1 is not engaged, but the fact is that TechPowerUp used lab equipment, not software readings, to physically measure the total power consumption through the PCIe power connectors plus the PCIe bus power lanes, and they actually got a figure of 37 watts at idle.

The 35 W (225-190) that should be added to the "GPU chip power draw" readout applies under full load. When idle, the VRAM and VRM are not supposed to dissipate that much power.
We have seen ~70 W idle with ASPM off (100-110 W at the wall on the 35 W baseline system), so it's quite likely that in the review power management was working as intended.

It's not just VRM/VRAM; AFAIK there is also a misc voltage rail for some of the I/O on the chip as well.
 
It's kind of astonishing that they can sell a card with a 406 mm² die and a 256-bit bus for less than $200 and still make a profit, even if on an older process. NVIDIA and AMD must be swimming in margins.

This is actually a very good card, they have improved the drivers so much. The power draw is horrible, though. Half the performance of a 4070 while drawing more power. It's still a tolerable amount of heat output, but just barely.

It seems that the architecture is just extremely inefficient. The RX 7600 has the same performance as an A770 at half the die size on the same node. That's insane.

What could they achieve with Battlemage? To compete with the upper mid-range they would probably have to get close to 300 W, and for me that's unacceptable no matter how good the price was.

I do hope they stick with it, but I fear they will only be able to compete below my preferred tier of performance.

Arc has always been like that. If you look purely at hardware specs, they should be batting much higher than they are. The A770 should have been a 3090 competitor, and this A580 should have been a 3060 Ti competitor.

Honestly it is very clear to me that Arc is a good piece of hardware, that's not their issue. I see that because, in many instances, it rips AMD / Nvidia up at its price point. Then ofc, it will fail at something different.

Many call it 'inconsistent', which is accurate, but I primarily see 'immature / inefficient drivers'. As long as Intel keeps chugging at improving those drivers, and assuming they can keep up on the hardware side at the same time, that will change. I'd give it about 2 more years before they are serious competition to Nvidia and AMD.
 
Considering the attractive price of just $180, performance is good, too.
Another good review. Your attention to detail on this one was excellent! I also agree with the statement that it's a good value at its current price point. As far as the cons go, I'm one of those people who prefer the fans to be on all the time, even if at minimal speeds. It's a cooling-peace-of-mind thing.

So now we compare prices with the second-hand market. Alrighty...
When has that not been an option?

I mean, where are you going to get a 1080 Ti new?
Right?
 
Another good review. Your attention to detail on this one was excellent! I also agree with the statement that it's a good value at its current price point. As far as the cons go, I'm one of those people who prefer the fans to be on all the time, even if at minimal speeds. It's a cooling-peace-of-mind thing.


When has that not been an option?


Right?
Considering the card is not sold anymore I realized my comment was super dumb.
 
Considering the card is not sold anymore I realized my comment was super dumb.
I wouldn't say "super dumb". It's just that the used market is always an option and part of the buying pool for most people. Comparing only to the new-card market ignores a large portion of people's options.
 
@W1zzard, the last 5 GPU reviews (or more) are missing the 7700 XT in the performance charts. Can this be amended?
 
The point of those omissions, I suspect, is to both motivate the user to read other reviews and to save some time for the reviewer.

I still fancy seeing all the current-gen GPUs from NV, AMD, and Intel reflected in each review's benchmark performance charts, which I believe has been the customary approach with TPU reviews.
 