
Sparkle Arc A580 Orc

W1zzard

Sparkle's Arc A580 Orc is a compact dual-slot graphics card that will fit into all cases and offers performance that handles all titles at Full HD 1080p. The card comes with a nifty RGB lighting feature that indicates GPU temperature without requiring any additional software.

 
RX 6600-like performance for RX 6600-like money. Nice, but only about two years too late, Intel.

Awesome job as always, W1z. These GPUs are probably interesting to bench regardless.
 
A new RX 6600 costs $210; the price I listed is for a used card, because the card is so old.

True, though I see new ones pop up under $200 frequently. I can grab one right now for $202, and I've seen $180-190 regularly.

For me it would still be a toss-up. I would probably recommend the 6600 to people with less tinkering experience.

But you're the man when it comes to all that, so I'd definitely defer to you. I did see the positive points; I'm just not sure most buyers in the $200 price range care about that stuff.
 
just not sure most buyers in the $200 price range care about that stuff.
You are absolutely correct: price matters A LOT in this segment. That's why I think we'll see $150 for the A580 soon, as mentioned in the conclusion.
 
Wow, the cooler on this card is comically bad.
Performance is quite nice for the price, but that idle power consumption is honestly killing it.
I don't see why anyone should consider this card over the 6600 unless they really want to support Intel for some reason.
Or AV1 encoding, I guess.

I'm just a bit baffled by Intel's pricing strategy.
Most newcomers try to price their cards aggressively to build a user base, but Intel seems not to care at all and appears content not to sell much.
It's almost as if they have some kind of commitment to produce the cards but gave up on this generation's sales from the very start.
 
Is this the same Sparkle that made power supplies back in the day? Impressive RT performance, but performance per watt is pretty bad.
 
See the 256-bit memory bus at under $200, @Nvidia?
 
Is this the same Sparkle that made power supplies back in the day? Impressive RT performance, but performance per watt is pretty bad.

Pretty sure it's the same Sparkle that also made GPUs in general back in the day.

See the 256-bit memory bus at under $200, @Nvidia?

I like big buses and I cannot lie, just not on GPUs this weak...
 
I like big buses and I cannot lie, just not on GPUs this weak...
Even if you don't need it on this one, see how it closes some of the gap to the 4060 at 4K? Clearly 128 bits is not enough there.
 
Even if you don't need it on this one, see how it closes some of the gap to the 4060 at 4K? Clearly 128 bits is not enough there.

Sure, glad this $180 GPU is a 4K marvel at under $200... Although I would still take the 18% by which the 4060 beats it on average, plus DLSS at 4K, any day; but both GPUs are likely to run out of VRAM in modern games long before that matters.

Let's be real: this card struggles in some games at 1080p as it is, or maybe that's Intel's drivers in some games, who knows.

Good thing for Intel that the 6600 seems to be drying up: there aren't many models left, pricing is slowly creeping up, and the 3050 was always trash, so the competition isn't much better.
 
Sure, glad this $180 GPU is a 4K marvel at under $200... Although I would still take the 18% by which the 4060 beats it on average, plus DLSS at 4K, any day; but both GPUs are likely to run out of VRAM in modern games long before that matters.

Let's be real: this card struggles in some games at 1080p as it is, or maybe that's Intel's drivers in some games, who knows.

Good thing for Intel that the 6600 seems to be drying up: there aren't many models left, pricing is slowly creeping up, and the 3050 was always trash, so the competition isn't much better.
Oh, don't pretend you don't get it: the 4060 (Ti) is clearly held back by its 128-bit memory bus. Sure, it won't play the latest games maxed out at 4K, but a 256-bit bus would have given it a chance in some older games at that resolution.
 
The Radeon 7600 idles at 2 W, but this card idles at over 40 W. How did this get released?
 
Oh, don't pretend you don't get it: the 4060 (Ti) is clearly held back by its 128-bit memory bus. Sure, it won't play the latest games maxed out at 4K, but a 256-bit bus would have given it a chance in some older games at that resolution.

Not disagreeing with you, but neither card is a 4K gaming card; if someone wants to use it that way, good for them. Nvidia's choice to cut down its GPUs is crap, but this card specifically isn't a ringing endorsement for why they should have gone with a bigger bus. Now, if this card outperformed them while using less power and didn't require Resizable BAR to even function properly, sure.
 
Relatively poor performance at 1080p and 1440p makes this a bad choice for most people looking at a budget GPU for a 1080p or 1440p monitor.

Where it is going to perform better than the RX 6600/6600 XT/7600 is for people hooking it up to a 4K TV. Running at 4K with upscaling will suffer less on this card than on the AMD competition, and Nvidia isn't even really competing at this level. Its closest offering is the 3050, which isn't in the same ballpark most of the time, costing 40% more per frame than even the RX 6600, which this Arc A580 seems to improve on.
 
I don't get this card: it's priced too high, performs too low, and has ridiculous idle power consumption, which we already knew about from the A750/A770. Why not fix that in this much later release? And if it's so much a part of this architecture that it can't be fixed, then don't release this card; improve the architecture. Battlemage when?
 
ASRock has obliterated Sparkle's coolers on the Arc cards. They might not look quite as good, but it's hard to argue with that cooler comparison.

Though TUL, in my opinion, has consistently been a bit behind on cooler design/efficiency for a couple of generations now.
 
Lol. At this power consumption, and considering its name (yeah yeah, D&D), it should have been red to make it "fastah".
 
Oh, don't pretend you don't get it: the 4060 (Ti) is clearly held back by its 128-bit memory bus. Sure, it won't play the latest games maxed out at 4K, but a 256-bit bus would have given it a chance in some older games at that resolution.
While true, perhaps the 4060 (Ti) should have just had GDDR6X instead of a wider bus. I mean, it's not like we've ever seen 256-bit on this tier before; that'd be pretty wild in today's GPU balance. Even the top end isn't endowed with a 512-bit bus, but 384-bit plus GDDR6X. A saner width would have been 192-bit, or GDDR6X. The 4060 Ti is shit because it lacks both.
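To put rough numbers on that bus-width argument: peak memory bandwidth is just bus width times effective data rate. Here's a back-of-the-envelope sketch; the first three entries use the cards' published memory speeds, while the 192-bit GDDR6X config is purely hypothetical:

```python
# Back-of-the-envelope peak memory bandwidth:
#   bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps)
# First three entries use published memory speeds; the last is a
# hypothetical 192-bit GDDR6X configuration for comparison.
configs = [
    ("Arc A580, 256-bit GDDR6 @ 16 Gbps", 256, 16.0),
    ("RTX 4060, 128-bit GDDR6 @ 17 Gbps", 128, 17.0),
    ("RTX 4060 Ti, 128-bit GDDR6 @ 18 Gbps", 128, 18.0),
    ("hypothetical 192-bit GDDR6X @ 21 Gbps", 192, 21.0),
]

for name, bus_bits, gbps in configs:
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
```

That works out to 512 GB/s for the A580 versus 272-288 GB/s for the 4060 cards, which is the gap that shows up in the 4K scaling; a 192-bit GDDR6X part would have landed near the A580's figure.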

I don't get this card: it's priced too high, performs too low, and has ridiculous idle power consumption, which we already knew about from the A750/A770. Why not fix that in this much later release? And if it's so much a part of this architecture that it can't be fixed, then don't release this card; improve the architecture. Battlemage when?
In the end it's a GPU that plays games and might nudge some buyers over to Intel's measly market share. Intel wants volume over margin now, and for good reason. That said, there isn't exactly volume either; still, the potential for sales is MUCH higher for Intel in the low/mid range. People who are just looking for a cheap GPU to play games are going to be drawn to this, especially given the price point.
 
The RTX 4060/4060 Ti are put in the wrong boxes, hence the confusion they create among users.

GTS 250: G92b, 256-bit, $200, 260 mm², 2009
GTS 450: GF106, 128-bit, $130, 238 mm², 2010
GTX 550 Ti: GF116, 192-bit, $150, 238 mm², 2011
GTX 650: GK107, 128-bit, $110, 118 mm², 2012
GTX 750: GM107, 128-bit, $120, 148 mm², 2014
GTX 950: GM206, 128-bit, $160, 228 mm², 2015
GTX 1050: GP107, 128-bit, $110, 132 mm², 2016
GTX 1650: TU117, 128-bit, $150, 200 mm², 2019
RTX 3050: GA106, 128-bit, $250, 276 mm², 2022

RTX 4060: AD107, 128-bit, $300, 159 mm², 2023
RTX 4060 Ti: AD106, 128-bit, $400, 188 mm², 2023
 
For me it would still be a toss-up. I would probably recommend the 6600 to people with less tinkering experience.

It is not a toss-up. The Orc consumes 42 W at idle, while the RX 6600 consumes 3 W.
Even my 11-year-old GTX 650 beats the Orc by a large margin: it uses only 7 W at idle, 35 W less than the Orc.
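To put that idle gap in money terms, here's a rough sketch; the 8 idle hours per day and the EUR 0.30/kWh electricity price are assumptions, not figures from the review:

```python
# Rough yearly electricity cost of idle power draw.
# Assumed (not from the review): 8 idle hours/day, EUR 0.30 per kWh.
IDLE_HOURS_PER_DAY = 8
EUR_PER_KWH = 0.30

def yearly_idle_cost(idle_watts: float) -> float:
    kwh_per_year = idle_watts / 1000 * IDLE_HOURS_PER_DAY * 365
    return kwh_per_year * EUR_PER_KWH

for card, watts in [("Arc A580 Orc", 42), ("RX 6600", 3), ("GTX 650", 7)]:
    print(f"{card}: ~{yearly_idle_cost(watts):.0f} EUR/year idling")
```

Under those assumptions, the A580 costs roughly EUR 34 more per year than the RX 6600 just sitting at the desktop.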
 
It is not a toss-up. The Orc consumes 42 W at idle, while the RX 6600 consumes 3 W.
Even my 11-year-old GTX 650 beats the Orc by a large margin: it uses only 7 W at idle, 35 W less than the Orc.

I don't know how Intel intends to sell this shiete in the European Union with its strict, even fanatical, energy consumption rules.
In all honesty, this terrible idle consumption is grounds for a class-action lawsuit against Intel, stopping its products from entering the market and imposing hefty sanctions.

Send an official complaint against Intel:
 
Not disagreeing with you, but neither card is a 4K gaming card; if someone wants to use it that way, good for them. Nvidia's choice to cut down its GPUs is crap, but this card specifically isn't a ringing endorsement for why they should have gone with a bigger bus. Now, if this card outperformed them while using less power and didn't require Resizable BAR to even function properly, sure.
I was actually going to pull the trigger on one and leave my trusted 1060 behind. But then I thought, "$400 to get me a 128-bit bus? This ain't gonna age well," so I didn't.
As @Vayra86 hinted, 256-bit buses aren't common in this segment, but I did have one on my 460. After that I had the 660 Ti and this 1060, both with 192-bit memory buses. Paying twice what I used to and getting... that, just doesn't make sense.
 
Intel Arc isn't just good for gaming; it can also be used for more professional workloads such as GPU-accelerated rendering and AI, like Stable Diffusion. Here the A580 can make a difference thanks to the large amount of memory bandwidth available. Another use case is encoding, especially AV1: the Arc A580 is the most affordable GPU with hardware acceleration for AV1 video encode.

Isn't the A380 the most affordable one for AV1 encode?
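For anyone wanting to actually try the AV1 encoder on an Arc card, here's a minimal sketch driving ffmpeg's Quick Sync AV1 encoder from Python; it assumes an ffmpeg build with QSV support, and the file names are placeholders:

```python
# Minimal sketch: hardware AV1 encode on Intel Arc via ffmpeg's
# Quick Sync encoder (av1_qsv). Assumes ffmpeg was built with QSV
# support (libvpl/libmfx); input/output names are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mp4",      # source clip
        "-c:v", "av1_qsv",      # Intel hardware AV1 encoder
        "-b:v", "6M",           # target bitrate; tune to taste
        "-c:a", "copy",         # pass the audio through untouched
        "output_av1.mp4",
    ],
    check=True,
)
```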
 
Isn't the A380 the most affordable one for AV1 encode?

The Sparkle A750 is $189 right now on Amazon; if someone's going Intel, that's probably the better option in the US anyway.
 