
Sparkle Arc A770 ROC

I might be wrong, but it's my belief that the Sparkle is the cheapest of the A770s and the least performant.
The others run at higher clocks.
Personally I prefer the Intel design, not the partners.
 
That DLSS remark as a con, even with the qualifier, is not fair unless you also note on every Nvidia and AMD review their lack of XeSS and/or FSR (which of course work on other cards). DLSS is specifically an Nvidia tech. You can't blame a non-Nvidia card for not having DLSS.
Past reviews said ‘Not as good as DLSS’ but I guess saying what isn’t possible is somehow better. Who knew?
 
Not a future-proof card at all ... I think in 2 or 3 years you won't even be able to play at 1080p with this card :(
 
Just make a list of where every manufacturer comes from.
 
Not a future-proof card at all ... I think in 2 or 3 years you won't even be able to play at 1080p with this card :(
By that logic NONE of the cards in that price bracket are “future-proof”. Not to mention that performance future-proofing in GPUs is mostly a dead meme, unless we’re talking insane halo cards like the 4090 that can be relevant across multiple generations.

Just make a list of where every manufacturer comes from.
It will all be Taiwan and China. Pretty much the only exception would be PNY and… that’s it? I think?
 
This should be compared to the 7600XT. Not the 7600. Especially if you are using the VRAM argument to justify getting one.
 
I keep running the A770 against my venerable RX590, in the comparator sites, but the performance increase isn't enough to make me drop the cash on one. I mostly play older games anyway, but I would like to update at some point. I've got a working R9 290 in an older Linux rig still running fine.
 
This card should have been priced at $199 including taxes to have any meaningful value.
Otherwise, there is no point, since you can pay a little more for an RTX 4060, which also gets DLSS with Frame Generation.
I mean, it's a no-brainer....
 
Past reviews said ‘Not as good as DLSS’ but I guess saying what isn’t possible is somehow better. Who knew?
Ah yes, I remember using that wording, does it matter that much? All I want is for you to think "oh, no DLSS? What does this mean? Oh no framegen either? Hmm .. does this matter to me? Yes or no"
 
By that logic NONE of the cards in that price bracket are “future-proof”. Not to mention that performance future-proofing in GPUs is mostly a dead meme, unless we’re talking insane halo cards like the 4090 that can be relevant across multiple generations.

Sorry, but at 1080p ... you don't need a high-end GPU to keep playing for many years!
 
@mamide
…okay, then explain why the A770 is any less "future-proof" in your view than its direct rivals - the 7600 and the 4060. I must not be understanding something here. What are we comparing against here?
 
This card should have been priced at $199 including taxes to have any meaningful value.
Otherwise, there is no point, since you can pay a little more for an RTX 4060, which also gets DLSS with Frame Generation.
I mean, it's a no-brainer....

Not sure about this price, but right now on Newegg the 16GB model sits at $229 US :twitch:


@W1zzard

Good test, but maybe you could also try testing DXVK vs. DX11 and VKD3D vs. DX12, because Intel GPUs work better with Vulkan than with DX

:)
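
For anyone curious to try such a comparison themselves, here's a minimal sketch of how the DXVK side of that A/B test could be set up on Windows; the DXVK_DIR and GAME_DIR paths are hypothetical, and it assumes a 64-bit D3D11 title with a DXVK release already extracted locally:

```python
# Minimal sketch, not a definitive how-to: A/B-testing a 64-bit D3D11 game
# with and without DXVK on Windows. DXVK_DIR and GAME_DIR are hypothetical
# paths; a DXVK release is assumed to be extracted to DXVK_DIR already.
# Dropping DXVK's d3d11.dll/dxgi.dll next to the game .exe routes the game's
# D3D11 calls through Vulkan; removing them restores the native D3D11 path.
import shutil
from pathlib import Path

DXVK_DIR = Path(r"C:\tools\dxvk\x64")        # hypothetical DXVK extract location
GAME_DIR = Path(r"C:\Games\SomeD3D11Game")   # hypothetical game folder
DLLS = ["d3d11.dll", "dxgi.dll"]             # DXVK's D3D11/DXGI replacements

def enable_dxvk() -> None:
    for dll in DLLS:
        shutil.copy2(DXVK_DIR / dll, GAME_DIR / dll)

def disable_dxvk() -> None:
    for dll in DLLS:
        (GAME_DIR / dll).unlink(missing_ok=True)

# Run the game's built-in benchmark once after enable_dxvk() and once after
# disable_dxvk(), then compare the recorded average FPS between the two runs.
```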
 
Curious to see if Battlemage is Intel's last GPU. Cutbacks and layoffs at Intel after their disastrous 13th/14th gen issues and underwhelming Core Ultra launch mean that the future of their dedicated graphics cards was looking pretty uncertain a month ago.
 
They are playing the long game.

Their main issue is the drivers, specifically that AMD and Nvidia drivers pretty much have optimizations for every game ever. Another aspect is that Nvidia and AMD have a crew of driver coders with vast experience, while Intel is building that out from mostly scratch.

The hardware here should be competitive with the 4070 and 6700 XT. There are only a few games where this really shows, but it is a feat that never happens with, for example, a 6600 XT or 3060 Ti.

What this review mostly shows is where they are in that driver and driver team build-out. Looks like at least a couple more years to go.
This is utterly irrelevant; if people are looking at two cards at about the same price, they'd rather play it safe with Nvidia (or even AMD, but this is AMD's problem as well).
 
I might be wrong, but it's my belief that the Sparkle is the cheapest of the A770s and the least performant.
You'd be wrong, on both points (no offense intended). They are the most expensive currently and the performance is nearly identical. Unless by "cheap" you meant lower quality. That's a bit subjective, but I would not disagree. This is where ASRock also gets a win, IMPO.

Personally I prefer the Intel design, not the partners.
That would not be unwise. Intel's offerings seem to be very good. And they look good.

"oh, no DLSS? What does this mean? Oh no framegen either? Hmm .. does this matter to me? Yes or no"
Some of us don't use either, so it doesn't factor in.
 
Let's be fair, this A770 is competing with the 3060 Ti and 4060
That's an optimistic interpretation of the results. Its performance averages closer to the regular 3060 than to the 3060 Ti.

At 1080p, 3% delta from the 3060, 25% delta from the 3060 Ti

At 1440p, 11% delta from the 3060, 16% delta from the 3060 Ti
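
For anyone wanting to reproduce that kind of comparison, here's a minimal sketch of the arithmetic; the average-FPS figures in it are hypothetical placeholders, not numbers taken from the review:

```python
# Minimal sketch of how such percentage deltas fall out of average-FPS
# figures. The numbers below are hypothetical placeholders, not the
# review's actual data.
avg_fps_1080p = {
    "A770":        100.0,
    "RTX 3060":     97.0,
    "RTX 3060 Ti": 125.0,
}

def delta(baseline: str, rival: str) -> float:
    """Percentage by which `rival` leads (+) or trails (-) `baseline`."""
    return (avg_fps_1080p[rival] / avg_fps_1080p[baseline] - 1.0) * 100.0

for rival in ("RTX 3060", "RTX 3060 Ti"):
    print(f"A770 vs {rival}: {delta('A770', rival):+.1f}%")
```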

 
This card should have been priced at $199 including taxes to have any meaningful value.
Otherwise, there is no point, since you can pay a little more for an RTX 4060, which also gets DLSS with Frame Generation.
I mean, it's a no-brainer....
The A770 was originally a $400 card in its 16GB form. I doubt they would be breaking even at $200.
I wonder. Is Arc's problem:

massive bugs (shades of AMD's claim with RDNA3)
design that scales horribly
woefully inefficient

All 3? And more? The tiny preview of this for me was the NUC6 line 8 years ago. That had the first Iris Plus iGPU in the Skylake i5 NUC (currently my TV server's front end) with 384 cores + 64MB eDRAM at 20W TDP, easily beating the older-gen Crystalwell Iris Pro with 320 cores + 128MB eDRAM using 47W. So seemingly a good design. But then there was the upmarket stablemate i7 Skylake NUC with 512 cores + 128MB eDRAM, 25% faster DDR4 and a 45W TDP. Which was only 15-20% faster. That is damn poor scaling.

I thought this apparent scaling issue would have been fixed 2 GPU architectures later but apparently not. I hope Battlemage is considerably better.
My hat is split between "arch problem" and "driver problem". The two are linked; IIRC the Intel devs have to work around Arc's issues with every driver, hence why drivers are seeing huge improvements per game but not across the board. It's a work-intensive way of doing it; hopefully Battlemage fixes it.
TBF the A770 is on 6nm, so a little more compact than the N7 process the RX 6000 dies went with, but not the N5/N4 the RX 7000 and Ada dies are going with. The 7600 is N6 as well.

The 6700 XT on 7nm is 237mm2. The 6900 XT's N21 die was 520mm2.

So really a huge performance failure in raster workloads.
You are absolutely right, it is a last-gen product, so it may not be super fair to compare it to a 4060.

It doesn't get much better though. The 237mm2 6700 XT is 30% faster at 1080p than the 406mm2 A770 here. The 520mm2 6900 XT is 90% faster, nearly twice the Intel chip. The 397mm2 3070 is 44% ahead.

We chalked that up to driver issues at launch, but I'm seriously thinking something is fundamentally wrong with the Alchemist design that is choking performance. On paper this thing should be way faster than it is.
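
To put those die-size numbers side by side, here's a rough sketch of relative performance per mm2; the performance ratios are rounded from the percentages quoted in this post and are illustrative rather than exact review figures:

```python
# Rough sketch of performance per die area, using the die sizes quoted above.
# Relative performance is normalized to the A770 = 1.00 and rounded from the
# percentages mentioned in this post; treat the values as illustrative.
cards = {
    #  name          (die area mm2, rel. perf vs A770)
    "A770":          (406, 1.00),
    "RX 6700 XT":    (237, 1.30),   # "~30% faster at 1080p"
    "RTX 3070":      (397, 1.44),   # "~44% ahead"
    "RX 6900 XT":    (520, 1.90),   # "~90% faster"
}

for name, (area, perf) in cards.items():
    print(f"{name:12s} {perf / area * 1000:.2f} relative perf per 1000 mm2")
```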

Anyone understand why power consumption is so poor?
The A770 is a 406mm2 6nm card. It's bigger than an RTX 3070 and made on a last-gen node. Its power use is gonna look like poo next to ~200mm2 5/4nm cards.
 
That's an optimistic interpretation of the results. Its performance averages closer to the regular 3060 than to the 3060 Ti.

At 1080p, 3% delta from the 3060, 25% delta from the 3060 Ti

At 1440p, 11% delta from the 3060, 16% delta from the 3060 Ti

Try reading that review a little closer. You're missing a few things. Important things, context things.
 
Anyone understand why power consumption is so poor?
For whatever reason the idle clocks are very high on the A770 and they haven't been able to reduce them.
The A770's GPU and memory idle at 1000/2187 MHz,

compared to 210/100 MHz for the RTX 4060,

or 0/4 MHz for the RX 7600
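
If anyone wants to check the idle clocks on their own card, here's a minimal sketch for Linux, assuming the i915 driver exposes the card as card0 (the sysfs path is an assumption and can vary by kernel/driver); on Windows, GPU-Z reports the same GPU and memory clocks:

```python
# Minimal sketch: read the GPU's current core clock from sysfs on Linux.
# Assumes the i915 driver with the Arc card exposed as card0; the exact
# sysfs path can differ between kernel versions and drivers, so treat it
# as an assumption rather than a guarantee.
from pathlib import Path

freq_node = Path("/sys/class/drm/card0/gt_cur_freq_mhz")

if freq_node.exists():
    print(f"Current GPU clock: {freq_node.read_text().strip()} MHz")
else:
    print("Frequency node not found; the path may differ on this kernel/driver.")
```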

Try reading that review a little closer. You're missing a few things. Important things, context things.
Such as?
 
Well, what did I say earlier(that you quoted)? With that in mind what do you think you missed? Do you think I pulled what I said out of my backside or do you think there's some context there?
(hint, I'm not going to hold your hand...)
 
Well, what did I say earlier(that you quoted)? With that in mind what do you think you missed? Do you think I pulled what I said out of my backside or do you think there's some context there?
(hint, I'm not going to hold your hand...)
If you mean the price, then you're missing what I'm getting at. You said the A770 competes with the 4060 and the 3060 Ti. The 4060 I think is a fair comparison, which is why I didn't mention that, but the A770 does not compete with the 3060 Ti in either performance or price. They aren't the same tier of card.
 
but the A770 does not compete with the 3060 Ti in either performance
And that's what you missed. Go back and re-read the review. What I'm getting at is that you are making your statement based on the general averages, whereas what I was saying earlier is based on individual benchmarks, many of which favor the A770 over certain other GPUs. Put another way, it's very dependent on the game.
 