
Sparkle Arc A770 ROC

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,988 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The Sparkle Arc A770 ROC is a custom design variant of the Intel Arc A770. It comes with a capable cooler design that's still compact enough to fit into all cases. How does Intel's discrete GPU do in late 2024? Is it a good alternative to RTX 4060 and RX 7600? Read on to find out.

 
Joined
Nov 27, 2023
Messages
2,619 (6.43/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
Still too driver- and game-dependent, inconsistent, and it could still stand to shave 20-30 bucks off the price. But I will give Intel this - they are miles ahead of the shitshow these cards were at launch, so good on them. Fingers crossed that Battlemage actually manages to be a contender, at least for the entry to mid-range segment.

Oh, and:
“While both RTX 4070 and RX 7600 have "only" 8 GB VRAM, our benchmark results take that fact into account and the A770 with 16 GB cannot offer a meaningful advantage (at 1080p Full HD).”

I assume 4060 is what’s meant there in the Conclusion.
 
Joined
Jan 20, 2019
Messages
1,604 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
So: roughly double the power consumption of the 4060/7600, inferior performance, and all that while being NOISIER. At the very least, Intel could have lowered the price to make it a more attractive option.

Do we really need 16GB at this performance level? Perhaps some workloads justify it? For gaming it seemed pointless when AMD/NVIDIA dropped their 16GB entry-level cards, and now Intel has done the same. At 4K the extra memory helps relative to 8/12GB cards, but overall performance at higher resolutions SUCKS!

On the bright side, more entry-level cards are always a welcome addition. Over time, this should help position them where they truly belong... in the ~$200 price range.

Let's hope these less-than-stellar efforts don't discourage Intel from continuing its uphill battle(mage) to break into and thrive in the GPU market. We need more competition to challenge the top dogs (or dog).
 
Last edited:
Joined
Apr 8, 2010
Messages
1,012 (0.19/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
Interesting timing to review this with Battlemage (supposedly) on the horizon.
 
Joined
Mar 14, 2014
Messages
1,442 (0.36/day)
Processor 11900K
Motherboard ASRock Z590 OC Formula
Cooling Noctua NH-D15 using 2x140mm 3000RPM industrial Noctuas
Memory G. Skill Trident Z 2x16GB 3600MHz
Video Card(s) eVGA RTX 3090 FTW3
Storage 2TB Crucial P5 Plus
Display(s) 1st: LG GR83Q-B 1440p 27in 240Hz / 2nd: Lenovo y27g 1080p 27in 144Hz
Case Lian Li Lancool MESH II RGB (I removed the RGB)
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply Seasonic Prime 850 TX
Mouse Glorious Model D
Keyboard Glorious MMK2 65% Lynx MX switches
Software Win10 Pro
The Witcher 3 is still one of the best tells of what a card is capable of.
That test needs to stay in there as long as GPUs are still tested. It scales better than any other game and it even has near perfect SLI scaling.
 
Joined
Aug 11, 2020
Messages
28 (0.02/day)
That DLSS remark as a con, even with the qualifier, is not fair unless you also note the lack of XeSS and/or FSR (which of course work on other cards) in every Nvidia and AMD review. DLSS is specifically an Nvidia tech. You can't blame a non-Nvidia card for not having DLSS.
 
Last edited:
Joined
Jul 20, 2020
Messages
1,165 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
The Witcher 3 is still one of the best tells of what a card is capable of.
That test needs to stay in there as long as GPUs are still tested. It scales better than any other game and it even has near perfect SLI scaling.

An old DX11 game where this GPU just happens to perform 33% better than its usual low FPS? Sure, it may suggest how a GPU behaves in older games (though maybe just this one older game), but it's not indicative of anything current in this case, other than how to optimize a GPU for a single game.
 
Joined
Nov 22, 2023
Messages
256 (0.62/day)
So: roughly double the power consumption of the 4060/7600, inferior performance, and all that while being NOISIER. At the very least, Intel could have lowered the price to make it a more attractive option.

Do we really need 16GB at this performance level? Perhaps some workloads justify it? For gaming it seemed pointless when AMD/NVIDIA dropped their 16GB entry-level cards, and now Intel has done the same. At 4K the extra memory helps relative to 8/12GB cards, but overall performance at higher resolutions SUCKS!

On the bright side, more entry-level cards are always a welcome addition. Over time, this should help position them where they truly belong... in the ~$200 price range.

Let's hope these less-than-stellar efforts don't discourage Intel from continuing its uphill battle(mage) to break into and thrive in the GPU market. We need more competition to challenge the top dogs (or dog).

- At a ~400mm2 die, it's obvious that the performance level was supposed to be closer to the 6800/3070, where 16GB of RAM would have made more sense.

At this point I have to assume the 16GB is there to snag low-info folks who think more RAM at this price point means the card is better.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,988 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
“While both RTX 4070 and RX 7600 have "only" 8 GB VRAM, our benchmark results take that fact into account and the A770 with 16 GB cannot offer a meaningful advantage (at 1080p Full HD).”
Yeah, typo, fixed

That test needs to stay in there as long as GPUs are still tested
Confirmed, it's definitely staying for the 2025.1 Test System :)

That DLSS remark as a con, even with the qualifier, is not fair unless you also note the lack of XeSS and/or FSR (which of course work on other cards) in every Nvidia and AMD review. DLSS is specifically an Nvidia tech. You can't blame a non-Nvidia card for not having DLSS.
DLSS is the best upscaler, and frame generation especially is a huge selling point. While I can't blame others for not having "DLSS", I can blame them for not having a better upscaler/framegen than DLSS.
 
Joined
Aug 11, 2020
Messages
28 (0.02/day)
Yeah, typo, fixed


Confirmed, it's definitely staying for the 2025.1 Test System :)


DLSS is the best upscaler, and frame generation especially is a huge selling point. While I can't blame others for not having "DLSS", I can blame them for not having a better upscaler/framegen than DLSS.
I mean, in the end you can write what you like. I don't agree with it, but I get it.
 
Joined
Feb 28, 2024
Messages
73 (0.23/day)
I really think they should play the long game, stop chasing this-quarter profits, and cut the price. It's really hard to get into the GPU market; AMD has been at it for YEARS and has only a few % of the market. I'd love to see more competition, but to get in basically from scratch, Intel needs to accept that it won't be profitable right away and sell at close to cost to build market share, until they're competitive with Nvidia and AMD, which I believe will take a few more R&D cycles. Until then, the only thing they have to compete on is price.
 
Joined
Jan 9, 2023
Messages
338 (0.46/day)
Considering the 7600 XT is coming very close in price now, this thing is just... not it, man.
Sparkle also messed up a lot with this card. Slamming the fan speed to 1200 RPM when the card hits 56°C? What year is it? 2015?
And good lord, the aggressive fan profile. I thought Gigabyte had some stinkers in the 7000 series, but get a load of Sparkle.
Or not, since nobody has this thing, and the ones that actually bought it are too embarrassed to talk about it or rigged their own solution.
This thing sucks. Sparkle sucks and the A770 sucks. Do better.
 
Joined
Jan 27, 2015
Messages
1,747 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
I really think they should play the long game, stop chasing this-quarter profits, and cut the price. It's really hard to get into the GPU market; AMD has been at it for YEARS and has only a few % of the market. I'd love to see more competition, but to get in basically from scratch, Intel needs to accept that it won't be profitable right away and sell at close to cost to build market share, until they're competitive with Nvidia and AMD, which I believe will take a few more R&D cycles. Until then, the only thing they have to compete on is price.

They are playing the long game.

Their main issue is the drivers, specifically that AMD and Nvidia drivers pretty much have optimizations for every game ever. Another aspect is that Nvidia and AMD each have a crew of driver coders with vast experience, while Intel is building that out mostly from scratch.

The hardware here should be competitive with the 4070 and 6700 XT. There are only a few games where this really shows, but when it does, it's a feat that, for example, a 6600 XT or 3060 Ti never manages.

What this review mostly shows is where they are in that driver and driver team build-out. Looks like at least a couple more years to go.
 
Joined
Mar 14, 2014
Messages
1,442 (0.36/day)
Processor 11900K
Motherboard ASRock Z590 OC Formula
Cooling Noctua NH-D15 using 2x140mm 3000RPM industrial Noctuas
Memory G. Skill Trident Z 2x16GB 3600MHz
Video Card(s) eVGA RTX 3090 FTW3
Storage 2TB Crucial P5 Plus
Display(s) 1st: LG GR83Q-B 1440p 27in 240Hz / 2nd: Lenovo y27g 1080p 27in 144Hz
Case Lian Li Lancool MESH II RGB (I removed the RGB)
Audio Device(s) AKG Q701's w/ O2+ODAC (Sounds a little bright)
Power Supply Seasonic Prime 850 TX
Mouse Glorious Model D
Keyboard Glorious MMK2 65% Lynx MX switches
Software Win10 Pro
The greatest game ever made, you mean? That just so happens to be one of the best-optimized titles ever? Cyberpunk 2077 scales just as well; it's just not the greatest game ever made.
Absolutely impeccable scaling on modern GPUs from an older title also makes it a great test of the history of GPU advancement.
The 33% statement just reiterates the greatness of TW3, btw, and the failure of modern games' optimization, and cements its importance as a test. It's an academic choice, not a promotional one.
Look at Cyberpunk RT if you want a modern example.
An old DX11 game where this GPU just happens to perform 33% better than its regular low FPS? Sure, it may suggest how a GPU behaves in older games though maybe just one older game... but it's not indicative of anything current in this case other than how to optimize a GPU for a single game.
 
Joined
Apr 9, 2021
Messages
71 (0.05/day)
System Name desktop
Processor Ryzen 9800X3d
Motherboard Asrock Pro Rs
Cooling Noctua NH-D15
Memory G.Skill Flare X5 6000mhz cl30 1.35v
Video Card(s) Asus GeForce RTX 4080 TUF Gaming - OC Edition
Storage 2TB WD_BLACK SN850X NVMe
Display(s) Acer Predator 1440p
Case Fractal R6
Some cheap low profile card would be nice, but not this.
 
Joined
Jul 20, 2020
Messages
1,165 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
The greatest game ever made, you mean? That just so happens to be one of the best-optimized titles ever? Cyberpunk 2077 scales just as well; it's just not the greatest game ever made.
Absolutely impeccable scaling on modern GPUs from an older title also makes it a great test of the history of GPU advancement.
The 33% statement just reiterates the greatness of TW3, btw, and the failure of modern games' optimization, and cements its importance as a test. It's an academic choice, not a promotional one.
Look at Cyberpunk RT if you want a modern example.

The A770's performance in Cyberpunk is at least closer to its average, only about 8-9% above its overall average. That's a more reasonable game to use for a current performance evaluation, as it uses DX12 and Intel is likely to optimize for new games the way Nvidia and AMD do.

TW3 being an excellent and long-popular title means it got the gold optimization treatment from Intel, but for that same reason TW3 is a poor guideline for Arc performance in older games, as there are many where Intel will do no optimization at all and some where they'll do... just enough to give a decent framerate.
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
27,168 (3.84/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling SilverStone XE360-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) 49" Philips Evnia OLED (49M2C8900)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on schitt Gunnr
Power Supply Seasonic Prime TX-1600
Mouse Razer Viper mini signature edition (mercury white)
Keyboard Monsgeek M3 Lavender, Moondrop Luna lights
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.
Could you please pass along a message for them to continue the Luna ROC line? It looks the best.
 
Joined
Dec 28, 2012
Messages
4,020 (0.92/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
- At a ~400mm2 die, it's obvious that the performance level was supposed to be closer to the 6800/3070, where 16GB of RAM would have made more sense.

At this point I have to assume the 16GB is there to snag low-info folks who think more RAM at this price point means the card is better.
For reference, if anyone is interested: the RTX 4080 is 379mm2, the 4060 is 159mm2, and the 7600 is 204mm2.

The A770 is 400mm2 and somehow slower than the 4060. Something is clearly wrong with the Arc design or its drivers, or both.
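Those die sizes make the mismatch easy to quantify. A minimal sketch (Python), using only the areas quoted above; performance ratios are deliberately left out since they vary per game:

```python
# Die areas (mm^2) as quoted in this thread; this only compares silicon
# budget, not performance, which varies per game.
dies = {"RTX 4080": 379, "RTX 4060": 159, "RX 7600": 204}
a770_area = 400

for name, area in dies.items():
    # Ratio of A770 silicon to each competitor's die
    print(f"A770 / {name}: {a770_area / area:.2f}x the die area")
```

The A770 spends roughly 2.5x the silicon of the 4060 while landing below it in the charts, which is the point being made above.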

Still too driver and game dependent, inconsistent and still could stand to shave off 20-30 bucks off the price. But I will give Intel this - they are miles ahead of the shitshow these cards were at launch, so good on them. Fingers crossed that Battlemage actually manages to be a contender at least for the entry to mid-range segment.

Oh, and:
“While both RTX 4070 and RX 7600 have "only" 8 GB VRAM, our benchmark results take that fact into account and the A770 with 16 GB cannot offer a meaningful advantage (at 1080p Full HD).”

I assume 4060 is what’s meant there in the Conclusion.
I'll give them credit for improvement, but ding them because, frankly, they have the resources to fix this. For Arc to still have this much trouble competing years after release is just pathetic from a company with an R&D budget 3x higher than AMD's.
 
Joined
Jul 20, 2020
Messages
1,165 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
I'll give them credit for improvement, but ding them because, frankly, they have the resources to fix this. For Arc to still have this much trouble competing years after release is just pathetic from a company with an R&D budget 3x higher than AMD's.

I wonder. Is Arc's problem:

massive bugs (shades of AMD's claim with RDNA3)
design that scales horribly
woefully inefficient

All 3? And more? The tiny preview of this for me was the NUC6 line 8 years ago. That had the first Iris Plus iGPU in the Skylake i5 NUC (currently my TV server's front end) with 384 cores + 64MB eDRAM at a 20W TDP, easily beating the older-gen Crystal Well Iris Pro with 320 cores + 128MB eDRAM using 47W. So seemingly a good design. But then there was the upmarket stablemate i7 Skylake NUC with 512 cores + 128MB eDRAM, 25% faster DDR4, and a 45W TDP, which was only 15-20% faster. That is damn poor scaling.

I thought this apparent scaling issue would have been fixed 2 GPU architectures later but apparently not. I hope Battlemage is considerably better.
 
Joined
Nov 22, 2023
Messages
256 (0.62/day)
For reference, if anyone is interested: the RTX 4080 is 379mm2, the 4060 is 159mm2, and the 7600 is 204mm2.

The A770 is 400mm2 and somehow slower than the 4060. Something is clearly wrong with the Arc design or its drivers, or both.


I'll give them credit for improvement, but ding them because, frankly, they have the resources to fix this. For Arc still having this much trouble competing years after release is just pathetic from a company that has a larger R+S budget 3x higher then AMD.

- TBF, the A770 is on N6, so a little denser than the N7 process the RX 6000 dies used, though not as dense as the N5/N4 the RX 7000 and Ada dies use. The 7600 is N6 as well.

The 6700 XT on N7 is 237mm2; the 6900 XT's N21 die was 520mm2.

So really a huge performance failure in raster workloads.
 
Joined
Jul 5, 2013
Messages
28,450 (6.77/day)
Let's be fair: this A770 is competing with the 3060 Ti and 4060, both of which are $280 or more. So it's not an unfair price, but yes, the other cards are a much better value at their $230 price. They are THE value cards right now.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
13,122 (2.98/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / console
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.5TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CN720N
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
Luckily I didn't get an A770 two years ago when I was buying a ~500EUR GPU; I ended up with an RX 6700 XT, and that seems to beat this even two years later.
 
Joined
Jun 6, 2020
Messages
238 (0.14/day)
Let's be fair: this A770 is competing with the 3060 Ti and 4060, both of which are $280 or more. So it's not an unfair price, but yes, the other cards are a much better value at their $230 price. They are THE value cards right now.
Why shouldn't this compete against the other A770s, though? What makes this one $50 more special than the other two?
 