Wednesday, July 19th 2023

ASRock Quietly Releases 16GB Arc A770 Phantom Gaming Graphics Card

Without a formal announcement, or even a product page on its website, ASRock has quietly released a 16 GB version of its Arc A770 Phantom Gaming graphics card. Until now, the company's A770 lineup has consisted of just one model, the Phantom Gaming OC, with 8 GB of 16 Gbps GDDR6 memory across the GPU's 256-bit memory interface. The new 16 GB model is one of very few of its kind released by Intel's board partners, in an attempt to make the A770 an attractive option for those in the market for graphics cards capable of 1080p through 1440p gaming.

Intel's reference designs for the Arc 7 series included the A770 Limited Edition, a card with 16 GB of 17.5 Gbps GDDR6 memory, while 8 GB of 16 Gbps memory was originally planned as the standard configuration for the A770. It remains to be seen whether the new ASRock Phantom Gaming OC uses 17.5 Gbps or 16 Gbps memory. The only other 16 GB custom-design A770 graphics card available in the West is the Acer Predator Arc A770 BiFrost 16 GB. The ASRock A770 16 GB PG was discovered on US retailer Newegg, where it is listed for $329.
Source: VideoCardz

21 Comments on ASRock Quietly Releases 16GB Arc A770 Phantom Gaming Graphics Card

#2
Ruru
S.T.A.R.S.
Looks pretty fine, also as Arc's performance has got better with newer drivers, that's not a bad card for that price.
MachineLearningI don't like the looks of this card but it should have the best cooler of any A770. Considering it.

The Acer Bifrost A750 is also now on Newegg. It looks slightly different from the A770 Bifrost; I prefer this aesthetic. No clue how the cooler is changed, if at all.
www.newegg.com/acer-dp-z35ww-p01/p/N82E16814553002?Item=N82E16814553002&Source=socialshare&cm_mmc=snc-social-_-sr-_-14-553-002-_-07192023
Isn't the BiFrost pretty meh though it looks hella cool (pun intended) :/
#3
deb
MachineLearningI don't like the looks of this card but it should have the best cooler of any A770. Considering it.
I don't disagree with this, but it's in line with the current main trend in video card design, which hopefully means broader appeal.

Very interested to see where Intel continues to go, as always.
#4
Solaris17
Super Dainty Moderator
KissamiesIsn't the BiFrost pretty meh though it looks hella cool (pun intended) :/
ehhh yes and no. I think the "general" performance indicator for physical build has been the cooler. The thing is all of the current coolers can handle ARC without an issue. So all of the designs "seem meh" but the real answer is none of them are better than the other....and none of them really need to be.

You're talking maybe a few C on coolers that can easily keep the GPU 10+ C from its max at all times.
#5
InVasMani
Nice to see a triple-fan Arc A770 16GB card. I'm rather curious about DirectStorage going forward and whether a separate GPU or even an iGPU could be used for that purpose. It makes for some interesting scenarios with GPUs designed intentionally for accelerated storage devices.

Found some info on specs: 17.5 Gbps memory with 559.9 GB/s of bandwidth. It seems to have a base speed of 2200 MHz, so possibly a boost of 2500 MHz, or maybe the same 2400 MHz as the Limited Edition. Either way, at least the base speed is a little higher, so maybe fewer frame-rate dips, perhaps thanks to the better cooler. I'm speculating somewhere between 2400 MHz and 2500 MHz boost if the rest of the information is accurate.
videocardz.net/asrock-arc-a770-16gb-phantom-gaming-oc
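For what it's worth, the 559.9 GB/s figure checks out against the 256-bit bus: GDDR6 bandwidth is roughly the per-pin data rate times the bus width divided by 8, so 17.5 Gbps works out to 560 GB/s, while 16 Gbps memory would only give 512 GB/s. A quick back-of-the-envelope sketch of that math (my own snippet; only the data rates and bus width come from the article and listing):

def gddr6_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    # per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte
    return data_rate_gbps * bus_width_bits / 8

print(gddr6_bandwidth_gbs(17.5, 256))  # 560.0 GB/s, matching the listed ~559.9 GB/s
print(gddr6_bandwidth_gbs(16.0, 256))  # 512.0 GB/s, what the 16 Gbps 8 GB cards get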
#6
MachineLearning
Solaris17ehhh yes and no. I think the "general" performance indicator for physical build has been the cooler. The thing is all of the current coolers can handle ARC without an issue. So all of the designs "seem meh" but the real answer is none of them are better than the other....and none of them really need to be.

You're talking maybe a few C on coolers that can easily keep the GPU 10+ C from its max at all times.
This is why I think ASRock's A750 Challenger cooler would be a great pairing with the A770 16GB.
elchapuzasinformatico.com/2023/01/asrock-intel-arc-a750-challenger-d-8gb-oc-review/
It appears to be very capable, easy to maintain and is often the cheapest model.
Thermal pads on the backplate, completely utilitarian design all around.
#7
lemonadesoda
The better the cooler, the quieter the cooler. TO COOL, OR NOT TO COOL, THAT IS THE QUESTION!

The noisy shouting is intentional
#8
Assimilator
So I guess the "limited edition" really wasn't all that limited after all. Doesn't really matter though; the A770 is an 8GB card at best, just like the 4060/Ti, so adding 8GB more is ultimately pointless.
#9
Chrispy_
Cool cool cool.

There was a minute when production of the 16GB A770 seemed to have dried up and prices skyrocketed briefly, just as all the crap 8GB models were arriving from AMD and Nvidia.

I'd buy an Arc these days, but not at scalper prices, and not with 8GB. 1440p raytraced titles are likely going to be better on Intel than AMD at this price point, and Nvidia's entry point for 1440p appears to be the $600 4070. Nvidia are effectively not competing.
#10
sLowEnd
KissamiesLooks pretty fine, also as Arc's performance has got better with newer drivers, that's not a bad card for that price.
$330 for performance in the ballpark of the 6600XT and RTX 3060 is not good. (With horrible idle power consumption and RTX3070 levels of power consumption under load too.) I wouldn't touch this thing at that price.
#11
Chrispy_
AssimilatorSo I guess the "limited edition" really wasn't all that limited after all. Doesn't really matter though; the A770 is an 8GB card at best, just like the 4060/Ti, so adding 8GB more is ultimately pointless.
The limited edition was the "made by Intel" aka founders edition.
#12
Assimilator
Chrispy_The limited edition was the "made by Intel" aka founders edition.
I know, but "founders edition" encapsulates that much better than "limited edition". Intel marketing really needs to think before they label things.
#13
Chrispy_
sLowEnd$330 for performance in the ballpark of the 6600XT and RTX 3060 is not good. (With horrible idle power consumption and RTX3070 levels of power consumption under load too.) I wouldn't touch this thing at that price.
It's not a particularly balanced card; Sometimes it's down with the RX 6600 and 3060, other times (rarely) it's outperforming a 3070 - but performance scales better at higher resolutions and settings and it raytraces like an Nvidia card rather than an AMD card.

It averages 3060Ti ballpark performance with current drivers (don't look at 2022 reviews because the drivers have come so far since then), and if Nvidia are happy to charge $499 for "about 3060Ti performance with 16GB", then $330 for the Intel card is at least competitive. I think I'd rather get a 6700XT for $330 but at least the A770 is a viable alternative (with caveats). More competition is good for us, even if you end up buying AMD or Nvidia instead.
#14
sLowEnd
Chrispy_It's not a particularly balanced card; Sometimes it's down with the RX 6600 and 3060, other times (rarely) it's outperforming a 3070 - but performance scales better at higher resolutions and settings and it raytraces like an Nvidia card rather than an AMD card.

It averages 3060Ti ballpark performance with current drivers (don't look at 2022 reviews because the drivers have come so far since then), and if Nvidia are happy to charge $499 for "about 3060Ti performance with 16GB", then $330 for the Intel card is at least competitive. I think I'd rather get a 6700XT for $330 but at least the A770 is a viable alternative (with caveats). More competition is good for us, even if you end up buying AMD or Nvidia instead.
Performance is still in the same ballpark I mentioned even just 2 weeks ago.
#15
Chrispy_
Like I said, it's not a particularly balanced card, and the caveats I mentioned are that older DX11 games are going to suck, and that even includes "DX12" engines like UE4 that are mostly DX11 with a few sprinkles of DX12. Realistically, UE4 is almost done now, with current games in development already on UE5.

You're not buying Arc unless you need it for DX12/Vulkan, and if you don't know about its DX9/11 emulation and the corresponding performance then you shouldn't be touching it with a ten-foot pole.

For new DX12 games with raytracing, like 3 of those 4 raytracing tests from AncientGameplays, he shows that it's dramatically faster than the 7600 or 3060, usually matching or surpassing a 3060Ti depending on the title.

In case it's not clear from the previous comments, Arc performance is not balanced. There are caveats.
#16
sLowEnd
Chrispy_Like I said, it's not a particularly balanced card, and the caveats I mentioned are that older DX11 games are going to suck, and that even includes "DX12" engines like UE4 that are mostly DX11 with a few sprinkles of DX12. Realistically, UE4 is almost done now, with current games in development already on UE5.

You're not buying Arc unless you need it for DX12/Vulkan, and if you don't know about its DX9/11 emulation and the corresponding performance then you shouldn't be touching it with a ten-foot pole.

For new DX12 games with raytracing, like 3 of those 4 raytracing tests from AncientGameplays, he shows that it's dramatically faster than the 7600 or 3060, usually matching or surpassing a 3060Ti depending on the title.

In case it's not clear from the previous comments, Arc performance is not balanced. There are caveats.
It's not typically around 3060 Ti performance like you claimed earlier. 3060 Ti/6700XT tier performance is markedly better than the A770's typical performance. This thing typically sits around RTX 3060 and 6600XT performance.

Chrispy_It's not a particularly balanced card; Sometimes it's down with the RX 6600 and 3060, other times (rarely) it's outperforming a 3070 - but performance scales better at higher resolutions and settings and it raytraces like an Nvidia card rather than an AMD card.

It averages 3060Ti ballpark performance with current drivers (don't look at 2022 reviews because the drivers have come so far since then), and if Nvidia are happy to charge $499 for "about 3060Ti performance with 16GB", then $330 for the Intel card is at least competitive. I think I'd rather get a 6700XT for $330 but at least the A770 is a viable alternative (with caveats). More competition is good for us, even if you end up buying AMD or Nvidia instead.
As far as the whole competition thing goes, if for whatever reason some person thinks a multi-billion dollar company needs their charity, Intel has a far better value product in the A750, which is available for $240.
www.newegg.com/sparkle-arc-a750-sa750c-8goc/p/N82E16814993002

The A770 is a poor value at $330.
#17
Chrispy_
sLowEndIt's not typically around 3060 Ti performance like you claimed earlier. 3060 Ti/6700XT tier performance is markedly better than the A770's typical performance. This thing typically sits around RTX 3060 and 6600XT performance.



As far as the whole competition thing goes, if for whatever reason some person thinks a multi-billion dollar company needs their charity, Intel has a far better value product in the A750, which is available for $240.
www.newegg.com/sparkle-arc-a750-sa750c-8goc/p/N82E16814993002

The A770 is a poor value at $330.
I mean you can cherry-pick your videos, sure. With such an inconsistent card you can basically find any result you want based on which games are used by any given reviewer.

Here's a review retesting recent drivers at 1080p and 1440p, more realistic resolutions for cards in the RX6600 to RTX3070 performance range...
www.techspot.com/review/2634-intel-arc-a770-a750-revisit

As I've already stated several times now, it's inconsistent and will sometimes be down fighting the vanilla RX6600 and sometimes outperforming a 3060Ti. Resolution seems to play a big part in how well Arc performs, too. How well it performs "on average" depends entirely on which games you're averaging. Most reviews with a mix of games do put it squarely in between a 6600XT and 3060Ti.

The 8GB A750 is compelling at $240, but for how much longer? I think $330 for a half-decent GPU with more than 8GB in today's market is decent value in the long run. Sure, 8GB competition may be more effective right now, but that's been the hot topic of 2023, right? 8GB isn't enough any more and certainly won't be enough for future titles.

If you want a cheaper card these days, get a 3060 12GB IMO. If you want a faster card, get a 6700XT. I'm finding it really hard to justify spending more than $250 on anything new with less than 12GB VRAM.
#18
sLowEnd
Chrispy_I mean you can cherry-pick your videos, sure. With such an inconsistent card you can basically find any result you want based on which games are used by any given reviewer.

Here's a review retesting recent drivers at 1080p and 1440p, more realistic resolutions for cards in the RX6600 to RTX3070 performance range...
www.techspot.com/review/2634-intel-arc-a770-a750-revisit

As I've already stated several times now, it's inconsistent and will sometimes be down fighting the vanilla RX6600 and sometimes outperforming a 3060Ti. Resolution seems to play a big part in how well Arc performs, too. How well it performs "on average" depends entirely on which games you're averaging. Most reviews with a mix of games do put it squarely in between a 6600XT and 3060Ti.

The 8GB A750 is compelling at $240, but for how much longer? I think $330 for a half-decent GPU with more than 8GB in today's market is decent value in the long run. Sure, 8GB competition may be more effective right now, but that's been the hot topic of 2023, right? 8GB isn't enough any more and certainly won't be enough for future titles.

If you want a cheaper card these days, get a 3060 12GB IMO. If you want a faster card, get a 6700XT. I'm finding it really hard to justify spending more than $250 on anything new with less than 12GB VRAM.
Okay, another one. It's well established that the 4060 is not as fast as a 3060Ti. Care to guess how the A770 fares against it?

Edit: The review you linked literally affirms that the A770 sits around the 3060 and 6600XT, with cards like the 3060Ti and 6700XT being a good deal faster. Even at 1440p, the performance is closer to the 3060 (+9FPS) and 6600XT (+7FPS) than it is to the 3060Ti (-13FPS).
Edit2: images and additional comment added
#19
Chrispy_
Yeah, the more reviews I look at, the more it looks like it's closer to a 3060 than a 3060Ti.
It's definitely better than the 3060 in every comparison you look at though, and the 3060Ti is going to suck in 2024 with 8GB VRAM, if you're not already looking at games and settings where it sucks today.
Like I said earlier, at $330 the 6700XT is my pick but competition is good for us and I genuinely don't want to give any company >$250 for an 8GB product ever again.
sLowEndAs far as the whole competition thing goes, if for whatever reason some person thinks a multi-billion dollar company needs their charity
I think you're getting the wrong end of the stick here. Wanting competition is not charity. Competition is good for consumers and BAD for manufacturers. What manufacturers want is NO competition so they can charge as much as they want, uncontested. Buying from Intel isn't charity, it's supporting competition. Don't buy any GPU unless you're happy with what you're getting. The companies don't give a damn about you, but by supporting competition you are contributing the tiniest little bit to better GPU pricing in the future. That shouldn't be any part of your purchase decision, it's just a statement of fact that supporting the underdogs will contribute to a competitive market that favours buyers over the manufacturers.
#20
sLowEnd
Chrispy_I think you're getting the wrong end of the stick here. Wanting competition is not charity. Competition is good for consumers and BAD for manufacturers. What manufacturers want is NO competition so they can charge as much as they want, uncontested. Buying from Intel isn't charity, it's supporting competition. Don't buy any GPU unless you're happy with what you're getting. The companies don't give a damn about you, but by supporting competition you are contributing the tiniest little bit to better GPU pricing in the future. That shouldn't be any part of your purchase decision, it's just a statement of fact that supporting the underdogs will contribute to a competitive market that favours buyers over the manufacturers.
Lmao no. No company goes into a market as mature as graphics cards with the idea that they're going to give up if their first gen doesn't sell like hotcakes. Intel, if they're serious about being a player in this part of the PC market, would have had plans to continue regardless of how their first gen does. It would be insanity to think that all the stars would align with a first-gen product, and I'm pretty sure Intel isn't insane. The first gen would be a learning experience. If it doesn't sell well, Intel ought to learn why it didn't sell well.
#21
Chrispy_
sLowEndLmao no. No company goes into a market as mature as graphics cards with the idea that they're going to give up if their first gen doesn't sell like hotcakes. Intel, if they're serious about being a player in this part of the PC market, would have had plans to continue regardless of how their first gen does. It would be insanity to think that all the stars would align with a first-gen product, and I'm pretty sure Intel isn't insane. The first gen would be a learning experience. If it doesn't sell well, Intel ought to learn why it didn't sell well.
You realise this is Intel, who quit the GPU market twice before - and who have also quit the following markets where they had competed successfully and with significant market share:

NAND, sold their client SSD business to Solidigm and quit
Optane, just quit outright - you can't buy it as a consumer any more.
Mobile data chips for LTE/4G/5G etc - sold to Mediatek and quit
NUC, Intel quit this week; edit - it seems like they're passing all of their patents and IP to ASUS according to yesterday's GN weekly update

Intel dGPUs are a side-effect of their CPUs needing to get proper graphics capabilities or fall behind, so they'll always have the technology to make a dGPU, but don't for one minute think Intel will keep hammering away at the consumer GPU market if they're not successful. Their track record suggests they have low staying power...