Friday, September 30th 2022

ICYMI: 16GB Arc A770 Priced Just $20 Higher than 8GB; A770 Starts Just $40 Higher Than A750
The Intel Arc 7-series performance-segment graphics cards announced earlier this week are all priced within $60 of each other. The series begins with the Arc A750 at $289. Another $40 gets you the Arc A770 Limited Edition 8 GB, at $329. The top-of-the-line Arc A770 Limited Edition 16 GB is priced just $20 higher, at $349. This puts the Intel flagship at least $30 below the cheapest NVIDIA GeForce RTX 3060 available in the market right now, which can be had for $380. The dark horse here is the AMD Radeon RX 6650 XT, which is going for as low as $320.
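To put the price gaps in one place, here is a quick sketch based only on the figures quoted above; the RTX 3060 and RX 6650 XT numbers are current street prices and will move:

```python
# Price ladder from the figures quoted in this post (USD).
prices = {
    "Arc A750": 289,
    "Arc A770 LE 8 GB": 329,
    "Arc A770 LE 16 GB": 349,
    "Radeon RX 6650 XT (street)": 320,
    "GeForce RTX 3060 (street)": 380,
}

flagship = prices["Arc A770 LE 16 GB"]
for card, price in sorted(prices.items(), key=lambda kv: kv[1]):
    delta = price - flagship
    print(f"{card:28} ${price}  ({delta:+d} vs. A770 16 GB)")
```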
Intel extensively compared the A770 to the RTX 3060 in its marketing materials, focusing on how its ray tracing performance is superior even to that of NVIDIA RTX in this segment, and on how the Intel XeSS performance enhancement is technologically on par with 2nd-generation super-scaling techs such as FSR 2.0 and DLSS 2. If Intel's performance claims hold, the A770 has the potential to beat both the RTX 3060 and the RX 6650 XT in its segment. The Arc A750, A770 8 GB, and A770 16 GB go on sale from October 12. Stay tuned for our reviews.
41 Comments on ICYMI: 16GB Arc A770 Priced Just $20 Higher than 8GB; A770 Starts Just $40 Higher Than A750
AdoredTV is a clown, and you only confirmed it; the only reason he exists is the cult following feeding his ad revenue ;)
AMD still hasn't got feature parity with Nvidia. You can twist that however you like, but 'cheaper and faster' is not the criterion that confirms anything; it's just one of many. AMD had an inferior product that they sold at a lower price. And inferior it has been for a long time, and on the feature front it really still is. Historically we also know that stuff needs to 'just work' for the vast majority; the niche that likes to tweak is just that, a niche, and Nvidia caters to the former group best. Still does.
Ryzen is the practical proof of what I'm saying. It took several generations of 'beating' the competition to catch Intel and actually gain market share. The platform needed to mature, the arch needed refinements. Yes, brand recognition exists and it's only logical; Rome isn't built in a day, and neither is trust that stuff just works. And how is AMD doing on that front today, wrt pricing? Saving us? It once again confirms that everything is more of the same: AMD prices its product to the maximum of what the market can/might bear. So why did they price lower on GPUs back in the day? Let's see.... 1+1= ? You're saying that isn't true because AMD 'was better and cheaper' at times. That is some real mental gymnastics!
Now don't get me wrong, I'm absolutely NOT pro-Nvidia. I've hated their guts since Turing and haven't liked a single thing they released since then, except DLSS, and even so I prefer AMD's agnostic FSR implementation to a much greater degree. So I'm 'with AMD' on their approach of less proprietary bullshit. I never once considered spending even a dime on Gsync, but I have a Freesync monitor on an Nvidia card. But we have to keep calling things what they are. Misplaced favoritism is just that, misplaced. RDNA2 is technically impressive, power efficient, a good architecture. But it misses features the competition does have. If they add those in RDNA3 and have reasonable price/perf to compete, it's an insta-buy. But at the same time, I stand by what I've said up here about Nvidia vs AMD and the battle they fight. AMD has been cheapskating it and it shows; the moment they got serious, they had competitive product again. Go figure...
As for Intel cutting into AMD's share, you might be right about that for all the reasons mentioned here. Intel similarly has an offering that isn't 'just working' everywhere, anytime, no matter what game you play. Ironically, DX11 performance takes a back seat just like it did for AMD. Feature parity isn't there because half the features don't even work. Trust is zero, or sub-zero. But let's take a long look at Nvidia too; that's definitely not all sunshine and rainbows lately either, most notably on power usage - but that is actually new, and not their usual MO. AMD has a real opportunity here to capture share, if RDNA3~5 are great.
I'm not looking to buy a GPU at the moment, but if I were, I'd be really tempted just to satisfy my curiosity. They'll probably perform exactly the same, and with the highest price being $350, there is not much room to price it right.
Most people will gladly pay those extra $20 for the 16 GB version, with or without any effect on performance, so once those get sold out people will buy the 8 GB version and still not feel bad, because it performs the same in real life. So actually it's priced perfectly. :D
It's definitely a good idea, but game devs have become so good at faking lighting that ray tracing doesn't have the same effect as other technologies did in the past, like DX9 with the realistic water in Far Cry and HL.
Plus it's not supported in all games. Same thing with DLSS: it works better, but it's not supported in all games.
All companies are the same; they are here to maximize their profit from any product.
Whoever has the fastest CPU/GPU has the right to charge more for the whole lineup. Not something I agree with, but the biggest part of the customer base will just read that AMD released the fastest CPU, which of course they can't afford, but in their mind that means that at any price range any AMD will be better than Intel. Same for the GPUs. I see what you did there. :D
Recently I bought my wife a woofer as a surprise, and the other day she surprised me (without her knowledge) with a Steam Deck for my upcoming birthday.
The secret is knowing how to put the right spin to it. :D
Maybe with some extra-mega-super-giga texture packs; other than that, on ultra or high textures you are fine with 6-8 GB in 2022. Of the two games I play ATM, neither uses above 8 GB of VRAM. So yes, it will be another 10 years before 8 GB of VRAM is a minimum system requirement.
Hopefully they will be successful and we can get more competition.
Look at 1080p, which has been around for a while now. 4 GB of VRAM was enough for games made in 2010 and it's still enough for games made in 2022.
Obviously higher resolutions will have a higher floor for VRAM use, and 4 GB is not enough VRAM for higher resolutions.
The floor won't change, but peak VRAM use never moved past 4 GB for 1080p, so why would it go higher for 1440p or 4K?
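For a rough sense of how much the resolution itself contributes, here is a back-of-the-envelope sketch, assuming 32-bit color and a hypothetical budget of six full-resolution render targets (real engines vary widely):

```python
# Back-of-the-envelope: per-frame buffer memory that scales with resolution.
# Assumes 4 bytes per pixel (32-bit color) and an assumed budget of six
# full-resolution render targets (G-buffer, depth, post-processing, etc.).
BYTES_PER_PIXEL = 4
RENDER_TARGETS = 6

for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    mib = w * h * BYTES_PER_PIXEL * RENDER_TARGETS / 2**20
    print(f"{name}: ~{mib:.0f} MiB in resolution-dependent buffers")

# Prints roughly 47 MiB (1080p), 84 MiB (1440p), 190 MiB (4K) - a small slice
# of a 4 GB card; textures and other assets, which don't scale with resolution,
# dominate VRAM use.
```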
What a missed opportunity by Nvidia, not releasing something new and competitive in this segment (under $400).
A $1500+ card is not gonna sell well, and crypto is not good ATM.
Performance is another issue. Scream fine wine all you want, people want the performance they paid for now, not 5 years from now.
You could also get into features like ShadowPlay, which took AMD years to copy, or NVENC encoding, which AMD still has no answer for.
Then all those who talk of "ngreedia mindshare" conveniently forget that Evergreen sold really well. The HD 5000s hit 49% market share. Then AMD left their product to rot while they dumped money into the failure that was Bulldozer, and got caught completely off guard when Nvidia actually fixed Fermi instead of re-releasing it. AMD would do the same thing with Hawaii (to waste cash on SeaMicro).
I could go on. (Hey remember how AMD launched Ryzen mobile then abandoned driver development to OEMs?)
AMD's biggest opponent is not Nvidia. It's AMD. Their rocky launches and the copium their community produces do not translate to sales. Strong launches with solid drivers and strong product software make sales.
- Shadowplay
- DLSS
- Hairworks (lol! more of a joke than anything else, but hey, AMD had to follow with TressFX)
- TXAA / temporal AA
- FXAA
- RT
- Gsync
- etc.
The gist of it is, Nvidia was exercising thought leadership and innovation in the gaming segment, and AMD was not. They are now building on that, and I hope they keep the momentum. But Nvidia hasn't stopped trying to lead. And these are no small things - many of the pushes Nvidia initiated have been pretty neat ones that took gaming graphics further.
"Stability" my ass, the past 2 years I've had more issues with Geforces than AMDs. nVidias HDMI 2.1 implementation on TVs is a joke.
And as someone who has had pretty much one Radeon of every gen (plus GeForces) since the X000 series (before the HD 2000 series), I haven't had issues gaming with Radeons since AMD bought ATI. And yes, fine wine does exist, while I've experienced the contrary with Nvidia (old cards getting lower performance with newer drivers after some years).
All features you mentioned only affect streamers, not vanilla gamers.
"I could go on" you can't, there's no killer feature that makes Geforces better for gamers. There was 1, DLSS which is great, but AMD and FSR 2.0 open source just killed it like GSync was killed by VRR.
The guy's post on AdoredTV is correct. I have lots of friends who won't even consider an AMD card for their builds; they won't even check benchmarks. And they are IT guys, not stupid 12-year-olds.
Ryzen took 2-3 generations to start getting mindshare, and it was helped by the (justified) hatred people have toward Intel for their shady practices and forever-4-core CPUs. Nvidia doesn't sit on their hands like that, so there's no vacuum for AMD or Intel to fill.
All in all, to me this is actually advantageous. People flock to buy Nvidia, so I can usually get AMDs for a lower price. But these two are neck and neck on performance and driver quality, and anyone making drastic claims about that is just being a fanboy.