Friday, July 15th 2022

Intel Previews Arc A750 Graphics Card Performance

Intel has decided to share some more details on its upcoming Arc A750 graphics card, the very same one that appeared briefly in a Gamers Nexus video just the other day. The exact product being previewed is the Intel Arc A750 Limited Edition graphics card, but the company didn't reveal any specifications of the card in the video it posted. What is revealed is that the card will outperform an NVIDIA GeForce RTX 3060 at 1440p in the five titles for which Intel provided performance indications and average frame rates. The five games are F1 2021, Cyberpunk 2077, Control, Borderlands 3 and Fortnite; in other words, mostly quite demanding games, with F1 2021 and Fortnite being the exceptions.

The only game whose actual performance we get any real insight into in the video is Cyberpunk 2077, where Ryan Shrout details the game settings and the actual frame rate. At 2560 x 1440, using high settings, the Arc A750 delivers 60.79 FPS on average, with a low of 50.54 FPS and a max of 77.92 FPS. Intel claims this is 1.17 times the performance of an EVGA GeForce RTX 3060 XC Gaming 12G graphics card. At least Intel didn't try to pull a fast one: the company provided average frame rates for all the other games tested as well, not just how many times faster the Intel card was, and you can see those results below. The test system consisted of an Intel Core i9-12900K fitted to an ASUS ROG Maximus Z690 Hero board, 32 GB of 4800 MHz DDR5 memory and a Corsair MP600 Pro XT 4 TB NVMe SSD, running Windows 11 Pro. According to the video, the Arc graphics cards should launch "this summer" and Intel will be releasing more details between now and the launch.
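As a quick sanity check, the RTX 3060 figure implied by Intel's claim can be backed out of the stated A750 average and the 1.17x multiplier. A minimal sketch in Python (our own arithmetic, not a number Intel published directly):

```python
# Derive the implied RTX 3060 average from Intel's stated A750 result
# and its claimed 1.17x relative-performance figure (our calculation,
# not a number shown in Intel's video).

a750_avg_fps = 60.79    # Intel's A750 average, Cyberpunk 2077, 1440p high
relative_perf = 1.17    # claimed A750 vs. RTX 3060 ratio

rtx3060_implied = a750_avg_fps / relative_perf
print(f"Implied RTX 3060 average: {rtx3060_implied:.2f} FPS")  # ~51.96 FPS
```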
Source: Intel

99 Comments on Intel Previews Arc A750 Graphics Card Performance

#54
PapaTaipei
Probably around 450 euros to be a beta tester
#55
efikkan
oxrufiioxoI think people have a hard time remembering because, prior to RDNA2, AMD struggled to compete in the high end post-2013... They made a bad bet on HBM, crippling the Fury X, which might have ended up a decent product had it shipped with 8GB, and after that they gave up trying to compete in the high end till RDNA2 shipped.
4 GB of VRAM wasn't the biggest problem at the time; Nvidia's counterparts, the 980 and 980 Ti, had 4/6 GB. AMD's biggest problem was availability: these were virtually nowhere to be found for a long time. Heat and power management were also problems on these cards.
Count von Schwalbe2 points nobody has mentioned - if these fail hard, they are going to be worth $$$ some years in the future.
Only if Intel has already stopped production of the chips and only a few hundred thousand will ever be made, but I suspect it's too late for that.
What's more likely is that these will be sold in volume to East Asian and OEM markets for a lower price.
Count von SchwalbeAlso, concerns about driver quality are a double-edged sword. Sure, they might not be great, but that also means that the performance is only going to increase from here.
You fall into the trap of thinking that driver optimizations can compensate for any shortcoming, but they can't.
The underlying problem for this architecture is clearly showcased when synthetic benchmarks scale much better than real games; that tells us the problem is scheduling inside the GPU, which is not done by the driver at all. This happens because synthetics are typically designed to be computationally intensive, while games are designed to render efficiently.

Intel's problem is essentially very similar to the problem AMD has had for years, just a fair bit worse. If you remember back when the RX 400/500 series were competing with Pascal, AMD typically needed 40-50% more FLOPS, more memory bandwidth, etc. to produce the same performance as Nvidia. This led many AMD fans to claim that these GPUs had a huge potential to unlock with driver optimizations, thinking there would be 30-40+% more to gain. Even some reviewers claimed that these cards would perform better over time. So did the RX 400/500 ever get that ~30-40% performance gain over Nvidia's counterparts? No, because the gap was never caused by driver overhead.
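To put rough numbers on that era (approximate public peak FP32 figures, ballpark only, not measurements):

```python
# Two cards from the RX 500 vs. Pascal era with broadly similar average
# game performance, compared by peak FP32 throughput (approximate
# boost-clock TFLOPS, used here only to illustrate the ratio).

rx_580_tflops = 6.2     # Radeon RX 580, approx. peak FP32
gtx_1060_tflops = 4.4   # GeForce GTX 1060 6GB, approx. peak FP32

extra = rx_580_tflops / gtx_1060_tflops - 1
print(f"AMD shipped ~{extra:.0%} more FLOPS for similar FPS")  # ~41%
```

That gap never closed through drivers, because it wasn't a driver problem.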

Intel can certainly fix their driver bugs, and perhaps squeeze out a few percent here and there if they find some overhead, but the overall performance level compared to competing products will remain roughly the same. Think about it: going from RTX 3060 performance to RTX 3060 Ti is about ~30%, and to RTX 3070 ~45%; there is no way removal of driver overhead is going to yield that amount of performance gain. And never mind competing with Nvidia's high-end products, which have roughly twice the performance.
Lastly, there is no guarantee that performance is going to improve in all cases. There is a very real chance that some cases will get slightly worse with bugfixes, especially if they end up doing quick fixes to eliminate bugs.
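A toy model of why driver overhead can't bridge such gaps (my own sketch, assuming the slower of CPU-side submission and GPU-side rendering sets the frame time):

```python
# Toy frame-time model: per frame the CPU (including the driver) prepares
# work while the GPU renders; whichever side is slower sets the frame rate.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

gpu_ms = 16.0                        # GPU-bound at ~62 FPS
cpu_before, cpu_after = 10.0, 7.0    # driver overhead cut by 30%

print(fps(cpu_before, gpu_ms))  # 62.5 FPS
print(fps(cpu_after, gpu_ms))   # still 62.5 FPS: nothing gained while GPU-bound
```

In a GPU-bound scenario, trimming driver overhead changes nothing; only better scheduling inside the GPU itself would move the needle.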
#56
DemonicRyzen666
Solaris17Yeah exactly. I will buy one, because this is TechPowerUp, an enthusiast forum. I am curious about technology and how this will age, and about the "age" of Intel coming in as a dGPU competitor in general, not just whether their dGPUs will compete. Lots of cynics in the thread. Maybe I'm too much of a dreamer chasing my sense of adventure; my life was way more fun when I didn't treat hardware purchases like betting on dogs at a race track. Can't wait to "throw my money away" on the experience.
I don't agree with TechPowerUp being an enthusiast forum; it's been more gamer-focused for a while.

The big hole in the low-end graphics market is a nice spot for Intel to aim for, but they're going to be edging into the mid-range on a first try.
AMD's Bulldozer was fun to mess with too, to see what you could do with it and what it couldn't do.
#57
evernessince
Bomby569I'm not as pessimistic as you guys; AMD took ages to get a competing GPU past the mid-range. Intel did a good job on a first try; now bring pricing down and improve drivers. Don't give up and leave us in the hands of the other 2 greedy bastards.
If you ignore the 35 years during which AMD did have competitive high-end GPUs.

After the 7000 series, AMD moved focus from high-end GPUs to heterogeneous computing. It wasn't until Lisa Su that AMD moved focus back to high-end GPUs. To say 'look how long it took AMD to become competitive again' is misleading, because it ignores the fact that AMD intentionally moved its focus away from high-end GPUs for a while, in addition to almost going bankrupt.

Intel doesn't have either of those two factors in play. In reality, when Lisa Su did come in, RDNA 1 released relatively shortly after, proving that when AMD does set out to create a high-end GPU, it can. RDNA 1 was competitive with Turing on an architectural level; it didn't release into the high end due to a lack of resources on AMD's part. Again, not a restriction Intel has.

Intel's GPU engineers are former AMD and Nvidia employees. With Intel's resources at their disposal, they have all that's needed to make a good GPU uArch on the first generation. Let's not make excuses for Intel.
#58
Bomby569
evernessinceIf you ignore the 35 years during which AMD did have competitive high-end GPUs.

After the 7000 series, AMD moved focus from high-end GPUs to heterogeneous computing. It wasn't until Lisa Su that AMD moved focus back to high-end GPUs. To say 'look how long it took AMD to become competitive again' is misleading, because it ignores the fact that AMD intentionally moved its focus away from high-end GPUs for a while, in addition to almost going bankrupt.

Intel doesn't have either of those two factors in play. In reality, when Lisa Su did come in, RDNA 1 released relatively shortly after, proving that when AMD does set out to create a high-end GPU, it can. RDNA 1 was competitive with Turing on an architectural level; it didn't release into the high end due to a lack of resources on AMD's part. Again, not a restriction Intel has.

Intel's GPU engineers are former AMD and Nvidia employees. With Intel's resources at their disposal, they have all that's needed to make a good GPU uArch on the first generation. Let's not make excuses for Intel.
So exactly how long did AMD not have a competing GPU in the high end? In years? Considering the pace of tech, I would definitely say it was ages. Things were so bad, the future so dark, that they almost went bankrupt.

"intentionally", give me a break. They intentionally made Vega suck after hyping it
#59
ravenhold
Based on the architecture, how many ray tracing Xe cores will the A750 have compared to Nvidia? What is the difference in RT cores between Intel and Nvidia?
#60
aQi
Premature drivers and such claims mean we can still expect better. Speaking rationally, Arc graphics still needs at least an 11th-gen platform and Resizable BAR to work sweetly. What about other systems?
People still have some kick-ass rigs out there.
#61
Vayra86
Poor Ryan Shrout, he looks like there is a firing squad ready to pull the trigger behind the camera. Or like one of those hostage videos... did I see him signalling 'Save me!' with his hands just now?
#62
TheoneandonlyMrK
I might buy one if the price is right: not too much money, and not too little performance.
#63
Jism
evernessinceIntel's GPU engineers are former AMD and Nvidia employees. With Intel's resources at their disposal they have all that's needed to make a good GPU uArch 1st generation. Let's not make excuses for Intel.
But they aren't making GPUs specifically for gaming. They are designing compute-based tiles or GPUs, whatever you want to call them, and spinning off the ones that didn't meet quality guidelines for compute as graphics cards.

What you get is a similar approach to AMD's Vega: good at compute, not so good at rendering graphics. The RX 400/500 series is the same. It's Raja all over again.

The problem is they need 50% more power (as of right now) in order to compete with AMD or Nvidia cards.

Nvidia and AMD, on the other hand, are using fully graphics-focused chips designed for gaming. That explains why both are so efficient compared to Intel. For some generations, Intel won't be a threat. But they are marketing it very cleverly, using esports and whatnot.

Obviously, from a marketing standpoint, creating GPUs for compute and reselling the ones that didn't meet quality guidelines as graphics cards is a good thing to do; nothing gets lost. But when I saw the 8-pin header, and it can't even compete with a lower-end AMD graphics card, I'm like: stop it already. Just release it, stop the hype, stop picking titles that favor Intel graphics. Aim for the 2nd generation, and perhaps then it can make a change.
#64
cellar door
ravenholdWhen they say it's better than a 3060, well, it would be great to have that with ray tracing on. If they polish the drivers for ray tracing by launch.
I'm really not sure why people are even mentioning RT - it's not like the 3060 can do any meaningful RT... oh wait - there are a couple of titles, and it has to be at low RT settings and 1080p with DLSS on.

RT is pointless to bring up on a card with this performance level.
#65
ModEl4
The A750 LE will probably be 28 Xe cores, not 24 as the rumours were suggesting.
The performance that Intel showed was indicative of that.
Given the new info, the lineup seems something like the below to me:
A770 16GB 32Xe $399-379 ($399?)
A770 8GB 32Xe $359-329 ($349?)
A750 8GB 24-28Xe $319-279 ($299?)
A580 8GB 16-24Xe $249-229 ($249?)
A380 6GB 8Xe $139-129 ($129?)
A310 4GB 4-6Xe $109-99 ($99?)

#66
NDown
Dunno why people are bashing this; marketing aside, it's a good thing another company has joined the competition.
#67
ZoneDymo
NDownDunno why people are bashing this; marketing aside, it's a good thing another company has joined the competition.
Read the (my) first comment?
#68
ExcuseMeWtf
NDownDunno why people are bashing this; marketing aside, it's a good thing another company has joined the competition.
Not really bashing, more like being apprehensive about Intel's driver quality myself.
If it's not good then they won't be much of a competitor.
#70
ModEl4
Although fine in current market conditions, at this pricing level it will be decimated by next-gen offerings.
Navi 33 will be at least 55% faster at 1080p vs. the A770.
#71
ixi
ModEl4Although fine in current market conditions, at this pricing level it will be decimated by next-gen offerings.
Navi 33 will be at least 55% faster at 1080p vs. the A770.
A new player with lower raw performance and newbie drivers should cost much less than this.

Wondering whether anyone will even buy Intel :D
#72
ModEl4
ixiA new player with lower raw performance and newbie drivers should cost much less than this.

Wondering whether anyone will even buy Intel :D
For anyone who wants to play newer titles, it will be just fine.
Regarding pricing, when Navi 33 launches, Intel can just adjust pricing based on AMD's pricing and market conditions.
Regarding QHD performance in newer titles, the A770 will on average be faster than the 6650 XT, and the 6650 XT has a $399 SRP and a ≥$349 street price, so the A770 8GB at $349 will be just fine for the time being!
The VGA market needs a third player imo, so it will be great for all of us if Intel succeeds.
Probably by Q4 2024 we will have the next Nvidia/AMD architectures, and Intel's response (Celestial) will be much closer (although it's early, an easy napkin prediction is that Intel's top SKU will be at least 2X faster at 4K relative to Navi 33!)
#73
ixi
ModEl4For anyone who wants to play newer titles, it will be just fine.
Regarding pricing, when Navi 33 launches, Intel can just adjust pricing based on AMD's pricing and market conditions.
Regarding QHD performance in newer titles, the A770 will on average be faster than the 6650 XT, and the 6650 XT has a $399 SRP and a ≥$349 street price, so the A770 8GB at $349 will be just fine for the time being!
The VGA market needs a third player imo, so it will be great for all of us if Intel succeeds.
Probably by Q4 2024 we will have the next Nvidia/AMD architectures, and Intel's response (Celestial) will be much closer (although it's early, an easy napkin prediction is that Intel's top SKU will be at least 2X faster at 4K relative to Navi 33!)
I do agree about a third player, but I'm not expecting that performance from Intel in the dGPU market. All these years they couldn't compete in the iGPU segment either... €350 for 6650-level performance after how many months...?

Should we be happy for Intel that in 2022 they MAYBE will release a GPU powerful enough to run the latest titles at the highest settings and over 144 FPS at FHD? Doubt that.

Personally, I'll be interested in an Intel GPU when its price is 30% lower than AMD's or Nvidia's at the same performance level.

Does Intel even provide an alternative to Nvidia's or AMD's software, like image or video capture and all the other stuff that comes with GeForce Experience or Adrenalin?

But one thing is for sure: Intel's first dGPU lineup will cost a small fortune for collectors in the future, even if they deliver crap products.
#74
Stephen.
Solaris17Yeah exactly. I will buy one, because this is TechPowerUp, an enthusiast forum. I am curious about technology and how this will age, and about the "age" of Intel coming in as a dGPU competitor in general, not just whether their dGPUs will compete. Lots of cynics in the thread. Maybe I'm too much of a dreamer chasing my sense of adventure; my life was way more fun when I didn't treat hardware purchases like betting on dogs at a race track. Can't wait to "throw my money away" on the experience.
Couldn't have said it better...
#75
ModEl4
ixiI do agree about a third player, but I'm not expecting that performance from Intel in the dGPU market. All these years they couldn't compete in the iGPU segment either... €350 for 6650-level performance after how many months...?
The A770 8GB will be faster at QHD vs. the 6650 XT in newer titles, and it won't be €349 (if the price is between $349-329, probably around €399-369?)
The RX 6650 XT dropped to €399 at Mindfactory two days ago, probably in order to match the recent RTX 3060 drop to €399.
ixiPersonally, I'll be interested in an Intel GPU when its price is 30% lower than AMD's or Nvidia's at the same performance level.
Then clearly it's not for you.
There is no reason for Intel to discount a product so much when it is at the same performance level as the competition.
All the previews we saw from China, and the new ones from GamersNexus etc., apparently didn't show major software bugs in the games tested; that's a good sign, and I expected a worse situation, to tell you the truth.
The performance level can easily be balanced with the right price, and you will find a lot of people with much greater tolerance than your -30%.