Friday, July 15th 2022

Intel Previews Arc A750 Graphics Card Performance

Intel has decided to share some more details on its upcoming Arc A750 graphics card, the same one that appeared briefly in a Gamers Nexus video just the other day. The exact product being previewed is the Intel Arc A750 Limited Edition graphics card, though the company didn't reveal any specifications of the card in the video it posted. What is revealed is that the card will outperform an NVIDIA GeForce RTX 3060 at 1440p in the five titles for which Intel provided performance indications and average frame rates. The five games are F1 2021, Cyberpunk 2077, Control, Borderlands 3 and Fortnite, so in other words, mostly quite demanding games, with F1 2021 and Fortnite being the exceptions.

The only game whose actual performance we get any real insight into in the video is Cyberpunk 2077, where Ryan Shrout details the game settings and the actual frame rate. At 2560 x 1440 using high settings, the Arc A750 delivers 60.79 FPS on average, with a low of 50.54 FPS and a max of 77.92 FPS. Intel claims this is 1.17 times the performance of an EVGA GeForce RTX 3060 XC Gaming 12G graphics card. At least Intel didn't try to pull a fast one: the company provided average frame rates for all the other games tested as well, not just how many times faster the Intel card was, and you can see those results below. The test system consisted of an Intel Core i9-12900K fitted to an ASUS ROG Maximus Z690 Hero board, 32 GB of 4800 MHz DDR5 memory and a Corsair MP600 Pro XT 4 TB NVMe SSD, running Windows 11 Pro. According to the video, the Arc graphics cards should launch "this summer" and Intel will be releasing more details between now and the launch.
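A quick sanity check of the Cyberpunk 2077 numbers quoted above: if the A750 averages 60.79 FPS and that is claimed to be 1.17x an RTX 3060, the implied 3060 average follows by simple division. This sketch only reproduces that arithmetic; it is not Intel's benchmarking methodology.

```python
# Figures taken from Intel's video as reported above.
arc_a750_avg_fps = 60.79   # Arc A750, Cyberpunk 2077, 1440p high
claimed_ratio = 1.17       # Intel's claimed advantage over the RTX 3060

# Implied RTX 3060 average under that claim
rtx_3060_avg_fps = arc_a750_avg_fps / claimed_ratio
print(f"Implied RTX 3060 average: {rtx_3060_avg_fps:.2f} FPS")  # ~51.96 FPS
```

This puts the implied 3060 result just under 52 FPS, which is plausible for that card at 1440p high settings in this title.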
Source: Intel

99 Comments on Intel Previews Arc A750 Graphics Card Performance

#26
Garrus
The problem is the A750 needs to be $300 at most but they won't announce the price...
Posted on Reply
#27
Assimilator
64KAt this point I would just like them to be competent enough to release the Arcs to the rest of the world besides China. The rumor is that the release for the USA is delayed until Fall. Not sure about the rest of the world. Not that I would buy one anyway but there does seem to be some interest in them.
There's also interest in horrific train crashes.
Posted on Reply
#28
eidairaman1
The Exiled Airman
MentalAcetylide60The competition vs. AMD & Nvidia will be welcomed and help with pricing a bit, however, I don't think Intel will come close enough just yet to have much of an effect. Maybe 5 or so years from now, but not in the present or near future.
AMD's pricing has come down, so I mean $300 ain't bad for a 6600/XT
Posted on Reply
#29
Testsubject01
Solaris17If the GN video was any indication, Tom (the engineer) knows this and basically said as much.

Also, some refreshing absence of PR talk regarding performance targets and expectations regarding the Driver.
Intel, at least to some point, got the price memo and set a revised $129-$139 MSRP for the A380 instead of $150 at their initial release in China.
Posted on Reply
#30
efikkan
Bomby569I'm not as pessimistic as you guys; AMD took ages to get a competing GPU past the mid-range. Intel did a good job at first try; now bring pricing down and improve drivers.
Up to the Radeon 300 series, AMD(/ATI) was participating in the high-end segment. It was just during the RX 400/500/5000 series they only had products in the lower mid-range and below.
Posted on Reply
#31
Solaris17
Super Dainty Moderator
Testsubject01

Also, some refreshing absence of PR talk regarding performance targets and expectations regarding the Driver.
Intel, at least to some point, got the price memo and set a revised $129-$139 MSRP for the A380 instead of $150 at their initial release in China.
Yeah exactly. I will buy one, because this is TechPowerUp, an enthusiast forum. I am curious about technology and how this will age, and about the era in general with Intel coming in as a dGPU competitor, not just whether their dGPUs will compete. Lots of cynics in the thread. Maybe I'm too much of a dreamer chasing my sense of adventure; my life was way more fun when I didn't treat hardware purchases like betting on dogs at a race track. Can't wait to "throw my money away" on the experience.
Posted on Reply
#32
ModEl4
Testsubject01

Also, some refreshing absence of PR talk regarding performance targets and expectations regarding the Driver.
Intel, at least to some point, got the price memo and set a revised $129-$139 MSRP for the A380 instead of $150 at their initial release in China.
The initial release in China was $129 + VAT; I posted about it back then when we saw the +25% performance/$ vs. RX 6400 figures.
Posted on Reply
#33
trsttte
GarrusThe problem is the A750 needs to be $300 at most but they won't announce the price...
Isn't the 3060 12 GB, with which it's competing, $400? A $100 discount at launch seems a bit much; maybe $50, because you're kind of beta testing after all
Posted on Reply
#34
ModEl4
The performance the A750 Limited Edition is achieving in those titles is way higher than the difference vs. the Gunnir Photon A380 should allow.
Either in the original GamersNexus review the Gunnir wasn't set to run at 2450 MHz (the official turbo spec of that model at 92 W TBP) but at less (2 GHz perhaps, to be a better representative of all the A380 models? 2 GHz is the official reference turbo frequency if the A380 is powered only through the PCI Express slot's 75 W), or the A750 LE isn't a 24-Xe design but 28-Xe (probably the former, meaning a 2 GHz turbo in the GamersNexus preview).
I can't find any other explanation for why the numbers are so high vs. what the A380 can do.
Edit: I just watched the GamersNexus A750 video, and at 14:16 Tom Petersen, pointing to the A380 model, says it may be 75 W and thus 2 GHz. If this is the case (Tom Petersen didn't seem too sure about it) it's strange, as in the previous video (the A380 preview) Steve Burke didn't mention anything about it!
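The speculation above can be framed as a back-of-the-envelope throughput estimate: scale by Xe-core count times clock. The Xe-core counts and clocks below are the figures the comment debates (8 Xe for the A380, 24 or 28 Xe for the A750), not confirmed specifications.

```python
# Rough relative-throughput model: cores x clock. Purely illustrative;
# real scaling also depends on memory bandwidth, drivers, etc.
def relative_throughput(xe_cores: int, clock_ghz: float) -> float:
    return xe_cores * clock_ghz

a380_at_2ghz = relative_throughput(8, 2.0)    # A380 limited to the 75 W slot spec
a380_at_245  = relative_throughput(8, 2.45)   # A380 at its 92 W TBP turbo
a750_24xe    = relative_throughput(24, 2.45)  # hypothetical 24-Xe A750

print(f"A750 (24 Xe) vs A380 @ 2.0 GHz:  {a750_24xe / a380_at_2ghz:.2f}x")
print(f"A750 (24 Xe) vs A380 @ 2.45 GHz: {a750_24xe / a380_at_245:.2f}x")
```

Under this crude model, whether the A380 ran at 2.0 GHz or 2.45 GHz changes the expected A750 gap from roughly 3.7x to 3.0x, which is the crux of the comment's argument.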
Posted on Reply
#35
R0H1T
trsttteIsn't the 3060 12 GB, with which it's competing, $400? A $100 discount at launch seems a bit much; maybe $50, because you're kind of beta testing after all
The 3060 is overpriced right now; besides, Nvidia being the market leader commands a premium, not unlike Intel did only 3 years back in the CPU space!
Posted on Reply
#36
Denver
efikkanThe games are not optimized for Intel, these are just the games which happen to perform better.


Considering that Nvidia and AMD are about to release their new generations in the next few months, a price of ~$200 would be decent, if the TDP is also good?


It was surely a good card, even though some claimed Nvidia launched it to make AMD look bad.
Back when the 1080 Ti was about to launch, I was planning to buy one, and since I had a GPU fail I bought a 1060 as a temporary solution. Then prices and availability went crazy, and I'm still rocking my 1060 :p
I dare say the launch prices of the new generation will jump so much that they won't affect the competitiveness of Intel's products, or even the current lineups of both AMD and Nvidia. lol
Posted on Reply
#38
Dristun
Might need to jump through a couple of hoops to get one in my region, but now I'm interested again. It means the A770 might just land around the 3070 too! Yeah, in most games it's gonna blow, but hey, I like a good underdog story, even if the underdog is somehow Intel.
Posted on Reply
#39
Panther_Seraphin
What I found interesting from a lot of the graphs was the strong 0.1% and 1% outliers that were there. So if there are driver optimisations to be had, I can see them being similar to AMD's RX 6xxx series, which got free performance just through driver maturity.

How much they will get is the big question, but hopefully Intel will see how their 1st-generation cards are utilised and then optimise the architecture, perhaps in Celestial.
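For readers unfamiliar with the 0.1% and 1% figures mentioned above: they are "lows" computed from per-frame render times, typically the average frame rate over the worst slice of frames. A minimal sketch, using made-up frame-time data:

```python
# Sketch of how 1% / 0.1% lows are commonly derived from frame times.
# The frame-time data below is invented for illustration only.
def percentile_low_fps(frame_times_ms, pct):
    """Average FPS over the worst `pct` fraction of frames (longest frame times)."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * pct))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Mostly smooth 60 FPS (16.7 ms) with a few 30 FPS (33.3 ms) stutters
frames = [16.7] * 990 + [33.3] * 10
print(f"1% low: {percentile_low_fps(frames, 0.01):.1f} FPS")  # dominated by the stutters
```

The point of the comment stands out in such a sketch: a handful of slow frames barely moves the average but drags the 1%/0.1% lows down hard, which is exactly where driver maturity tends to help.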
Posted on Reply
#40
trsttte
I'm waiting to see some tests on Linux with DXVK; it might help a lot with the driver woes in older APIs.
Posted on Reply
#41
Verpal
Performance results shown here are from a small subset of the games that work very well with Intel® Arc™ and the Alchemist architecture. I'm not asserting that ALL GAMES will show these results, but it's a view of what Intel Arc A-series cards are capable of with the right software and engineering enablement.
I think that's the main takeaway from this story, a decently honest claim from Intel.
Posted on Reply
#42
eidairaman1
The Exiled Airman
ExcuseMeWtfDriver quality is the question here.
Just like the i740 and their iGPUs
Posted on Reply
#43
mechtech
So, about the same ballpark as the 6600 XT. It should be around the same price then, too.
Posted on Reply
#44
bonehead123
Your fault, my fault, nobody's fault...

it won't matta...

It's STILL gonna be DOA if the price (and drivers) aint right - Big Jake McCandles (aka John Wayne) sorta :)
Posted on Reply
#45
oxrufiioxo
efikkanUp to the Radeon 300 series, AMD(/ATI) was participating in the high-end segment. It was just during the RX 400/500/5000 series they only had products in the lower mid-range and below.
I think people have a hard time remembering because, prior to RDNA2, AMD struggled to compete in the high end post-2013... They made a bad bet on HBM, crippling the Fury X, which might have ended up a decent product had it shipped with 8 GB, and after that they gave up trying to compete in the high end till RDNA2 shipped.
Posted on Reply
#46
phanbuey
Right, how many years did the RX 480 last?
Posted on Reply
#47
tussinman
trsttteIsn't the 3060 12gb with which it's competing 400$? 100$ discount at launch seems a bit much, maybe 50$ because you're kind of beta testing after all
Mid-range Nvidia cards are at a standstill right now; they're still asking spring crypto prices, and as a result products aren't selling (the 3050, 3060 and 3060 Ti are all collecting dust at my local brick-and-mortar store).

A 3060 is not a $400 card; it's basically an RTX 2070 from 2018 (nobody is buying them right now at the inflated price).

A better comparison would be the RX 6600 (which is almost identical to a 3060), currently selling for $300, while the next tier up (the 6600 XT) can be had for as low as $340.
Posted on Reply
#48
Count von Schwalbe
Nocturnus Moderatus
Two points nobody has mentioned: if these fail hard, they are going to be worth $$$ some years in the future (the DG1 most likely will anyway). Also, concerns about driver quality are a double-edged sword. Sure, they might not be great, but that also means the performance is only going to increase from here.
Posted on Reply
#49
AusWolf
AssimilatorBecause of course their A770/A780 is going to be an RTX 3090 Ti killer LMAO.

These GPUs are trash, no two ways about it. I'm gonna enjoy laughing at the people posting "I bought an Intel GPU but it runs slow in <random game> REEEEEE". Then in a few more years, another round of pointing and laughing at the "Has Intel abandoned Arc, why are there no driver updates" posts.
Because killing the 3090 Ti is the only way to launch a product line. Do people only buy $1500+ GPUs? Come on!
Posted on Reply
#50
Sabotaged_Enigma
So much more power is needed to compete against the 6600 XT, lol. Conclusion: the 6600 XT is the best mainstream model.
Btw, it seems like the A780 is gonna perform so well.
Btw x2, what about the Intel graphics driver? It sucks, bro; work harder and stop cheating in 3DMark, lolllllll
Posted on Reply