Tuesday, June 7th 2022

Intel Arc A730M Tested in Games, Gaming Performance Differs from Synthetic

The Intel Arc A730M "Alchemist" discrete GPU made headlines yesterday, when a notebook featuring it achieved a 3DMark Time Spy score of 10,138 points, which would put its performance in the same league as the NVIDIA GeForce RTX 3070 Laptop GPU. The same source has since taken the time to play some games, and has come up with performance numbers that put the A730M a category below the RTX 3070 Laptop GPU.

The set of games tested is rather small (F1 2020, Metro Exodus, and Assassin's Creed Odyssey), but all three are fairly "mature" games that have been around for a while. The A730M scores 70 FPS at 1080p and 55 FPS at 1440p in Metro Exodus. In F1 2020, we're shown 123 FPS (average) at 1080p and 95 FPS average at 1440p. In Assassin's Creed Odyssey, the A730M yields 38 FPS at 1080p and 32 FPS at 1440p. These numbers roughly translate to the A730M being slightly faster than the desktop GeForce RTX 3050 and slower than the desktop RTX 3060, or in the league of the RTX 3060 Laptop GPU. Intel is already handing out stable versions of its Arc Alchemist graphics drivers, and the three are fairly old games, so this might not be a case of bad optimization.
Sources: Golden Pig Update (Weibo), VideoCardz

66 Comments on Intel Arc A730M Tested in Games, Gaming Performance Differs from Synthetic

#51
Mussels
Freshwater Moderator
I don't think anyone is shocked that Intel would be better in synthetics than in real games

Hopefully they can work some magic in drivers, but the game engines were simply designed for NVIDIA or AMD; nothing's optimised for their graphics cards
#52
ModEl4
If you check the 3DMark-vs-games performance placement difference relative to the competition, it's around -30% to -35% (in other words, Arc achieves 1.4x-1.5x its gaming performance level in synthetics).
So if the source behind the leakers' original 3070 Ti story was based on 3DMark numbers, then at games the top Arc would be at 3070 Ti minus 30-35%, which gives RX 6650 XT to RX 6600 XT levels of performance.
Now imagine Intel matching the RX 6650 XT at $399 for the 8 GB version, since it offers the same performance plus matrix A.I. cores plus a better media engine, and then +$10/GB for the 16 GB version at $479; after all, memory always sells products, right?
And at a later date, when AMD launches RDNA3, Intel lowers the prices (less embarrassing, imo, than the RX 5700/5700 XT $449-$399-$349 saga)
Now that would be something!
Just joking!!!
#53
Vayra86
ModEl4: If you check the 3DMark-vs-games performance placement difference relative to the competition, it's around -30% to -35% (in other words, Arc achieves 1.4x-1.5x its gaming performance level in synthetics).
So if the source behind the leakers' original 3070 Ti story was based on 3DMark numbers, then at games the top Arc would be at 3070 Ti minus 30-35%, which gives RX 6650 XT to RX 6600 XT levels of performance.
Now imagine Intel matching the RX 6650 XT at $399 for the 8 GB version, since it offers the same performance plus matrix A.I. cores plus a better media engine, and then +$10/GB for the 16 GB version at $479; after all, memory always sells products, right?
And at a later date, when AMD launches RDNA3, Intel lowers the prices (less embarrassing, imo, than the RX 5700/5700 XT $449-$399-$349 saga)
Now that would be something!
Just joking!!!
So we'd still pay too much for shite drivers and no competition at the price point. Even if this isn't a joke, it's a joke :D

I mean really, $399 for a bottom-mid-tier GPU... neh. Hard. Pass. Even with the full feature set NVIDIA has now, I wouldn't even think of it.
#54
Mussels
Freshwater Moderator
Vayra86: So we'd still pay too much for shite drivers and no competition at the price point. Even if this isn't a joke, it's a joke :D

I mean really, $399 for a bottom-mid-tier GPU... neh. Hard. Pass. Even with the full feature set NVIDIA has now, I wouldn't even think of it.
But if you went to buy a new laptop and the only way to get the latest Intel CPU was paired with one of these GPUs, would you consider it?

Pretty sure they know they can't compete, so they're going to push the OEMs to get the sales and profits anyway (and, of course, work on drivers and faster models over time)
#55
ModEl4
Vayra86: So we'd still pay too much for shite drivers and no competition at the price point. Even if this isn't a joke, it's a joke :D

I mean really, $399 for a bottom-mid-tier GPU... neh. Hard. Pass. Even with the full feature set NVIDIA has now, I wouldn't even think of it.
It's probably what @Mussels said: if the achieved performance is that low, the first to know about it is Intel, of course. So, to maximize their profit, the strategy is exactly what they are doing: launch OEM & China first, get a feel for what the channel will tolerate based on the negotiations that occurred and what they actually managed to sell (and at what price) through the OEM channel, and then delay the international DIY launch/reviews as much as possible, to fix driver/software issues so that very bad reviews don't adversely affect future OEM sales (it's all about OEM).
I hate being a smartass, but it seems the work happening in the software department is just not enough: either they must hire more people (if the delay in reaching an acceptable driver/SW level is because they need more time), or someone must go (if the problem is execution and leadership).
Regarding reviews, the task, although very difficult, is pretty clear: identify the major publications and the games they test and focus there, first eliminating the major problems regarding stability/visual artifacts etc., and second extracting as much performance as possible from those specific titles.
It can't be more than 50 games, really. Between Q1 2022, when they shipped final silicon, and the end of Q3, when the international DIY launch with review samples may occur (end of Q2, I suspect, is all about China and OEMs, like the mobile launch?), there are 25+ weeks: half a week's worth of working hours for the whole Intel S/W team per title, or a full week's worth if they focus on the 25 games most publications use, or on the 25 games causing them the most trouble.
And to tell you the truth, it's more like half a month's worth of working hours for the whole Intel S/W team per title, since I imagine they started a lot earlier than the end of Q1, when they shipped final silicon to OEMs.
Maybe I'm being too harsh, especially since I don't know the internals, but something must change, that's for sure; they can't be happy with the results they're achieving!
#56
R0H1T
Mussels: so they're going to push the OEMs to get the sales and profits anyway
I believe they don't have the kind of pull they had two years back, probably not even half of what they had a decade ago! Unless they're paying copious amounts of kickback/contra-revenue kind of stuff, it will be hard for them to sell many of their substandard GPUs ~ which is assuming they perform horribly.
#57
Mussels
Freshwater Moderator
R0H1T: I believe they don't have the kind of pull they had two years back, probably not even half of what they had a decade ago! Unless they're paying copious amounts of kickback/contra-revenue kind of stuff, it will be hard for them to sell many of their substandard GPUs ~ which is assuming they perform horribly.
Remember the rumours that the GPUs only worked with specific Intel CPUs and motherboards?
Intel's first Xe graphics card only works with Intel CPUs and mobos | PC Gamer

If they do this, it makes it super easy to sell all-Intel parts in certain market segments
#58
R0H1T
No, never heard of this one, but yeah, if Intel really wants to, they can definitely force the OEMs to do their bidding, that's for sure!
#59
AusWolf
Mussels: But if you went to buy a new laptop and the only way to get the latest Intel CPU was paired with one of these GPUs, would you consider it?
Honestly, if the price is right, I would. Once you have your laptop and fire up some games, you don't feel that it is 10% behind the other laptop that you didn't buy. You only feel your own gaming experience.
#60
ravenhold
When is Intel Arc releasing for desktop?
#62
bug
Mussels: I don't think anyone is shocked that Intel would be better in synthetics than in real games

Hopefully they can work some magic in drivers, but the game engines were simply designed for NVIDIA or AMD; nothing's optimised for their graphics cards
Well, barely anyone can optimize for Intel atm. And the ones that do (partners and such) probably don't put a lot of resources into it anyway.
#63
Mussels
Freshwater Moderator
bug: Well, barely anyone can optimize for Intel atm. And the ones that do (partners and such) probably don't put a lot of resources into it anyway.
Sorry, I thought the next step was implied: FUTURE games will be, but current and past ones can't be and will rely on driver tweaks
#64
ravenhold
If Intel plans to release Arc next month, they should announce it.
#65
bug
Mussels: Sorry, I thought the next step was implied: FUTURE games will be, but current and past ones can't be and will rely on driver tweaks
It's the same for AMD and NVIDIA: games released before Ampere will not be optimized for Ampere, but will rely on proper driver profiles instead. Intel doesn't have the best track record here, but let's stay positive about it.
#66
AusWolf
bug: It's the same for AMD and NVIDIA: games released before Ampere will not be optimized for Ampere, but will rely on proper driver profiles instead. Intel doesn't have the best track record here, but let's stay positive about it.
Also, just because games aren't optimised for a certain architecture doesn't mean they can't run well on it.