Wednesday, June 22nd 2022

Intel Arc A380 Desktop GPU Does Worse in Actual Gaming than Synthetic Benchmarks

Intel's Arc A380 desktop graphics card is generally available in China, and real-world gaming benchmarks of the card by independent media paint a vastly different picture than the one synthetic benchmarks had led us to expect. The entry-mainstream graphics card, selling for the equivalent of under $160 in China, beats the AMD Radeon RX 6500 XT and RX 6400 in the 3DMark Port Royal and Time Spy benchmarks by a significant margin. In actual gaming, however, it loses to even the RX 6400 in each of the six games tested by the source.

The tests in the graph below are, in order: League of Legends, PUBG, GTA V, Shadow of the Tomb Raider, Forza Horizon 5, and Red Dead Redemption 2. In the first three tests, which are DirectX 11-based, the A380 is 22 to 26 percent slower than the NVIDIA GeForce GTX 1650 and the Radeon RX 6400. The gap narrows in the DirectX 12 titles SoTR and Forza Horizon 5, where it is within 10% of the two cards. The card's best showing is in the Vulkan-powered RDR 2, where it is 7% slower than the GTX 1650 and 9% behind the RX 6400. The RX 6500 XT performs in a different league altogether. With these numbers, and given that GPU prices are cooling down in the wake of the 2022 cryptocalypse, we're not entirely sure what Intel is trying to sell at $160.
Sources: Shenmedounengce (Bilibili), VideoCardz

190 Comments on Intel Arc A380 Desktop GPU Does Worse in Actual Gaming than Synthetic Benchmarks

#77
Vayra86
Bomby569You don't get optimized iGPU drivers, you get drivers. It's a new game, it's not an excuse, it's reality. How long has AMD been doing GPUs? They still mess up the drivers.
CPUs are another story; we all know the previous management was a disgrace. They were the first to get the new EUV machines from ASML. It doesn't make them better at what they do, but they are doing things differently now.
There is the occasional mess-up in an otherwise pretty refined driver regime aimed at gaming (and it sure as hell is optimized; I'm sure you remember NvInspector, where you can tweak a million things per game), and then there is a driver regime that was never aimed at gaming and is to this date incapable of delivering the performance expected from the hardware available. That bit is key. Efficiency is king. The company that extracts the highest FPS per square mm of die space is going to win the race, and if it does so at a decent or lower power budget, it wins doubly so.
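To put a number on that "FPS per square mm" idea, here is a minimal sketch of the metric; the die sizes and frame rates below are invented placeholders, not real measurements for any actual GPU:

```cpp
#include <cstdio>

// Toy illustration of the "FPS per square mm" efficiency metric.
// Die sizes and frame rates are made-up placeholder values, NOT
// measured data for any real GPU.
struct Gpu {
    const char* name;
    double die_mm2;  // die area in square millimetres (assumed)
    double avg_fps;  // average FPS at some fixed game/settings (assumed)
};

int main() {
    const Gpu gpus[] = {
        {"GPU A", 107.0, 60.0},
        {"GPU B", 157.0, 75.0},
    };
    for (const Gpu& g : gpus)
        std::printf("%s: %.3f FPS/mm^2\n", g.name, g.avg_fps / g.die_mm2);
    return 0;
}
```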
Posted on Reply
#78
chstamos
I honestly wonder what Qualcomm would deliver with a "discrete Adreno" of some sort, if it allocated Intel's discrete GPU budget to it. I bet it would be better.
Posted on Reply
#79
AusWolf
If it has enough VRAM to actually do raytracing (unlike the 6400 / 6500 XT), that's something... I guess?
Posted on Reply
#80
efikkan
john_3DMark performance shows possible potential. Games show current reality.
For clarity: if you think synthetic benchmarks illustrate the performance of an optimal driver, then absolutely not. Synthetic benchmarks are only relevant if games have very similar characteristics.

I honestly see no usefulness for consumers in synthetic benchmarks at all, except perhaps to get a sneak peek at new technologies.
kajsonImmature drivers might help a little, but it seems just as likely that the drivers have been "optimized" for synthetic benchmarks, and it won't ever even perform at GTX 1650 level.
Most gamers have many misconceptions about how optimization works.
Drivers are not (normally) optimized for specific applications/games/benchmarks etc., except the few times they are cheating*.

If we see games scale in one way and synthetics scale in another, then it's mostly due to hardware resources. We should not expect games to scale badly if they use the same graphics APIs. And if there are exceptions, those would probably boil down to certain parameters to functions causing extra work, and such should be identified quickly and fixed. But if we see general performance differences, then it's not due to "immature drivers".

*) Drivers have been known to cheat in certain synthetic benchmarks and sometimes even games, e.g. by replacing shaders to gain performance with "minor" degradation in visual quality. Overriding API calls is also possible, but it causes driver overhead, so this is super rare. One very old documented example of cheating is this.
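For illustration only, here is roughly what such an application-detection cheat looks like; every name, hash, and value below is invented for the sketch and not taken from any real driver:

```cpp
#include <string>
#include <unordered_map>

// Hypothetical sketch of application-detection shader replacement.
// A cheating driver matches the running executable (or a hash of the
// incoming shader) and silently swaps in a cheaper, hand-tuned shader.
struct ShaderBlob { std::string bytecode; };

// Keyed by "exe_name:shader_hash"; a real cheat table would be baked
// into the driver binary. Entries here are invented.
static const std::unordered_map<std::string, ShaderBlob> kReplacements = {
    {"some_benchmark.exe:a1b2c3", {"cheaper hand-tuned bytecode"}},
};

ShaderBlob CompileShader(const std::string& exeName,
                         const std::string& shaderHash,
                         const ShaderBlob& original) {
    auto it = kReplacements.find(exeName + ":" + shaderHash);
    if (it != kReplacements.end())
        return it->second;  // faster shader, "minor" visual degradation
    return original;        // everyone else gets the real shader
}
```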
JalleRWell this time they have sent a card to the market that's 100% better than last time :D
More like infinitely better! :P
AlwaysHopePoor performance from the Intel GPU because game devs have already optimised game code for either AMD or NVIDIA in those game titles mentioned by the OP.
Games are not optimized for specific hardware, at least not the way you think.
Doing so would require either 1) exploiting very specific low-level hardware differences unique to one hardware maker, or 2) using separate code paths and low-level APIs targeting each specific GPU family.
(1) is "never" done intentionally, and (2) is only used in very specific cases.
Practically all games these days are built on very bloated, abstracted engines. Most games today contain little to no low-level code at all, and quite often use an abstracted game engine (often third-party).
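To make case (2) concrete, here is a minimal sketch of a per-vendor code path under Vulkan; the PCI vendor IDs are the standard registered values, the branch bodies are placeholders, and instance/device setup is omitted for brevity:

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>

// Sketch of case (2): selecting a per-vendor code path under Vulkan.
// The vendor IDs are real PCI-SIG assignments; what an engine would do
// inside each branch is the expensive part, which is why few bother.
void SelectCodePath(VkPhysicalDevice device) {
    VkPhysicalDeviceProperties props;
    vkGetPhysicalDeviceProperties(device, &props);

    switch (props.vendorID) {
        case 0x10DE: /* NVIDIA-tuned path, e.g. subgroup-size tweaks */ break;
        case 0x1002: /* AMD-tuned path, e.g. wave64-friendly shaders */ break;
        case 0x8086: /* Intel: often just falls through to generic   */ break;
        default:     /* generic path */                                break;
    }
    std::printf("Running on %s (vendor 0x%04X)\n",
                props.deviceName, props.vendorID);
}
```

Maintaining real, tested content behind each of those branches is exactly the kind of cost almost no studio takes on, which is why (2) stays rare.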
Posted on Reply
#81
AusWolf
efikkanFor clarity: if you think synthetic benchmarks illustrate the performance of an optimal driver, then absolutely not. Synthetic benchmarks are only relevant if games have very similar characteristics.

I honestly see no usefulness for consumers in synthetic benchmarks at all, except perhaps to get a sneak peek at new technologies.
Exactly. Games are designed to use your hardware as a whole. Synthetics are designed to test and stress one very specific aspect of your hardware (e.g. the GPU cores only). That's why my CPU usage in Time Spy graphics tests is near zero. Games and synthetics will never be the same.
Posted on Reply
#82
Dr. Dro
It's been funny to read this thread and see 20-year-old shenanigans from both GPU companies brought up, and how they remain amazingly relevant today :D

It is equally unfortunate to see that clubism is alive and well, to the extent that the entry of a third player is seen as hostile and something to be disregarded, when we as consumers should be cheering on a new product from a new manufacturer. I wish Intel well and Raja Koduri's team success in developing Arc into a fine series of dedicated graphics cards, and if I can, I will definitely be purchasing one, even if just a basic model.
Posted on Reply
#83
TheoneandonlyMrK
AlwaysHopePoor performance from the Intel GPU because game devs have already optimised game code for either AMD or NVIDIA in those game titles mentioned by the OP.
To be fair, this is probably the best reply. Games very much are optimised for one, maybe two architectures, and few are optimised for Intel. There was a DiRT game (Codemasters' DiRT Rally, I think) made for and in collaboration with Intel; it ran very well on Intel hardware, tbf. It'd be nice to see how it runs here.
Posted on Reply
#84
ARF
Robin SeinaWhy? I see that the price for the 6500 XT is around 155-190 USD, while the RTX 3050 is over 300 USD. The GTX 1650 starts at about the same price or 10% above the 6500 XT, with most models over 200 USD. And as can be seen in the screenshots above in this thread, the 6500 XT is the winner in the performance/USD category (among roughly same-priced cards).
Because it offers absolutely the same raster performance as the 2019 RX 5500 XT, minus the hardware acceleration for media content.
It is a sidegrade or downgrade which no one needs or has ever asked for.
Posted on Reply
#85
catulitechup
RipcordSimple fact is the RX 6500 XT has no real competition in its price bracket. It's a solid performer. AMD released a good product at a good price.


Self-delusion is all that's left when a product is such a blatant scam on multiple levels:

- Price: $200 (what trash)

- 4 GB of 64-bit VRAM (for $200; what trash again)

- Seriously cut-down PCIe lanes (meanwhile, other products in this price range have x16 lanes, and the company's previous product, the RX 5500 XT, came with x8 lanes; what trash again)

- Cut encode and decode capabilities (meanwhile, other products in this price range have encode and decode capabilities, and the previous RX 5500 XT had them too; what trash again)

Back on topic: even with the A380's current game performance, I think Intel needs to launch Arc now to improve the drivers with user feedback, at a price that matches actual game performance. If the price stays between $120 and $130 max, it will be a very interesting product, because it offers 6 GB of 96-bit VRAM, x8 PCIe lanes, and complete decode and encode capabilities, including the AV1 format.

:)
Posted on Reply
#86
bug
Lew ZealandThe A380 doesn't compete in that segment; Intel has higher-core-count parts for that (though those may be as underwhelming as this low-end one).

Intel A380 - 1024 cores
AMD 6700 - 2560 cores
Nvidia 3060Ti - 4864 cores

This competes in core count with the GTX 1650/1650 Super and RX 6400/6500. And loses.
Are there higher specced cards? I thought A380 is their top offering.
Posted on Reply
#87
ARF
bugAre there higher specced cards? I thought A380 is their top offering.
No, 3 stands for the lowest end. They should release Arc A780. Intel Arc A780 Specs | TechPowerUp GPU Database

3 for low end, 5 for mid-range and 7 for high end. I guess the 8 stands for the top offering in the corresponding tier - in this case, tier 3.
Posted on Reply
#88
defaultluser
RipcordIgnore them, they don't have the card and don't know what they are talking about. Simple fact is the RX 6500 XT has no real competition in its price bracket. It's a solid performer. AMD released a good product at a good price.
Not at $200 for a 64-bit card.

The 6600 is already barely enough at 128 bits, because a very small die means noticeably less Infinity Cache, and there is a minimum cache size required for that near-doubling of performance per memory tick (see RX 6900 XT vs. 5700 XT)!

The existing 128-bit RDNA cards are faster than the 6500 XT! If the cache actually worked on a card this small, the performance would be in the range of the 3050, not 10% slower than a 5500 XT!
Posted on Reply
#89
Lew Zealand
bugAre there higher specced cards? I thought A380 is their top offering.
Like ARF said, this is their *lowest*-specced offering. Now, I still don't expect the Arc 770 or 780 to match the 3070, but matching any of those upper-tier GPUs is not the point for a freshman effort. Instead, offering a decent midrange GPU at a decent price should be the target. We'll see if Intel is up to that challenge.
Posted on Reply
#90
ARF
Intel's naming scheme is quite interesting. I guess the next generation will have B380, B580 and B780 cards. Of course, B790, B550 and B310 products are also likely in order to widen the product stack.
Posted on Reply
#91
bug
Lew ZealandLike ARF said, this is their *lowest*-specced offering. Now, I still don't expect the Arc 770 or 780 to match the 3070, but matching any of those upper-tier GPUs is not the point for a freshman effort. Instead, offering a decent midrange GPU at a decent price should be the target. We'll see if Intel is up to that challenge.
Ah, ok, then there's still hope.

And I am really curious about the actual performance, because there's no obvious reason for the performance gap between synthetics and games to be so big. Either Intel is somehow cheating in tests (not proven so far; cheating usually affects image quality), or the potential is actually there but for some reason isn't realized in games yet.
Posted on Reply
#92
AusWolf
catulitechup

Self-delusion is all that's left when a product is such a blatant scam on multiple levels:

- Price: $200 (what trash)

- 4 GB of 64-bit VRAM (for $200; what trash again)

- Seriously cut-down PCIe lanes (meanwhile, other products in this price range have x16 lanes, and the company's previous product, the RX 5500 XT, came with x8 lanes; what trash again)

- Cut encode and decode capabilities (meanwhile, other products in this price range have encode and decode capabilities, and the previous RX 5500 XT had them too; what trash again)

Back on topic: even with the A380's current game performance, I think Intel needs to launch Arc now to improve the drivers with user feedback, at a price that matches actual game performance. If the price stays between $120 and $130 max, it will be a very interesting product, because it offers 6 GB of 96-bit VRAM, x8 PCIe lanes, and complete decode and encode capabilities, including the AV1 format.

:)
There is one thing the 6500 XT does right: my Asus TUF is so quiet, even when overclocked! It literally competes with passive cards in noise. It's such a delight to game on a completely silent PC! :cool:
Posted on Reply
#93
ARF
AusWolfThere is one thing the 6500 XT does right: my Asus TUF is so quiet, even when overclocked! It literally competes with passive cards in noise. It's such a delight to game on a completely silent PC! :cool:
I would like to know more about this... I guess you can have a quiet PC with a higher-end graphics card like the Radeon RX 6800 XT in the right PC case. Be Quiet.
Posted on Reply
#94
AusWolf
ARFI would like to know more about this... I guess you can have a quiet PC with a higher end graphics card like the Radeon RX 6800 XT in the correct PC case. Be Quiet.
In a large case with lots of airflow, maybe. It's a shame I don't like large cases (m-ATX is my top preference).
Posted on Reply
#95
Crackong
It is so funny that some people treat Intel as a 'newcomer' in the graphics segment despite the fact that Intel has 20 years of iGPU experience.

They just need to work hard.
We need more actual products, not fancy hype train PPTs.
Posted on Reply
#96
lexluthermiester
Bomby569one word: drivers
That's what I'm thinking. Drivers need to be optimized.
Posted on Reply
#97
mplayerMuPDF
I am not an Intel fan at all (in fact, I got rid of all my Intel CPU-powered computers after Meltdown etc.), but I really do not get the people who are hating on this dGPU effort of theirs. People should be overjoyed that a new player is entering the market when consumers are getting shafted by ridiculous stunts such as the RX 6500 "XT" with x4 PCIe and no hardware encoding (when even my low-end Polaris card that I bought for $90 open box on eBay has that). Who cares if the initial gaming performance is a bit underwhelming? AMD had massive driver issues even though they have been making dGPUs for literally decades (well, technically ATI). Even if it really does end up underperforming, Intel will simply cut the price, because they can afford to do so to annoy AMD and NVIDIA, and then the cards *will* sell, just in a different performance tier. It is practically a dream come true to see competition return to the low-end segment. Not everyone wants or can afford a mid- or high-end card, especially with the current ridiculous power consumption (combined with a global energy crisis). And as a Linux user, I know that even if the Windows drivers end up not being the best, it will have very good Linux support (including OpenCL).
Posted on Reply
#98
AlwaysHope
efikkan....

Games are not optimized for specific hardware, at least not the way you think.
Doing so would require either 1) exploiting very specific low-level hardware differences unique to one hardware maker, or 2) using separate code paths and low-level APIs targeting each specific GPU family.
(1) is "never" done intentionally, and (2) is only used in very specific cases.
Practically all games these days are built on very bloated, abstracted engines. Most games today contain little to no low-level code at all, and quite often use an abstracted game engine (often third-party).
Of course they are not optimized for ONLY one specific hardware line, like NVIDIA GPUs, for example. But have you ever played one of the most famous RPGs of the 2010s, Skyrim?
That game is optimized for NVIDIA GPUs. I had an HD 7870 at first and then upgraded to an R9 Nano back in the day, and AMD drivers were always struggling to keep gameplay smooth and consistent everywhere in that AAA-rated and supremely popular game. Now, anyone can criticise how crappy the game engine was in the first place, but that's getting into another argument not relevant to this thread.
Posted on Reply
#99
Minus Infinity
regs6500 XT is a disaster you shouldn't buy.
True, but the A380 delivers RX 6400 trash-tier performance or worse. Raja is finished!
Posted on Reply
#100
AusWolf
mplayerMuPDFI am not an Intel fan at all (in fact, I got rid of all my Intel CPU-powered computers after Meltdown etc.), but I really do not get the people who are hating on this dGPU effort of theirs. People should be overjoyed that a new player is entering the market when consumers are getting shafted by ridiculous stunts such as the RX 6500 "XT" with x4 PCIe and no hardware encoding (when even my low-end Polaris card that I bought for $90 open box on eBay has that). Who cares if the initial gaming performance is a bit underwhelming? AMD had massive driver issues even though they have been making dGPUs for literally decades (well, technically ATI). Even if it really does end up underperforming, Intel will simply cut the price, because they can afford to do so to annoy AMD and NVIDIA, and then the cards *will* sell, just in a different performance tier. It is practically a dream come true to see competition return to the low-end segment. Not everyone wants or can afford a mid- or high-end card, especially with the current ridiculous power consumption (combined with a global energy crisis). And as a Linux user, I know that even if the Windows drivers end up not being the best, it will have very good Linux support (including OpenCL).
I would completely agree with you if I could see those Intel cards anywhere. To have competition, you have to sell something.
Posted on Reply