Thursday, June 23rd 2022

Intel's Arc A380 Performs Even Worse With an AMD CPU

According to fresh benchmark numbers from a user on Bilibili, Intel's Arc A380 performs even worse when paired with an AMD CPU than with an Intel CPU. The card was tested with an AMD Ryzen 5 5600 on an ASUS TUF B550M motherboard paired with 16 GB of DDR4-3600 memory. The Intel system it was tested against consisted of a Core i5-12400 on an ASUS TUF B660M motherboard with the same type of memory. Both test systems had Resizable BAR set to auto and Above 4G Decoding enabled, and both ran Windows 11 21H2.

In every one of the 10 games tested except League of Legends, the AMD system trailed the Intel system by anywhere from a mere one percent to as much as 15 percent. The largest deficits were in Forza Horizon 5 and Total War: Three Kingdoms, both 14 to 15 percent behind. The games tested, in the order of the graph below, are: League of Legends, Dota 2, Rainbow Six Extraction, Watch Dogs: Legion, Far Cry 6, Assassin's Creed Valhalla, Total War: Three Kingdoms, Shadow of the Tomb Raider, CS:GO and Forza Horizon 5. For comparison, an NVIDIA GeForce GTX 1650 was also included, although only on the Intel-based system, and the Arc A380 only beat it in Total War: Three Kingdoms, by a seven percent margin. It appears Intel has a lot of work to do when it comes to its drivers, but at least right now, mixing Intel Arc graphics cards and AMD processors seems to be a bad idea.
Sources: Bilibili, via @greymon55

77 Comments on Intel's Arc A380 Performs Even Worse With an AMD CPU

#51
Dave65
Wouldn't that be more about driver optimization?
Posted on Reply
#52
TheinsanegamerN
Dave65: Wouldn't that be more about driver optimization?
That's a big part. The card underperforms on both Intel and AMD, and it's all over the place in the framerates we've seen so far. The official reviews are gonna be a bloodbath.
Posted on Reply
#53
Aldain
Why are we still on this trash GPU?

Trash is trash... the end.
Posted on Reply
#54
mb194dc
GPUs from AMD and Nvidia are coming back to normal prices for now; better value and much more mature drivers, you'd think. That probably explains why the benchmarks are all over the place. Who's going to bother with these cards except possibly system builders?
Posted on Reply
#55
usiname
chrcoluk: What $160 6 GB GPUs do Nvidia and AMD offer?
You don't know? The A380 starts from $210. Also, shit with 6 GB of RAM is still shit. Even with 24 GB of RAM this card will cost more and will be slower than the 6400, not to mention the 6500 XT.
Posted on Reply
#56
TheLostSwede
News Editor
ModEl4: I remember when the usual suspects were spreading 3070 Ti level of performance, then it was 3070, now 3060 Ti.
But in 4-5 months from now, when Navi 33 launches, even if the 3060 Ti is still 20% faster than the ARC 780 despite ARC's newer drivers (which would put the ARC 780 about 2% ahead of the RX 6600 XT at QHD, so close to a reference RX 6650 XT), I can easily find price points that would be competitive with next gen. For AMD, next gen at less than $399 means RDNA2 according to leaks, and I suspect that in 4 months, or whenever Navi 33 launches, AMD won't want to sell the recently launched RX 6650 XT for less than $299.
In this future "next gen" scenario I would find the below prices quite competitive:

ARC-4096 16GB $349
ARC-4096 8GB $299
ARC-3072 12GB $249
ARC-2048 8GB $199
ARC-1024 6GB $129
ARC-768 4GB $99
ARC-512 4GB $79

Now of course they can price them higher, although I wouldn't rule out a surprise:

«The actual Chinese MSRP of the GPU is 880 Yuan but after VAT (17%) it comes out to almost 1030 Yuan»

880 Yuan could indicate a $129 SRP in an extremely positive scenario.

The wccftech link above also has an official Intel comparison with the GTX 1650/RX 6400 (A380 frequency at 2 GHz to cover all the partner card options?)

No one is selling cards for the official MSRP though; just as with the current generation of AMD cards, the board makers can't make any money at those kinds of prices.
Launch day pricing was 3999 RMB, but the cards are supposed to retail for around 1200-1300 RMB.
Posted on Reply
#57
RadeonProVega
I love these tests: Nvidia worse, AMD worse, etc. If a game is running at 60 FPS at 1920x1080, isn't that enough? lol
Posted on Reply
#58
ModEl4
Xex360: I find two issues with this. First, GPUs are coming down in price; by the launch of the new gen we could expect them to fall further.
Then the drivers: when AMD bought ATI they got the best guys out there (alongside nVidia's); what does Intel have?
Finally, nVidia is already bribing the market (interestingly even some reviewers, Tom's Hardware, now DF, but maybe these are just delusional fanboys). AMD is probably doing it as well, but certainly not to the same level; they also offer a lot of interesting technology for free (my understanding is DLSS is also free but not open source).
Damn and blast, I forgot about the abomination that is UserBenchmark.
Sure, prices are coming down, but the last 2 years seem to have spoiled Nvidia & AMD, meaning that more than ever they will surely try to "lose" the least amount of money they can.
So the question is how far they're really gonna drop, and from your answer you seem optimistic.
Nothing bad about it, we will see.
Regarding AMD getting the best guys in the market when they bought ATI, at least for the software engineers my view is not the same. I'm not saying the department isn't good now, just that it took them 16 years to be (barely?) competitive with Nvidia (and that with the whole market optimizing their games/engines for their architectures out of necessity, due to consoles).
Posted on Reply
#60
ModEl4
TheLostSwede: No one is selling cards for the official MSRP though; just as with the current generation of AMD cards, the board makers can't make any money at those kinds of prices.
Launch day pricing was 3999 RMB, but the cards are supposed to retail for around 1200-1300 RMB.
I don't know about partners' profits, but how are you in a position to judge whether Intel's partners make any money or not? (lol, sorry, just a kidding/roasting attempt; usually the margins are small in Intel's case, but small isn't the same as "can't make any money")
Sure, you can have your view, but you seemed kinda certain in the answer you gave.
Isn't Intel's usual practice to do business through rebates, MDF, etc.? And it's too early; we don't even have the international launch yet.
When AMD/Nvidia/Intel announce an SRP, they take into account the retailer/e-tailer profit (no matter how small), the distributor profit (no matter how small) and the vendor profit, and let me tell you, it isn't so small in Nvidia's and AMD's case. (same certainty also, jk)
Regarding A380 street prices, it's about supply and demand in the end, so I wouldn't rush to a judgment about partners' profits based on that. And in any case, what's the logic, that they have to sell at 3X the SRP in order to make any money?
Also, AMD's 2022 SRPs are a joke.
$550 for the 6750 XT, $400 for the 6650 XT, $200 for the 6500 XT; how are they gonna sell at these prices when Nvidia's prices finally come down to SRP, depending on the model?
Even now, when Nvidia's prices haven't come down to SRP level, AMD is forced to sell lower than SRP (in some European countries at least):
The 6750 XT is selling at 3070 level although its SRP is $50 higher, the 6700 XT is selling for nearly 100€ less than the 3070 although its SRP is only $20 lower, the 6600 is selling at 3050 level despite an $80 higher SRP, etc.
AMD's "starting price" when dealing with vendors is based on the announced SRP minus the margins mentioned above, and after negotiation based on current market conditions, quantity, sales record, etc. they reach an agreement. So AMD's partners have to fight a lot harder than before in negotiations with those high SRPs in order to maintain their margins, and if they have to sell some models 15% or more below SRP due to market conditions (RX 6500 XT and RX 6600, for example), then depending on the deal for those particular models, I imagine the profits would be very slim.
Intel doesn't seem to have this SRP problem with the 880 Yuan + VAT SRP, but we will see.
In any case, my post was about potential SRPs, not what street prices will end up being; I wouldn't know anyway.
Posted on Reply
#61
chrcoluk
I feel these AMD/Nvidia prices, which are still high but trending downwards, are just the usual end-of-gen price drop; with Nvidia's 4000 series they will jump again at launch. Whereas with Intel, these are start-of-gen prices.
Posted on Reply
#62
Lew Zealand
TheinsanegamerN: Much simpler answer: Intel's driver is likely single-threaded, like AMD's, and just like AMD's driver this results in the hardware running better on Intel's CPUs.
Hey, this is the first I've heard of (some) drivers being single-threaded, and it may explain a few things I've observed and been annoyed about. I can't change it, but I'd like to understand it.

Got any tech links for the curious about this?
Posted on Reply
#63
catulitechup
TheLostSwede: No one is selling cards for the official MSRP though; just as with the current generation of AMD cards, the board makers can't make any money at those kinds of prices.


So cute to think they don't make enough money selling video cards

:)
Posted on Reply
#64
HisDivineOrder
Seems pretty obvious that anyone buying these cards is buying an alpha experience, and that the drivers will probably be up to snuff right about the time they're releasing a new card that's a lot better.
Posted on Reply
#65
Assimilator
HisDivineOrder: Seems pretty obvious that anyone buying these cards is buying an alpha experience, and that the drivers will probably be up to snuff right about the time they're releasing a new card that's a lot better.
LOL, no. What will happen is that the product will flop and Intel will abandon the GPU market after a year, and all the suckers who bought their dGPUs will be stuck with trash.
Posted on Reply
#66
OneMoar
There is Always Moar
No shit, Sherlock, because AMD CPUs are a bit slower than the latest from Intel in CPU-bound 1080p tests, with the exception of the 5800X3D.
Nothing to do with drivers; you are testing at 1080p where you are CPU limited, and the system with the faster CPU/bus will always win.

Why TPU keeps posting this clickbait bullshit targeted at stirring up the uninformed is beyond me.
Posted on Reply
#67
Unregistered
ModEl4: Sure, prices are coming down, but the last 2 years seem to have spoiled Nvidia & AMD, meaning that more than ever they will surely try to "lose" the least amount of money they can.
So the question is how far they're really gonna drop, and from your answer you seem optimistic.
Nothing bad about it, we will see.
Regarding AMD getting the best guys in the market when they bought ATI, at least for the software engineers my view is not the same. I'm not saying the department isn't good now, just that it took them 16 years to be (barely?) competitive with Nvidia (and that with the whole market optimizing their games/engines for their architectures out of necessity, due to consoles).
Indeed we will see, but with the amount of used GPUs, I'm confident the prices will go down.
Money was an issue; it seems they focused on hardware, where they were competitive and at times superior, and that's what kept AMD afloat.
As for consoles, nVidia and Intel both failed to capitalise on this; AMD on the other hand succeeded. The way I see it, optimising for consoles is a by-product of AMD's strategy. (Though another by-product is the powerless Jaguar cores.)
Posted on Edit | Reply
#68
Berfs1
I get the feeling it has something to do with Intel combining the iGPU with the dGPU.
Posted on Reply
#69
ModEl4
Xex360: Indeed we will see, but with the amount of used GPUs, I'm confident the prices will go down.
Money was an issue; it seems they focused on hardware, where they were competitive and at times superior, and that's what kept AMD afloat.
As for consoles, nVidia and Intel both failed to capitalise on this; AMD on the other hand succeeded. The way I see it, optimising for consoles is a by-product of AMD's strategy. (Though another by-product is the powerless Jaguar cores.)
Let's hope you're right; it would be great for all of us.
But I don't see AMD's SRPs going lower than the assumptions below (if street prices end up lower, then by the same logic Intel's street prices can also be lower relative to the potential SRPs I quoted before). If AMD's SRPs do go lower than the examples below, Intel will potentially have a problem (depending on the competition's actual street prices), being forced to sell too low given their die sizes:

RX 6750XT $399 SRP
RX 6700XT $359 SRP (if it exists in Q4)
RX 6700 $329 SRP (if it comes to the DIY channel)
RX 6650XT $299 SRP
RX 6600XT $279 SRP (if it exists in Q4)
RX 6600 $229 SRP

Regarding consoles, Intel's & Nvidia's decisions were based on the fact that they had the vast majority of the PC market, so they didn't really need those extra console sales (with such low margins, I mean), and more importantly their engineers were preoccupied with more important tasks, trying to expand/research into other fields, especially Nvidia's (Intel after a while was also preoccupied with dealing with their process problems and optimizing their designs around them lol).
On the contrary, from the C2D era onwards AMD didn't have anything good to compete with in the CPU space except edge cases (in 2013, how well could an 8-core Jaguar design sell if the refreshed 8-core Bulldozer had trouble selling?), and in the GPU space, although from time to time they had interesting hardware designs, their software was inferior to Nvidia's and they never captured more than 1/5th of the annual AIB GPU market anyway. So in order to expand sales and "force" developers to optimize their games/engines for their architectures, instead of heavily investing (with no money, how...) in acquiring talent and closing deals with developers/publishers, they went forward with their console strategy, which in the end was absolutely the right move. Now, given how important backwards compatibility is and how much more successful AMD is these days in the PC market, they may try to raise margins on MS and Sony, but it would be unwise to say the least...
Posted on Reply
#70
Unregistered
ModEl4: Let's hope you're right; it would be great for all of us.
But I don't see AMD's SRPs going lower than the assumptions below (if street prices end up lower, then by the same logic Intel's street prices can also be lower relative to the potential SRPs I quoted before). If AMD's SRPs do go lower than the examples below, Intel will potentially have a problem (depending on the competition's actual street prices), being forced to sell too low given their die sizes:

RX 6750XT $399 SRP
RX 6700XT $359 SRP (if it exists in Q4)
RX 6700 $329 SRP (if it comes to the DIY channel)
RX 6650XT $299 SRP
RX 6600XT $279 SRP (if it exists in Q4)
RX 6600 $229 SRP

Regarding consoles, Intel's & Nvidia's decisions were based on the fact that they had the vast majority of the PC market, so they didn't really need those extra console sales (with such low margins, I mean), and more importantly their engineers were preoccupied with more important tasks, trying to expand/research into other fields, especially Nvidia's (Intel after a while was also preoccupied with dealing with their process problems and optimizing their designs around them lol).
On the contrary, from the C2D era onwards AMD didn't have anything good to compete with in the CPU space except edge cases (in 2013, how well could an 8-core Jaguar design sell if the refreshed 8-core Bulldozer had trouble selling?), and in the GPU space, although from time to time they had interesting hardware designs, their software was inferior to Nvidia's and they never captured more than 1/5th of the annual AIB GPU market anyway. So in order to expand sales and "force" developers to optimize their games/engines for their architectures, instead of heavily investing (with no money, how...) in acquiring talent and closing deals with developers/publishers, they went forward with their console strategy, which in the end was absolutely the right move. Now, given how important backwards compatibility is and how much more successful AMD is these days in the PC market, they may try to raise margins on MS and Sony, but it would be unwise to say the least...
Let's hope so, but we can only wait and see.
But I beg to differ regarding consoles: nVidia failed with both Microsoft and Sony, plus ATI had more experience. I would rather say that because they failed in this market, they were forced out and concentrated on other things.
For Intel I can agree; the market-share argument applies, as they dominated the CPU market and, if memory serves, they were leaders in the GPU market as well with their iGPUs.
AMD had a lot of interesting designs; while they didn't always go for the fastest GPU, except against the 8800 GTX (which was a surprise, as ATI was first with a unified design in Xenos yet failed miserably with the 2900 XT) and Pascal (a great architecture from nVidia, very efficient and powerful), they had good offerings.
Posted on Edit | Reply
#71
chrcoluk
catulitechup

So cute to think they don't make enough money selling video cards

:)
Asus are now on EPL advertising boards, which takes mega $$$, right alongside international banks and the like.
Posted on Reply
#72
Unregistered
It's Intel's first discrete GPU, so however slow it is, it's a start.
At least they kick ass at CPUs.
Posted on Edit | Reply
#73
ModEl4
Xex360: Let's hope so, but we can only wait and see.
But I beg to differ regarding consoles: nVidia failed with both Microsoft and Sony, plus ATI had more experience. I would rather say that because they failed in this market, they were forced out and concentrated on other things.
For Intel I can agree; the market-share argument applies, as they dominated the CPU market and, if memory serves, they were leaders in the GPU market as well with their iGPUs.
AMD had a lot of interesting designs; while they didn't always go for the fastest GPU, except against the 8800 GTX (which was a surprise, as ATI was first with a unified design in Xenos yet failed miserably with the 2900 XT) and Pascal (a great architecture from nVidia, very efficient and powerful), they had good offerings.
Nvidia f**ked Microsoft on pricing with the original Xbox, and Sony too with the PS3; MS & Sony never went back. Regarding MS, I have the feeling that the 24-bit minimum mandatory precision requirement for DX9.0 (which really caused a lot of trouble for Nvidia's NV30 performance, forcing them to do shader calculations at 32-bit instead of 16-bit) was a warning. 16-bit had some precision issues in some cases, causing artifacts, but from an engineering standpoint the 16-bit/32-bit split was probably the right call for Nvidia for that time period (Q1 2003 with the delay; ATI was first with the 9700 regarding DX9.0). After all, the roadmap back then was very rapid; by the time DOOM 3 & Half-Life 2 launched they already had NV40 out. Of course, in addition to the DX situation, NV30 had other problems too, mainly derived from early usage of TSMC's 130 nm process, which was immature, and on top of that they clocked NV30 too high in order to compensate for the DX9.0 disadvantage, causing extra problems with power consumption/heat, etc. All of that was known/forecast by the time the 9700 launched, which is why at that time Nvidia introduced "The Way It's Meant to Be Played" program, another saga that caused Nvidia trouble back then...
On top of pricing, they were unwilling to customize much because that meant allocating extra resources.
But my reading of the situation wasn't that they f**ked up (well, they kinda did lol); they just went into these business deals half-heartedly imo.
EDIT: if I remember correctly, Xenos was mainly an ArtX design; R600 was from another team (David Wang, current SVP of Engineering for RTG, was at ArtX and SGI before; that's why when I saw Infinity Cache, I thought the dream lives on lol).
Posted on Reply
#74
TheinsanegamerN
Lew Zealand: Hey, this is the first I've heard of (some) drivers being single-threaded, and it may explain a few things I've observed and been annoyed about. I can't change it, but I'd like to understand it.

Got any tech links for the curious about this?
Amd/comments/j0c4cp
That leads to this:

www.intel.com/content/www/us/en/developer/articles/technical/performance-methods-and-practices-of-directx-11-multithreaded-rendering.html

The most relevant part:
"By checking the GPU driver support for DirectX 11 multithreaded rendering features (see Figure 7) through the DirectX Caps Viewer, we learn that the NVIDIA GPU driver supports driver command lists, while the AMD GPU driver does not support them. This explains why the driver modules on different GPUs appear in different contexts. When paired with the NVIDIA GPU, working threads can build driver commands in parallel in a deferred context; while when paired with the AMD GPU, the driver commands are all built in serial in the immediate context of the main thread."

From this, it's been regarded by most of the tech community that AMD struggles with DX11 and older titles where threading becomes an issue, and it was observed that AMD GPUs tended to run a bit faster on Intel CPUs and Nvidia better on AMD, as AMD had more threads to work with.
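For anyone who wants to check this on their own machine, here's a minimal sketch (my own, not from the Intel article; it assumes a Windows SDK environment and linking against d3d11.lib) that queries the same capability through the D3D11 API:

#include <d3d11.h>
#include <cstdio>

int main() {
    ID3D11Device* device = nullptr;
    // Create a default hardware device on whatever adapter Windows picks;
    // pass a specific IDXGIAdapter* as the first argument to target another GPU.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, nullptr);
    if (FAILED(hr)) return 1;

    // Same capability bits the DirectX Caps Viewer shows.
    D3D11_FEATURE_DATA_THREADING threading = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING, &threading, sizeof(threading));

    // DriverCommandLists == TRUE: the driver builds deferred-context command
    // lists itself, so worker threads can record them in parallel.
    // FALSE: the runtime emulates them and the work is serialized.
    std::printf("DriverConcurrentCreates: %d\n", threading.DriverConcurrentCreates);
    std::printf("DriverCommandLists:      %d\n", threading.DriverCommandLists);

    device->Release();
    return 0;
}

If DriverCommandLists comes back FALSE, deferred-context command lists are emulated by the D3D11 runtime and end up serialized on the immediate context, which is the behaviour the Intel article describes for AMD's driver.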
OneMoar: No shit, Sherlock, because AMD CPUs are a bit slower than the latest from Intel in CPU-bound 1080p tests, with the exception of the 5800X3D.
Nothing to do with drivers; you are testing at 1080p where you are CPU limited, and the system with the faster CPU/bus will always win.
Bruh, the A380 is nowhere near fast enough to hit CPU-bound limitations at 1080p. You need 6950 XT or 3090 Ti tier GPUs to start seeing that. And even if it WERE fast enough, the differences seen here are far larger than what has been previously observed, and can't just be due to CPU differences.
OneMoar: Why TPU keeps posting this clickbait bullshit targeted at stirring up the uninformed is beyond me.
:laugh: :roll: :laugh:
It's Dunning-Kruger in action.
ModEl4: Sure, prices are coming down, but the last 2 years seem to have spoiled Nvidia & AMD, meaning that more than ever they will surely try to "lose" the least amount of money they can.
So the question is how far they're really gonna drop, and from your answer you seem optimistic.
Nothing bad about it, we will see.
Regarding AMD getting the best guys in the market when they bought ATI, at least for the software engineers my view is not the same. I'm not saying the department isn't good now, just that it took them 16 years to be (barely?) competitive with Nvidia (and that with the whole market optimizing their games/engines for their architectures out of necessity, due to consoles).
Oh absolutely. ATi was for sale, in no small part, because their drivers were absolute junk, which led to lower sales even when they had superior hardware, because Nvidia back then just worked. ATi, OTOH, was prone to breaking things and requiring you to shift between multiple different drivers depending on the game you were playing.

Their hardware engineers did a lot with very little, and deserve credit, but their software guys were hilariously behind the competition and it would take AMD 15 years to fully unscrew their software.
Posted on Reply
#75
OneMoar
There is Always Moar
warning extreme anger ahead
IT'S NOT THE FUCKING DRIVERS you stupid !##sad9!
It's the fucking Intel 12th gen vs last-gen AMD.

Why are we still here?
We are talking about 5-8 FPS. When tested at 1080p, the Intel Core i5-12400 is pretty much, YOU GUESSED IT, 5-8 FPS faster than a 5600, and that's when tested with a fking 3080.
You people are fking idiots and should never post anything ever again, because you can't be bothered to stop and spend 60 fucking nanoseconds doing some basic googling and math.

Also, the A380 is barely keeping up with a GTX 1650, a fucking entry-level card from 3 years ago. Why the fuck are we not discussing that?
Posted on Reply