Thursday, June 23rd 2022
Intel's Arc A380 Performs Even Worse With an AMD CPU
According to fresh benchmark numbers from someone on bilibili, Intel's Arc A380 cards perform even worse when paired with an AMD CPU compared to when paired with an Intel CPU. The card was tested using an AMD Ryzen 5 5600 on an ASUS TUF B550M motherboard paired with 16 GB of DDR4 3600 MHz memory. The Intel system it was tested against consisted of a Core i5-12400 on an ASUS TUF B660M motherboard with the same type of memory. Both test systems had resizable BAR support set to auto and above 4G decoding enabled. Windows 11 21H2 was also installed on both systems.
In all 10 games tested except League of Legends, the AMD system trailed the Intel system by anywhere from a mere one percent to as much as 15 percent. The biggest deficits were in Forza Horizon 5 and Total War: Three Kingdoms, both 14 to 15 percent behind. The games tested, in the order of the graph below, are: League of Legends, Dota 2, Rainbow Six Extraction, Watch Dogs: Legion, Far Cry 6, Assassin's Creed Valhalla, Total War: Three Kingdoms, Shadow of the Tomb Raider, CS:GO and Forza Horizon 5. For comparison, an NVIDIA GeForce GTX 1650 was also used, but only on the Intel-based system, and the Arc A380 only beat it in Total War: Three Kingdoms, albeit by a seven percent margin. It appears Intel has a lot of work to do when it comes to its drivers, but at least right now, mixing Intel Arc graphics cards and AMD processors seems to be a bad idea.
Sources:
bilibili, via @greymon55
77 Comments on Intel's Arc A380 Performs Even Worse With an AMD CPU
trash is trash .. the end
Launch day pricing was 3999 RMB, but the cards are supposed to retail for around 1200-1300 RMB.
So the question is how far they're really going to drop, and from your answer you seem optimistic.
Nothing bad about it, we will see.
Regarding AMD getting the best software engineers on the market when it bought ATI, my view is not the same. I'm not saying the department isn't good now, just that it took them 16 years to become (barely?) competitive with Nvidia (and that with the whole market optimizing their games/engines for AMD's architectures out of necessity, due to consoles).
Sure you can have your view but you seemed kinda certain in the answer you gave.
Isn't it the usual practice for Intel to do business through rebates, MDF, etc.? And it's too early; we don't even have the international launch yet.
When AMD/Nvidia/Intel announce an SRP, they take into account the retailer/e-tailer profit (no matter how small), the distributor profit (no matter how small) and the vendor profit, and let me tell you, it isn't so small in Nvidia's and AMD's case. (Same certainty also, jk.)
Regarding A380 street prices, it's about supply and demand in the end, so I wouldn't rush to judgment about partner profits based on that. And in any case, what's the logic, that they have to sell at 3X the SRP in order to make any money?
Also, AMD's 2022 SRPs are a joke.
$550 for the 6750XT, $400 for the 6650XT, $200 for the 6500XT. How are they going to sell at these prices when Nvidia's prices finally come down to SRP, depending on the model?
Even now, when Nvidia's prices haven't come down to SRP levels, AMD is forced to sell below SRP (in some European countries at least):
The 6750XT is selling at 3070 level although it's $50 more expensive, the 6700XT is selling nearly 100€ less than the 3070 although it's only $20 cheaper, and the 6600 is selling at 3050 level despite an $80 higher SRP, etc.
AMD's "starting price" when dealing with vendors is based on the announced SRP minus the calculated margins mentioned above. After negotiation based on current market conditions, quantity, sales record, etc., they reach an agreement. So with these high SRPs, AMD's partners have to fight a lot harder than before in negotiations to maintain their margins, and if they have to sell some models at 15% or more below SRP due to market conditions (RX6500XT and RX6600, for example), then depending on the deal for those particular models, I imagine the profits would be very slim.
Intel doesn't seem to have this SRP problem with an 880 yuan + VAT SRP, but we will see.
In any case, my post was about potential SRPs, not what street prices will end up being; I wouldn't know anyway.
Got any tech links for the curious about this?
So cute, thinking they don't make enough money selling video cards.
:)
Nothing to do with drivers. You are testing at 1080p, where you are CPU limited; the system with the faster CPU/bus will always win.
why tpu keeps posting this clickbait bullshit targeted at stirring up the uninformed is beyond me
Money was an issue; it seems they focused on hardware, where they were competitive and at times superior, and that's what kept AMD afloat.
As for consoles, nVidia and Intel both failed to capitalise on them; AMD, on the other hand, succeeded. The way I see it, games being optimised for consoles is a by-product of AMD's strategy. (Though another by-product was the underpowered Jaguar cores.)
But I don't see AMD's SRPs going lower than the assumptions below (if street prices are lower, then by the same logic Intel's street prices can also be lower relative to the potential SRPs I quoted before). If AMD's SRPs go lower than the examples below, Intel will potentially have a problem (depending on the competition's actual street prices), being forced to sell too low given their die sizes:
RX 6750XT $399 SRP
RX 6700XT $359 SRP (if it exists in Q4)
RX 6700 $329 SRP (if it comes to DIY channel)
RX 6650XT $299 SRP
RX 6600XT $279 SRP (if it exists in Q4)
RX 6600 $229 SRP
Regarding consoles, Intel's and Nvidia's decisions were based on the fact that they had the vast majority of the PC market, so they didn't really need those extra console sales (with such low margins, I mean). More importantly, their engineers were preoccupied with more important tasks, trying to expand into and research other fields, especially Nvidia's (Intel after a while was also preoccupied with dealing with their process problems and optimizing their designs around them lol).
On the contrary, from the C2D era onward AMD didn't have anything good to compete with in the CPU space except edge cases (in 2013, how well could an 8-core Jaguar design sell if the refreshed 8-core Bulldozer had trouble selling?). In the GPU space, although from time to time they had interesting hardware designs, their software was inferior to Nvidia's, and they never captured more than 1/5th of the annual AIB GPU market anyway. So in order to expand sales and "force" developers to optimize their games/engines for AMD architectures, instead of heavily investing (with what money...) in acquiring talent and closing deals with developers/publishers, they went forward with their console strategy, which in the end was absolutely the right move. Now, given how important backwards compatibility is and how much more successful AMD is these days in the PC market, they may try to raise margins on MS and Sony, but it would be unwise to say the least...
But I beg to differ regarding consoles: nVidia failed with both Microsoft and Sony, plus ATI had more experience. I would rather say that because they failed in this market, they were forced out and concentrated on other things.
For Intel I can agree; the market-share argument applies, as they dominated the CPU market and, if memory serves, they were leaders in the GPU market as well with their iGPUs.
AMD had a lot of interesting designs. While they didn't go for the fastest GPU all the time, except for the 8800 GTX (which was a surprise, as ATI was first with a unified shader design in Xenos yet failed miserably with the 2900 XT) and Pascal (a great architecture from nVidia, very efficient and powerful), they had good offerings.
At least they kick ass at CPUs.
On top of pricing, they were unwilling to customize much because that meant allocating extra resources.
But my reading of the situation wasn't that they f**ked up (well, they kinda did lol); they just went into these business deals half-heartedly, imo.
EDIT: if I remember correctly, Xenos was mainly an ArtX design; R600 was from another team. (David Wang, current SVP of Engineering for RTG, was at ArtX and SGI before; that's why when I saw Infinity Cache, I thought the dream lives on lol)
That leads to this:
www.intel.com/content/www/us/en/developer/articles/technical/performance-methods-and-practices-of-directx-11-multithreaded-rendering.html
The most relevant part:
"By checking the GPU driver support for DirectX 11 multithreaded rendering features (see Figure 7) through the DirectX Caps Viewer, we learn that the NVIDIA GPU driver supports driver command lists, while the AMD GPU driver does not support them. This explains why the driver modules on different GPUs appear in different contexts. When paired with the NVIDIA GPU, working threads can build driver commands in parallel in a deferred context; while when paired with the AMD GPU, the driver commands are all built in serial in the immediate context of the main thread."
From this, most of the tech community came to regard AMD as struggling in DX11 and older titles where threading becomes an issue, and it was observed that AMD GPUs tended to run a bit faster on Intel CPUs and Nvidia GPUs better on AMD CPUs, as AMD CPUs had more threads to work with. Bruh, the A380 is nowhere near fast enough to hit CPU-bound limitations at 1080p. You need 6950 XT or 3090 Ti tier GPUs to start seeing that. And even if it WERE fast enough, the differences seen here are far larger than what has been previously observed, and can't be due to CPU differences alone. :laugh: :roll: :laugh:
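For anyone who wants to verify the capability the quoted Intel article describes on their own machine, D3D11 exposes it through `ID3D11Device::CheckFeatureSupport` with `D3D11_FEATURE_THREADING` (the same data the DirectX Caps Viewer reads). A minimal Windows-only sketch, linked against `d3d11.lib`, with error handling trimmed for brevity:

```cpp
// Query whether the installed GPU driver supports D3D11 driver command
// lists (parallel deferred-context command building) and concurrent
// resource creation. Windows only; link with d3d11.lib.
#include <d3d11.h>
#include <cstdio>

int main() {
    ID3D11Device* device = nullptr;
    // Create a device on the default hardware adapter.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, nullptr);
    if (FAILED(hr)) { std::puts("no D3D11 device"); return 1; }

    D3D11_FEATURE_DATA_THREADING caps = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                &caps, sizeof(caps));
    std::printf("DriverConcurrentCreates: %s\n",
                caps.DriverConcurrentCreates ? "yes" : "no");
    std::printf("DriverCommandLists:      %s\n",
                caps.DriverCommandLists ? "yes" : "no");
    device->Release();
    return 0;
}
```

If `DriverCommandLists` reports "no", the runtime emulates command lists in software and working threads serialize onto the immediate context, which is exactly the behaviour the quote attributes to the AMD driver of the time.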
It's Dunning-Kruger in action. Oh absolutely. ATi was for sale, in no small part, because their drivers were absolute junk, which led to lower sales even when they had superior hardware, because Nvidia back then just worked. ATi, OTOH, was prone to breaking things, requiring you to shift between multiple different drivers depending on the game you were playing.
Their hardware engineers did a lot with very little, and deserve credit, but their software guys were hilariously behind the competition and it would take AMD 15 years to fully unscrew their software.
Also, the A380 is barely keeping up with a GTX 1650, a fucking entry-level card from 3 years ago. Why the fuck are we not discussing that?