Finally! Intel Arc discrete graphics is here, making Team Blue the third serious player in today's highly competitive graphics card market. Technically this is Intel's second graphics architecture in recent years after the
Intel DG1 in 2021, although that never saw a real market release. The Arc A380 in this review isn't shaping up to be a significant release either. Intel is not sampling this card to the press, and it's only available for purchase in China—probably because Intel thinks Chinese buyers aren't as critical of their purchases as those in the West. I also feel that Intel was under pressure from shareholders to finally start shipping its first graphics card to justify the billions spent on research and development. Personally, I think that trying to hide the product from reviewers is the wrong strategy; it just makes things more complicated for everyone involved, including Intel, because they give up any chance to control the messaging and (try to) influence the narrative, while also planting doubts in everyone's minds about what might be wrong with the product.
After reading the initial reviews of the Arc A380, I was highly disappointed, no doubt just like everyone else. One of the most surprising findings was that "it only works on Intel and only works with Resizable BAR". So when the card came in, I kept an open mind, plopped it into my Zen 3 Ryzen test system right away and, surprise, it worked perfectly fine. Well, not
perfectly fine, because there are numerous layers of driver bugs to deal with, but the card performed as expected with Resizable BAR enabled on the AMD platform. No special tweaks, BIOS updates, or settings were necessary: I simply inserted the card, powered up the machine, and the reported BAR size was 6 GB—the exact same mechanics as with AMD and NVIDIA; it just works. To confirm that Resizable BAR was active and working, I turned it off in the BIOS, and 3D performance did indeed take a hit, quite dramatically in some games, but more on that later.
To say that not everything is smooth sailing with Intel Arc at this time would be an understatement. The official "stable" 101.1743 driver from Intel's website is anything but! I encountered numerous bugs including bluescreens, a corrupted desktop after startup, random system hangs, the system getting stuck during the shutdown sequence, and more—very frustrating issues that made me ask myself "WTF? Has anybody even tested this driver once?". A couple of times I had to replace the Intel card with an NVIDIA card just so Windows would boot at all, because the Intel driver could no longer load and kept breaking things; only then could I manually wipe the Intel driver and pray that a fresh driver install would work better. To work around some other issues, I had to connect through Remote Desktop because the Arc card wouldn't send any output to my monitor (most often when no driver was installed). When games did run, they usually ran well enough, and I didn't encounter any visual rendering errors or crashes except in Divinity Original Sin 2 (screen corruption), Witcher 3 (crash on startup), and Red Dead Redemption 2 (crash on startup). The issues with the latter two are especially surprising given how popular they are/were, which is exactly why they are included in my test suite. All other games ran fine, and admittedly that is a huge achievement for such a new product and architecture.
I then tried Intel's newest 101.3220 beta driver and it magically solved all the serious issues. The card now worked and was 100% stable in all games, including Witcher 3, Divinity Original Sin 2, and Red Dead Redemption 2—looks like the driver team is actually fixing bugs, good job! Just to emphasize: as soon as I switched to the beta driver there was not a single crash during all my testing—a huge improvement. That's why the results from the beta driver are the green "default result" bar in all charts in this review. The performance data from the 101.1743 driver is included too, and it shows that the newer beta driver is slightly slower in some games, with a small performance hit across the board, which could be the result of debugging or logging code left enabled in a product that still doesn't seem ready for the market.
Averaged over the 25 games in our test suite at 1080p resolution, the Intel Arc A380 runs slightly faster than the AMD Radeon RX 6400 and matches the NVIDIA GeForce GTX 1650. Compared to the GeForce GTX 1630 that we tested not long ago, the average performance uplift is a mighty 63%; on the flip side, the AMD Radeon RX 6500 XT is 23% faster than the A380. Some highly popular graphics cards from the past, including the RX 570 and GTX 1060, are faster than the Intel card too, by 13% and 24%, respectively. This means that, just like the GTX 1630, RX 6400, and GTX 1650, the Intel Arc A380 is not suitable for a smooth 60 FPS Full HD gaming experience at maximum settings in 2022. That target is within reach for cards such as the RTX 3050, RTX 2060, and RX 6600, which are more than twice (!) as fast as Intel's A380—at a higher price, of course.
Just as I did for other recent reviews of lower-end graphics cards, I performed an additional run of my whole test suite at 1080p with the lowest possible settings in all games to get a feel for the minimum baseline that you can achieve. Here the Arc A380 doesn't do so badly, reaching 60 FPS or higher in 17 of our 25 game titles—albeit with serious compromises in image quality. Having an upscaling solution such as DLSS or FSR would come in very handy here! Intel has announced its own XeSS upscaling tech, but it's practically vaporware at this time. Originally, Dolmen was supposed to be the first game with XeSS support, but that never materialized and is now in "will be added at a later time" status. AMD's FSR algorithm works on all modern graphics cards including Intel Arc, so, ironically enough, widespread support for AMD FSR will be great for Intel GPU owners.
On the plus side, the Intel Arc A380 has full support for hardware-accelerated ray tracing. I ran our whole battery of tests and can confirm that ray tracing works flawlessly on the A380 too. I couldn't spot any rendering errors, crashes, or other issues. What's also worth mentioning is that the performance loss is much smaller than on AMD cards, because Intel executes more of the ray tracing operations in dedicated hardware units. Compared to NVIDIA's 2nd-gen Ampere architecture, the performance hit is a little bigger, but still surprisingly close. Still, ray tracing is an irrelevant feature on an entry-level card such as the A380 (and all its direct competitors), because the overall performance is simply too low—no reason to enable RT when you're struggling to reach 60 FPS at lowest details. It's still an interesting data point with promising results that bodes well for the architecture, because Intel will be releasing higher-end graphics cards very soon.
If you've followed our tech news posts in recent weeks, then you'll certainly have read about the Resizable BAR requirement for Intel Arc. Intel's reviewer's guide states "[...] must be tested on a platform with the PCI Express Resizable BAR feature enabled for the best and most representative picture of performance". That's why I performed one more test run with the latest beta driver, but with Resizable BAR disabled. While the average FPS results are quite a bit lower—18% down on average—they don't paint the whole picture. If you hop over to our frametimes section, you can see huge frametime spikes in many games, including whopping one-second freezes in Forza and Guardians of the Galaxy, and other games show noticeable stuttering too. With Resizable BAR enabled, frametime stability is comparable to similar cards from AMD and NVIDIA. It certainly appears that Intel designed the Alchemist architecture with Resizable BAR in mind. Resizable BAR is a capability of the PCI Express standard that lets the CPU address the GPU's entire memory through one large window; without it, transfers have to go through a 256 MB aperture and get broken into multiple serialized chunks, which is slower and can cause render stalls while the GPU waits for all the data to come in. Obviously this can be mitigated with hardware and driver work, otherwise older graphics cards from NVIDIA and AMD would show similar issues. Right now there are many systems without Resizable BAR, so Intel's limitation will be a dealbreaker for many. But considering that this year's Arc releases are just first steps on the long road to graphics card leadership, I can see why Intel decided to focus on forward-looking technology and spend their resources on improving other features. The good news is that Resizable BAR on Intel Arc works great on AMD Ryzen—this review is proof of it.
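If you want to double-check whether Resizable BAR is actually active on your own system, the BAR apertures can be read out directly; on Windows, GPU-Z will also report whether Resizable BAR is enabled. Below is a minimal sketch for a Linux box (the PCI address is a made-up placeholder, not a value from this review): with ReBAR active, one memory region should span the card's full VRAM, otherwise the largest aperture stays at 256 MB.

```python
# Minimal sketch (Linux only, hypothetical PCI address): read the memory
# apertures (BARs) a GPU exposes to check whether Resizable BAR is active.
# With ReBAR, one region should cover the full VRAM (6 GB on the A380);
# without it, the largest aperture stays at 256 MB.
from pathlib import Path

GPU_PCI_ADDRESS = "0000:03:00.0"  # placeholder -- use the address `lspci` shows for your card

def bar_sizes_bytes(pci_address: str) -> list[int]:
    """Return the size of every non-empty PCI resource region of the device."""
    sizes = []
    resource_file = Path(f"/sys/bus/pci/devices/{pci_address}/resource")
    for line in resource_file.read_text().splitlines():
        # each line holds three hex fields: start address, end address, flags
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:  # skip unused (all-zero) resource slots
            sizes.append(end - start + 1)
    return sizes

if __name__ == "__main__":
    for size in bar_sizes_bytes(GPU_PCI_ADDRESS):
        print(f"{size / 1024**2:,.0f} MB")
```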
Gunnir—a company unknown to most gamers in the West—was picked by Intel to be the exclusive launch partner for the A380 in China, which happens to be the first and only country (so far) where the card is available for purchase. While tastes differ, I feel Gunnir came up with a nice, calm visual design language that is pleasing to the eye. It definitely didn't go overboard with bling, and the "Intel Blue" colors are surprisingly subtle. Thermals of the dual-slot, dual-fan design are excellent—under full load the card reaches only 59 °C, which is laughably good for a modern graphics card. Noise levels, on the other hand, are quite high at 38 dBA, especially considering the entry-level positioning of the card. Allowing higher, but still "good", temperatures could have resulted in an ultra-quiet card. What's great is that the card turns off its fans during idle desktop work and internet browsing, just like recent releases from other GPU vendors. The fan control algorithm could do with some refinement though. For example, at idle I noticed the fans kept spinning up and down every few seconds: the GPU temperature had to reach 50 °C for the fans to start spinning, then it would inevitably drop below 50 °C, causing the fans to stop, temperatures would climb again, and the cycle would repeat—another case of "and nobody noticed this?", just like the missing letter in "unknown" on the front of the packaging. That said, I really like that a metal backplate is included, and the PCB design of the card is pretty decent too—certainly better than what AMD and NVIDIA are giving us at this price point. In particular, the GPU VRM controller is of surprisingly high quality. What's not so optimal is that the VRMs aren't cooled by the heatsink, and there's almost no airflow reaching them. Given the low-power nature of this design it's not a big issue, but for the high-end cards such things need to be improved.
Intel contracted TSMC to fabricate its new GPU, because TSMC is the only company in the world with a currently working 6-nanometer production process for desktop-class processors. Such a small process promises great energy efficiency, and the A380 is competitive in terms of performance per watt, landing roughly in the middle of our test group. It's better than AMD Navi 1x and NVIDIA Turing, comes in slightly behind Ampere, and AMD Navi 2x is up to 36% more efficient. This hints that Intel's chip design on a brand-new 6 nm process can barely compete with designs built on the older process nodes used by AMD and NVIDIA, but you have to consider that AMD and NVIDIA have spent over a decade and billions of dollars optimizing their architectures for energy efficiency, given that this is now the only way to keep scaling performance.
Our power consumption testing reveals that the power draw numbers reported by the driver (in Arc Control) and by third-party monitoring software such as GPU-Z and HWiNFO seem to be "GPU chip only" power as opposed to whole-card power. For example, when the software reports 54 W, the actual measured card-only power draw is closer to 80 W. Unlike NVIDIA, which puts physical monitoring circuitry on its boards to measure real power draw, Intel seems to follow the AMD approach of reporting chip-only power consumption. In Furmark we measured power draws of up to 94 W, with transient peaks reaching 101 W! Good thing then that there's an additional PCIe power connector, as the 75 W of slot-only power wouldn't be enough for the card. There's no need for an 8-pin PCIe connector though; a 6-pin would have sufficed.
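To put that difference into numbers, here's a tiny back-of-the-envelope sketch. The scaling factor is derived from the single 54 W vs. roughly 80 W observation above; it is my own assumption for illustration, not a conversion published by Intel, and it will vary with load.

```python
# Rough sketch: scale a software-reported "GPU chip only" reading up to an
# estimated whole-card figure, using the one data point from this review
# (54 W reported vs. ~80 W measured at the card). The ratio is an assumption
# derived from that single observation, not an official conversion factor.
REPORTED_CHIP_W = 54.0    # what Arc Control / GPU-Z / HWiNFO showed
MEASURED_CARD_W = 80.0    # what our physical card-only measurement showed

SCALE = MEASURED_CARD_W / REPORTED_CHIP_W  # ~1.48x for memory, VRM losses, fans, etc.

def estimate_card_power(reported_chip_watts: float) -> float:
    """Very rough whole-card estimate from a chip-only sensor reading."""
    return reported_chip_watts * SCALE

print(f"Reported 54 W -> ~{estimate_card_power(54.0):.0f} W at the card")
print(f"Reported 65 W -> ~{estimate_card_power(65.0):.0f} W at the card")
```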
While overclocking worked, I have to reiterate that I'm shocked by the current state of the "Arc Control" software control panel. There are almost as many clearly obvious bugs as there are working controls to click on; the software feels like an undergrad class project. No doubt Intel will improve it a lot, but right now even the most basic refinement is missing. What I do have to commend Intel for is that no registration is required, unlike NVIDIA with its GeForce Experience. With manual GPU overclocking, we were able to increase the GPU frequency by 10% (actual measured in-application frequency), which resulted in a 5% performance improvement. Not great, not terrible. Contrary to some early reports, overclocking doesn't magically make the card a lot faster. It behaves exactly like any other graphics card you are familiar with: performance is gained until the card becomes unstable at some point and crashes—no surprises here.
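As a quick sanity check on those overclocking numbers, the figures above work out as shown below. The memory-bandwidth explanation in the comment is my speculation, not something we isolated in testing.

```python
# Tiny worked example for the overclocking result above: a 10% higher measured
# GPU clock yielded about 5% more performance, i.e. roughly 50% scaling
# efficiency. That suggests the card is partly limited by something other than
# GPU clock (memory bandwidth is the usual suspect -- an assumption, not measured).
clock_gain = 0.10   # +10% actual in-application frequency
perf_gain = 0.05    # +5% average FPS

scaling_efficiency = perf_gain / clock_gain
print(f"Scaling efficiency: {scaling_efficiency:.0%}")  # -> 50%
```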
Intel's official MSRP for the Arc A380 is 1030 Yuan, or USD 150. The Gunnir A380 currently sells for the equivalent of $190, which is considerably higher than that number. For our price/performance comparisons, I plotted both of these values in addition to various other theoretical prices, so we can get a better feel for what a realistic market price for the A380 would look like. At $190, there's absolutely no way this card makes any sense, since you can get better performance—and a stable experience—from a $120 Radeon RX 570, or spend $210 on an RX 5600 XT, which is twice as fast. At Intel's official $150 price point, things get more interesting: the price/performance ratio is now better than that of the AMD RX 6400 and the NVIDIA GTX 1650, but still not enough to catch the RX 6500 XT, the GTX 1650 Super, or the RX 580. While I'm sure that $100 is an unrealistic price, this is really the point at which the Arc A380 would become the best-value graphics card on the market. It is only here that I could imagine recommending the card to a tech-savvy audience willing to work through the current technical issues. Then again, for $20 more you could simply grab the $120 RX 570, which is 13% faster and comes with mature software that won't be frustrating in daily use.
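For those who want to reproduce the value comparison, here's a small sketch using only the relative performance figures and prices quoted above, with the A380 normalized to 1.0; any card or price not mentioned in this conclusion is deliberately left out.

```python
# Relative performance per dollar, using only figures quoted in this conclusion
# (A380 = 1.0 baseline). Prices are the street/MSRP values discussed above.
cards = {
    "Arc A380 @ $190 (current street)": (1.00, 190),
    "Arc A380 @ $150 (Intel MSRP)":     (1.00, 150),
    "Arc A380 @ $100 (hypothetical)":   (1.00, 100),
    "Radeon RX 570 @ $120":             (1.13, 120),  # 13% faster than the A380
    "Radeon RX 5600 XT @ $210":         (2.00, 210),  # roughly twice as fast
}

for name, (relative_perf, price_usd) in cards.items():
    value = relative_perf / price_usd * 100  # relative performance per $100 spent
    print(f"{name:34s} {value:.2f} perf per $100")
```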
Does this mean Intel has failed with its GPU project, should scrap all future development, and replace everyone? On the contrary, I think the Arc A380 is a very significant achievement for a young team of promising minds and talented industry people, many of whom I've known for years, as they got together to embark on a fantastic odyssey. As mentioned before, Intel is releasing higher-end versions of Arc Alchemist soon, with the A770 set to have 4x the GPU cores of the A380. If the performance scaling is even 2.5-3.0x, it would be a viable option for the midrange market and could affect AMD's and NVIDIA's sales significantly—especially once Intel's software matures, which seems to be only a question of time given the near-infinite resources dedicated here. Also, let's not forget that Intel is on a one-year cadence with its GPU releases, so next year we'll see the next generation, titled Arc Battlemage.