Wednesday, September 28th 2022

Intel Outs Entry-level Arc A310 Desktop Graphics Card with 96 EUs

Intel expanded its Arc "Alchemist" desktop graphics card series with the entry-level Arc A310. Its specifications enable Intel's AIB partners to build low-profile graphics cards, possibly even single-slot ones, or conventionally sized cards with fanless cooling. The A310 is positioned as a slight upgrade over integrated graphics, and as an alternative to cards such as the AMD Radeon RX 6400. Its target user wants to build a 4K or 8K HTPC, or is a workstation/HEDT user with a processor that lacks integrated graphics and wants to drive a couple of high-resolution monitors. There is no reference board design, but we expect it to be similar in dimensions to the Arc Pro A40 (pictured below), except with full-size DP and HDMI connectors in place of the mDP ones, and a full-height bracket out of the box.

The A310 is carved out of the 6 nm "ACM-G11" silicon by enabling 6 out of 8 Xe Cores (that's 96 out of 128 EUs, or 768 out of 1,024 unified shaders). You also get 96 XMX units that accelerate AI, and 6 ray tracing units. The GPU runs at 2.00 GHz, compared to 2.10 GHz on the A380. The memory sub-system has been narrowed by a third: you get 4 GB of 15.5 Gbps GDDR6 memory across a 64-bit wide memory interface, whereas the A380 has 6 GB across a 96-bit bus. The card features a PCI-Express 4.0 x8 host interface, and with typical board power expected to be well under the 75 W mark, most custom cards could do without power connectors.
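For readers who want to double-check the cut-down arithmetic, here is a quick sketch. The per-Xe-Core EU count and per-EU shader count are derived from the totals above (128 EUs / 8 Xe Cores, 1,024 shaders / 128 EUs); the bandwidth figures follow from the stated memory speed and bus widths.

```python
# Sanity-check of the Arc A310 cut-down arithmetic quoted in the article.
XE_CORES_A310 = 6
EUS_PER_XE_CORE = 16      # 128 EUs / 8 Xe Cores on the full ACM-G11 die
SHADERS_PER_EU = 8        # 1,024 shaders / 128 EUs

eus = XE_CORES_A310 * EUS_PER_XE_CORE   # 96 EUs
shaders = eus * SHADERS_PER_EU          # 768 unified shaders

# Memory bandwidth in GB/s: per-pin speed (Gbps) * bus width (bits) / 8.
bw_a310 = 15.5 * 64 / 8                 # 124.0 GB/s on the A310
bw_a380 = 15.5 * 96 / 8                 # 186.0 GB/s on the A380

print(eus, shaders)                     # 96 768
print(bw_a310, bw_a380)                 # 124.0 186.0
```

The one-third narrowing the article mentions shows up directly in the bus width (64-bit vs 96-bit) and thus in the bandwidth ratio.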

35 Comments on Intel Outs Entry-level Arc A310 Desktop Graphics Card with 96 EUs

#26
Bwaze
Vayra86You say jokes aside, but honestly the line you put behind it is even more of a joke: Intel made a product that is economically, completely useless at launch. There is no margin, and if there is, this GPU is far too expensive to make sense :D It's literally e-waste because of the cost of production versus the end performance.
I don't think ARC was meant to come out into a market hit by the cryptomining crash. For almost two years there was a real shortage of cards of any kind, even the low-end ones unsuitable for mining, and even ultra-low-end old cards were being sold for ridiculous amounts.

And I don't think ARC was meant to be this slow.
Posted on Reply
#27
Vayra86
ModEl4Seems like $99 SRP intention.
This + a 13100F should be faster than a 6-core desktop Rembrandt in gaming (unless AMD changes the CU count from the 6 that mobile Ryzen 5 has to 8 in desktop form)
There is no way around it, but still, it takes balls of steel to release a product line-up like this and position your obviously more power-hungry GPUs that low. They're fighting the bottom half of yesteryear with a far worse product, and they're not sugar-coating it. Wow.
BwazeI don't think ARC was meant to come out in such a market with cryptomining crash. For almost two years there was really a shortage of any cards, even the low end ones unsuitable for mining, and even ultra low end old cards were being sold for ridiculous amounts.

And I don't think ARC was meant to be this slow.
I think it was oversold and that's why it launched as it did. If there was a shortage, ARC would still be slow, and sure, you could mine with it, but Intel's entry into discrete gaming GPUs would have been just as crappy. They literally discovered 'late in the process of development' that their hardware couldn't run DX11 properly, go figure. Speaking of wrong focus, or just simply glossing over important aspects.
Posted on Reply
#28
Bwaze
Vayra86I think it was oversold and that's why it launched as it did. If there was a shortage, ARC would still be slow, and sure, you could mine with it, but Intel's entry into discrete gaming GPUs would have been just as crappy. They literally discovered 'late in the process of development' that their hardware couldn't run DX11 properly, go figure. Speaking of wrong focus, or just simply glossing over important aspects.
I still think we might see reviews that focus purely on ARC's positive sides and downplay the negatives - and even then, it's just about even with Nvidia and AMD. So I think they'll play the "contribute for the future, even if it doesn't make much sense now - we need a third player" card. That, and brand recognition - heavy discounts on Intel CPU + GPU combos.

And of course Intel can push their cards into OEM builds, even where it would make much more sense to just use integrated graphics. So even if all the upper-end cards are complete rubbish, and they don't fix their drivers and software, we might see quite a bit of share in cards sold.
Posted on Reply
#29
Valantar
mplayerMuPDFYep, people may be laughing now but this card may be just the right card for these difficult times. Cheap, low power and hardware AV1 decoding and encoding. One day this may make for a fine replacement of my current WX 2100.
That sounds ... like a poor plan. This will likely perform somewhere between a GT 1030 and a GTX 1050. The WX 2100 is smack-dab in the middle of those two. (GTX 1050 is 198% the performance of a 1030 and 153% of a WX 2100). Why would you pay money for a side-grade like that? Is AV1 encoding/decoding worth that much to you?
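Putting those quoted ratios on a common baseline makes the ordering explicit (the 198% and 153% figures are the ones from the post; everything else is derived from them):

```python
# Normalize everything to the GT 1030 = 1.00, using the quoted ratios:
# GTX 1050 = 198% of a GT 1030 and 153% of a WX 2100.
gt1030 = 1.00
gtx1050 = 1.98
wx2100 = gtx1050 / 1.53    # ~1.29x a GT 1030

# The WX 2100 indeed lands between the two Nvidia cards.
assert gt1030 < wx2100 < gtx1050
print(round(wx2100, 2))    # 1.29
```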
Posted on Reply
#30
PLAfiller
Too many ifs with this card at the moment for me. IMO, if:

- it has a minimum of three digital outputs: 2x DP and 1x HDMI
- it supports at least:


- the price is right

Then maybe it could be a good option for people who don't game, but want to watch an HEVC movie without their CPU drowning in unicorn blood.
Currently the lowest you can go with those HW decoding capabilities from Nvidia is the RTX 3050, which ain't cheap for me.
Posted on Reply
#31
ModEl4
ValantarThat sounds ... like a poor plan. This will likely perform somewhere between a GT 1030 and a GTX 1050. The WX 2100 is smack-dab in the middle of those two. (GTX 1050 is 198% the performance of a 1030 and 153% of a WX 2100). Why would you pay money for a side-grade like that? Is AV1 encoding/decoding worth that much to you?
With Resizable BAR on, it will be faster than a GTX 1630 in TPU's 1080p testbed, IMO.
It's not comparable with the likes of the GT 1030.
Vayra86There is no way around it, but still, it takes balls of steel to release a product line up like this and position your obviously more power hungry GPUs that low. They're fighting the bottom half of yesteryear with far worse product and they're not sugar coating it. Wow.
Like I said in the past, I can picture AMD's marketing team wishing Navi 33's launch was sooner, in order to compare performance, efficiency, and die size between these 6 nm designs (A770 vs Navi 33).
If I remember correctly, the design for Battlemage is already finished; logically it will be on N5, and it will probably have a similar efficiency problem compared with AMD's N5+N6-based Navi 32.
As long as they are willing to keep the low margins, it will be fine!
I would expect a redesign with Celestial, but how successful it will be is anybody's guess.
Posted on Reply
#32
mplayerMuPDF
ValantarThat sounds ... like a poor plan. This will likely perform somewhere between a GT 1030 and a GTX 1050. The WX 2100 is smack-dab in the middle of those two. (GTX 1050 is 198% the performance of a 1030 and 153% of a WX 2100). Why would you pay money for a side-grade like that? Is AV1 encoding/decoding worth that much to you?
Why is it a bad plan? I rarely game anymore so that has about zero percent priority for me. I am familiar with the performance of the GTX 1050 as I used to have one actually (sold it with my ThinkCentre M91p MT; it was my first dGPU) My WX 2100 is about 2 years old now (well, I bought it "open box" on eBay so there is no way to know for sure if it was actually new in 2020). Replacing it in 2 years would be reasonable. If I can still sell it for at least $30 or something then and get the A310 for less than $100 then I think it would be a decent deal. Performance will improve over time, especially on Linux (performance improvements were made recently even for the driver of the Northern Islands iGPUs in my laptops). I do think that having AV1 encoding/decoding is very valuable, maybe not $70 but the A310 will have some other advantages too undoubtedly (I think it will perform better for compute at least than the WX 2100), AV1 is becoming the new standard and it will stick around for a long, long time almost certainly. It does not have the patent mess of H.264 and H.265 and it saves a ton of space. I could let the A310 reencode lots of H.264 videos where ultimate quality is not important and save a ton of space (and therefore $). Now, I could get the A380 instead but I don't like that many cards need a power connector (and I don't want the factory OC anyway) and it is more expensive. Nvidia is not an option for me on Linux and too expensive and the RX 6400 would considerably more expensive than the A310 too and completely lacks any kind of encode.

I just hope that they will at least keep selling the two low-end cards (A310 and A380) for a long time (like the GT 1030), so cash-strapped consumers like us can get them new at a good price, regardless of the broader success of the Arc series.
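For what it's worth, a batch re-encode like that could be driven by ffmpeg's Quick Sync path. A minimal sketch, assuming an ffmpeg build that exposes the `av1_qsv` hardware encoder (available on Arc GPUs); the file name and quality value are placeholders, and the command is only constructed here, not executed:

```python
# Build (but do not run) an ffmpeg command that re-encodes an H.264 file
# to AV1 using the Arc card's hardware encoder via Quick Sync (QSV).
from pathlib import Path

def av1_reencode_cmd(src: Path, quality: int = 30) -> list[str]:
    """Return an ffmpeg command line re-encoding `src` to AV1 in an MKV."""
    dst = src.with_suffix(".av1.mkv")
    return [
        "ffmpeg",
        "-hwaccel", "qsv",              # decode the H.264 input in hardware too
        "-i", str(src),
        "-c:v", "av1_qsv",              # Arc's hardware AV1 encoder
        "-global_quality", str(quality),  # QSV quality knob (lower = better)
        "-c:a", "copy",                 # keep the audio track untouched
        str(dst),
    ]

cmd = av1_reencode_cmd(Path("movie.mp4"))
print(" ".join(cmd))
```

Wrapping this in a loop over a directory of H.264 files would do the "save a ton of space" part; quality settings would need tuning per source.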
Posted on Reply
#33
dragontamer5788
At the right price, the A310 is good. It is a real dedicated graphics card with very low power consumption (PCIe slot power only). It's single-slot and probably going to be low-profile, for SFF builds. There's a use case for that.

But it has to come in at the right price point. We're talking $75 or so, or maybe even less than that. Anything is better than an iGPU, because an iGPU shares memory bandwidth with the CPU, so simply getting a card with dedicated GDDR6 RAM will help out significantly at 1080p gaming.
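Rough numbers behind that bandwidth point. The dual-channel DDR4-3200 baseline is an assumption for a typical iGPU desktop; the A310 figure follows from the 15.5 Gbps GDDR6 on a 64-bit bus stated in the article:

```python
# Peak memory bandwidth, in GB/s.
# iGPU baseline: dual-channel DDR4-3200, 8 bytes per channel per transfer,
# and that bandwidth is shared between the CPU and the iGPU.
ddr4_shared = 3200e6 * 8 * 2 / 1e9      # 51.2 GB/s, shared

# A310: 15.5 Gbps per pin across a 64-bit bus, exclusive to the GPU.
a310_gddr6 = 15.5 * 64 / 8              # 124.0 GB/s, dedicated

print(ddr4_shared, a310_gddr6)          # 51.2 124.0
```

So even this cut-down card has well over twice the peak bandwidth of a typical iGPU setup, before accounting for CPU contention on the shared bus.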
Posted on Reply
#34
Valantar
mplayerMuPDFWhy is it a bad plan? I rarely game anymore so that has about zero percent priority for me. I am familiar with the performance of the GTX 1050 as I used to have one actually (sold it with my ThinkCentre M91p MT; it was my first dGPU) My WX 2100 is about 2 years old now (well, I bought it "open box" on eBay so there is no way to know for sure if it was actually new in 2020). Replacing it in 2 years would be reasonable. If I can still sell it for at least $30 or something then and get the A310 for less than $100 then I think it would be a decent deal. Performance will improve over time, especially on Linux (performance improvements were made recently even for the driver of the Northern Islands iGPUs in my laptops). I do think that having AV1 encoding/decoding is very valuable, maybe not $70 but the A310 will have some other advantages too undoubtedly (I think it will perform better for compute at least than the WX 2100), AV1 is becoming the new standard and it will stick around for a long, long time almost certainly. It does not have the patent mess of H.264 and H.265 and it saves a ton of space. I could let the A310 reencode lots of H.264 videos where ultimate quality is not important and save a ton of space (and therefore $). Now, I could get the A380 instead but I don't like that many cards need a power connector (and I don't want the factory OC anyway) and it is more expensive. Nvidia is not an option for me on Linux and too expensive and the RX 6400 would be considerably more expensive than the A310 too and completely lacks any kind of encode.

I just hope that they will at least keep selling the two low-end cards (A310 and A380) for a long time (like the GT 1030), so us cashstrapped consumers can get them new at a good price, regardless of the broader success of the Arc series.
This is precisely why I said it seems like a poor plan - a performance sidegrade with encoding being the only real gain, which you yourself say isn't worth $70, plus theoretical future performance gains? That doesn't add up to something justifying a purchase to me, when you have something that seems to be working perfectly fine for what you need. It should indeed be better for compute, assuming Intel can get their drivers even moderately usable - it has ~2x the FP32 resources, after all. But knowing the state of Intel's Windows drivers, I wouldn't trust them to deliver decent compute support even there, let alone on Linux.
Posted on Reply
#35
mplayerMuPDF
ValantarThis is precisely why I said it seems like a poor plan - a performance sidegrade with encoding being the only real gain, which you yourself say isn't worth $70, plus theoretical future performance gains? That doesn't add up to something justifying a purchase to me, when you have something that seems to be working perfectly fine for what you need. It should indeed be better for compute, assuming Intel can get their drivers even moderately usable - it has ~2x the FP32 resources, after all. But knowing the state of Intel's Windows drivers, I wouldn't trust them to deliver decent compute support even there, let alone on Linux.
Well, I guess as long as the WX 2100 is working it won't be a huge priority, but if it dies then the A310 would be a good replacement, IMO. I think it is more attractive than an RX 6400 for me, though. Their compute support on Linux will almost certainly be better than AMD's. AMD has never fulfilled their promises (since Richland, and I actually have one Richland laptop if you check my sig) of OpenCL acceleration basically making up for a Bulldozer-derivative module only having one FPU... But first I have to replace my 8-year-old monitor and HDD anyway, and I want a new case because I am very dissatisfied with my current one.
Posted on Reply