
Intel Outs Entry-level Arc A310 Desktop Graphics Card with 96 EUs

Joined
Jun 2, 2022
Messages
349 (0.39/day)
System Name HP EliteBook 725 G3
Processor AMD PRO A10-8700B (1.8 GHz CMT dual module with 3.2 GHz boost)
Motherboard HP proprietary
Cooling pretty good
Memory 8 GB SK Hynix DDR3 SODIMM
Video Card(s) Radeon R6 (Carrizo/GCNv3)
Storage internal Kioxia XG6 1 TB NVMe SSD (aftermarket)
Display(s) HP P22h G4 21.5" 1080p (& 768p internal LCD)
Case HP proprietary metal case
Audio Device(s) built-in Conexant CX20724 HDA chipset -> Roland RH-200S
Power Supply HP-branded AC adapter
Mouse Steelseries Rival 310
Keyboard Cherry G84-5200
Software Alma Linux 9.1
Benchmark Scores Broadcom BCM94356 11ac M.2 WiFi card (aftermarket)
I look forward to seeing how much these cards will cost here in Australia. They could be nice replacements for my two aging HD 7870 GHz Edition cards from about a decade ago, which are still in perfect working order for web browsing today.
Yep, people may be laughing now, but this card may be just the right card for these difficult times: cheap, low power, and with hardware AV1 decoding and encoding. One day this may make for a fine replacement for my current WX 2100.
 
Joined
May 11, 2018
Messages
1,253 (0.52/day)
You say jokes aside, but honestly the line you put after it is even more of a joke: Intel made a product that is economically useless at launch. There is no margin, and if there is one, this GPU is far too expensive to make sense :D It's literally e-waste because of the cost of production versus the end performance.

I don't think ARC was meant to come out into a market like this, right after the cryptomining crash. For almost two years there was a real shortage of any cards, even the low-end ones unsuitable for mining, and even ultra-low-end old cards were being sold for ridiculous amounts.

And I don't think ARC was meant to be this slow.
 
Joined
Sep 17, 2014
Messages
22,431 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Seems like the intention is a $99 SRP.
This + a 13100F should be faster than a 6-core desktop Rembrandt in gaming (unless AMD changes the CU count from the 6 that the mobile Ryzen 5 has to 8 in desktop form).
There is no way around it, but still, it takes balls of steel to release a product line-up like this and position your obviously more power-hungry GPUs that low. They're fighting the bottom half of yesteryear with a far worse product, and they're not sugar-coating it. Wow.

I don't think ARC was meant to come out into a market like this, right after the cryptomining crash. For almost two years there was a real shortage of any cards, even the low-end ones unsuitable for mining, and even ultra-low-end old cards were being sold for ridiculous amounts.

And I don't think ARC was meant to be this slow.
I think it was oversold, and that's why it launched as it did. If there were still a shortage, ARC would still be slow; sure, you could mine with it, but Intel's entry into discrete gaming GPUs would have been just as crappy. They literally discovered 'late in the process of development' that their hardware couldn't run DX11 properly, go figure. Talk about wrong focus, or simply glossing over important aspects.
 
Joined
May 11, 2018
Messages
1,253 (0.52/day)
I think it was oversold, and that's why it launched as it did. If there were still a shortage, ARC would still be slow; sure, you could mine with it, but Intel's entry into discrete gaming GPUs would have been just as crappy. They literally discovered 'late in the process of development' that their hardware couldn't run DX11 properly, go figure. Talk about wrong focus, or simply glossing over important aspects.

I still think we might see reviews that focus purely on ARC's positive sides and downplay the negatives - and even then, it's just about even with Nvidia and AMD. So I think they'll try to play the "contribute, even if it doesn't make much sense right now - we need a third player for the future" card. That, and brand recognition - heavy discounts on Intel CPU + GPU combos.

And of course Intel can push their cards onto OEM builds, even where it would make much more sense to just use integrated graphics. So even if all the upper-end cards are complete rubbish, and they don't fix their drivers and software, we might see quite a bit of market share in cards sold.
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Yep, people may be laughing now, but this card may be just the right card for these difficult times: cheap, low power, and with hardware AV1 decoding and encoding. One day this may make for a fine replacement for my current WX 2100.
That sounds ... like a poor plan. This will likely perform somewhere between a GT 1030 and a GTX 1050. The WX 2100 is smack-dab in the middle of those two. (GTX 1050 is 198% the performance of a 1030 and 153% of a WX 2100). Why would you pay money for a side-grade like that? Is AV1 encoding/decoding worth that much to you?
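To put those percentages in one place, here is a quick back-of-the-envelope sketch - relative performance only, taking the quoted figures at face value, not a benchmark:

```python
# Back-of-the-envelope relative performance from the percentages quoted above:
# GTX 1050 = 198% of a GT 1030 and 153% of a WX 2100. Rough ratios, not results.
gt1030 = 1.00                  # baseline
gtx1050 = 1.98 * gt1030        # 198% of a GT 1030
wx2100 = gtx1050 / 1.53        # GTX 1050 is 153% of a WX 2100

print(f"GT 1030 : {gt1030:.2f}x")
print(f"WX 2100 : {wx2100:.2f}x")   # ~1.29x a GT 1030
print(f"GTX 1050: {gtx1050:.2f}x")
```

So a card landing "somewhere between a GT 1030 and a GTX 1050" could plausibly end up anywhere from slightly below to noticeably above the WX 2100.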
 
Joined
Jan 31, 2012
Messages
2,639 (0.56/day)
Location
East Europe
System Name PLAHI
Processor I5-10400
Motherboard MSI MPG Z490 GAMING PLUS
Cooling 120 AIO
Memory 32GB Corsair LPX 2400 Mhz DDR4 CL14
Video Card(s) PNY QUADRO RTX A2000
Storage Intel 670P 512GB
Display(s) Philips 288E2A 28" 4K + 22" LG 1080p
Case Silverstone Raven 03 (RV03)
Audio Device(s) Creative Soundblaster Z
Power Supply Fractal Design IntegraM 650W
Mouse Logitech Triathlon
Keyboard REDRAGON MITRA
Software Windows 11 Home x 64
Too many ifs with this card at the moment for me. IMO, if:

- it has min. three digital outputs: 2xDP and 1xHDMI
- if it supports at least:

[attached image: codec support table]

- if the price is right

Then maybe it could be a good option for people who don't game but wanna watch an HEVC movie without their CPU drowning in unicorn blood.
Currently, the lowest you can go for those HW decoding capabilities from Nvidia is the RTX 3050, which ain't cheap for me.
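For anyone wanting to check what their current setup can already offload before spending money, here is a minimal sketch that lists the hardware acceleration methods and the AV1/HEVC decoders the local ffmpeg build exposes. It assumes ffmpeg is installed and on the PATH; which hwaccels actually work still depends on the GPU and its driver.

```python
# Minimal sketch: list the hardware acceleration methods and AV1/HEVC decoders
# that the local ffmpeg build exposes. Assumes ffmpeg is installed and on PATH;
# which hwaccels actually work depends on the GPU and its driver.
import subprocess

def ffmpeg_lines(*args):
    out = subprocess.run(["ffmpeg", "-hide_banner", *args],
                         capture_output=True, text=True)
    return (out.stdout + out.stderr).splitlines()

print("Available hwaccel methods:")
for line in ffmpeg_lines("-hwaccels"):
    print(" ", line.strip())

print("\nAV1/HEVC decoders ffmpeg knows about:")
for line in ffmpeg_lines("-decoders"):
    if "av1" in line or "hevc" in line:
        print(" ", line.strip())
```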
 
Joined
Oct 27, 2020
Messages
791 (0.53/day)
That sounds ... like a poor plan. This will likely perform somewhere between a GT 1030 and a GTX 1050. The WX 2100 is smack-dab in the middle of those two. (GTX 1050 is 198% the performance of a 1030 and 153% of a WX 2100). Why would you pay money for a side-grade like that? Is AV1 encoding/decoding worth that much to you?
With Resizable BAR on, it will be faster than a GTX 1630 in TPU's 1080p test bed, IMO.
It's not comparable with the likes of the GT 1030.


There is no way around it, but still, it takes balls of steel to release a product line-up like this and position your obviously more power-hungry GPUs that low. They're fighting the bottom half of yesteryear with a far worse product, and they're not sugar-coating it. Wow.
Like I said in the past, I can picture AMD's marketing team wishing Navi 33's launch were sooner, in order to compare performance, efficiency and die size differences between these 6 nm designs (A770 vs Navi 33).
If I remember correctly, the design for Battlemage is already finished; logically it would be on N5 and will probably have a similar problem regarding efficiency compared with AMD's N5+N6-based Navi 32.
As long as they are willing to keep the low margins, it will be fine!
I would expect a redesign with Celestial, but how successful it will be is anybody's guess.
 
Joined
Jun 2, 2022
Messages
349 (0.39/day)
System Name HP EliteBook 725 G3
Processor AMD PRO A10-8700B (1.8 GHz CMT dual module with 3.2 GHz boost)
Motherboard HP proprietary
Cooling pretty good
Memory 8 GB SK Hynix DDR3 SODIMM
Video Card(s) Radeon R6 (Carrizo/GCNv3)
Storage internal Kioxia XG6 1 TB NVMe SSD (aftermarket)
Display(s) HP P22h G4 21.5" 1080p (& 768p internal LCD)
Case HP proprietary metal case
Audio Device(s) built-in Conexant CX20724 HDA chipset -> Roland RH-200S
Power Supply HP-branded AC adapter
Mouse Steelseries Rival 310
Keyboard Cherry G84-5200
Software Alma Linux 9.1
Benchmark Scores Broadcom BCM94356 11ac M.2 WiFi card (aftermarket)
That sounds ... like a poor plan. This will likely perform somewhere between a GT 1030 and a GTX 1050. The WX 2100 is smack-dab in the middle of those two. (GTX 1050 is 198% the performance of a 1030 and 153% of a WX 2100). Why would you pay money for a side-grade like that? Is AV1 encoding/decoding worth that much to you?
Why is it a bad plan? I rarely game anymore, so that has about zero priority for me. I am familiar with the performance of the GTX 1050 as I actually used to have one (sold it with my ThinkCentre M91p MT; it was my first dGPU). My WX 2100 is about 2 years old now (well, I bought it "open box" on eBay, so there is no way to know for sure if it was actually new in 2020), so replacing it in 2 years would be reasonable. If I can still sell it for at least $30 or so by then and get the A310 for less than $100, I think it would be a decent deal. Performance will improve over time, especially on Linux (performance improvements were made recently even for the driver of the Northern Islands iGPUs in my laptops).

I do think that having AV1 encoding/decoding is very valuable - maybe not $70 worth - but the A310 will undoubtedly have some other advantages too (I think it will perform better for compute than the WX 2100, at least). AV1 is becoming the new standard and will almost certainly stick around for a long, long time. It does not have the patent mess of H.264 and H.265, and it saves a ton of space. I could let the A310 re-encode lots of H.264 videos where ultimate quality is not important and save a ton of space (and therefore $).

Now, I could get the A380 instead, but I don't like that many of those cards need a power connector (and I don't want the factory OC anyway), and it is more expensive. Nvidia is not an option for me on Linux and is too expensive, and the RX 6400 would be considerably more expensive than the A310 too, and it completely lacks any kind of encode.
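As a rough illustration of that re-encode idea, something like the sketch below could batch H.264 files through ffmpeg's Quick Sync AV1 encoder (av1_qsv). Encoder availability, the quality value and the folder paths are all assumptions here - they depend on the ffmpeg build and the driver, and would need tuning.

```python
# Hypothetical batch re-encode sketch: H.264 sources -> AV1 via Intel Quick Sync
# (ffmpeg's av1_qsv encoder). Encoder availability, the quality setting and the
# folder layout are assumptions; audio is copied untouched.
import subprocess
from pathlib import Path

SRC = Path("~/Videos/h264").expanduser()   # hypothetical input folder
DST = Path("~/Videos/av1").expanduser()    # hypothetical output folder
DST.mkdir(parents=True, exist_ok=True)

for src in sorted(SRC.glob("*.mp4")):
    dst = DST / (src.stem + ".mkv")
    if dst.exists():                       # skip files that are already done
        continue
    cmd = [
        "ffmpeg", "-hide_banner",
        "-i", str(src),
        "-c:v", "av1_qsv",                 # hardware AV1 encode (Quick Sync / Arc)
        "-global_quality", "30",           # rough quality target, not tuned
        "-c:a", "copy",                    # keep the original audio as-is
        str(dst),
    ]
    print("encoding", src.name)
    subprocess.run(cmd, check=True)
```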

I just hope that they will at least keep selling the two low-end cards (A310 and A380) for a long time (like the GT 1030), so us cash-strapped consumers can get them new at a good price, regardless of the broader success of the Arc series.
 
Joined
Apr 24, 2020
Messages
2,709 (1.62/day)
At the right price, the A310 is good. It is a real dedicated graphics card at very low power consumption (slot power only). It's single-slot and probably going to be low-profile for SFF builds. There's a use case for that.

But it has to come in at the right price point. We're talking like $75 or so, or maybe even less than that. Anything is better than an iGPU, because an iGPU shares memory bandwidth with the CPU. So simply getting a card with dedicated GDDR6 RAM will help out significantly at 1080p gaming.
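The bandwidth gap being described is easy to put rough numbers on. Here is a quick sketch, assuming the commonly listed A310 configuration (64-bit bus, 15.5 Gbps GDDR6) against dual-channel DDR4-3200 that an iGPU has to share with the CPU:

```python
# Rough bandwidth comparison: dedicated GDDR6 on a low-end card vs. system RAM
# shared with the CPU. The A310 figures (64-bit bus, 15.5 Gbps) are the commonly
# listed specs, used here as an assumption.
def gddr6_bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s for a GDDR6 configuration."""
    return bus_width_bits / 8 * data_rate_gbps

def ddr4_dual_channel_gbps(mt_per_s):
    """Peak bandwidth in GB/s for dual-channel (2x 64-bit) DDR4."""
    return 2 * 64 / 8 * mt_per_s / 1000

a310 = gddr6_bandwidth_gbps(64, 15.5)   # ~124 GB/s, all for the GPU
ddr4 = ddr4_dual_channel_gbps(3200)     # ~51 GB/s, shared with the CPU

print(f"A310 GDDR6 (assumed specs): {a310:.0f} GB/s dedicated")
print(f"Dual-channel DDR4-3200    : {ddr4:.1f} GB/s shared with the CPU")
```

Even with the caveats on the assumed specs, the dedicated pool is roughly 2-2.5x the shared one, and the iGPU never gets the shared pool to itself.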
 
Joined
May 2, 2017
Messages
7,762 (2.81/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Why is it a bad plan? I rarely game anymore, so that has about zero priority for me. I am familiar with the performance of the GTX 1050 as I actually used to have one (sold it with my ThinkCentre M91p MT; it was my first dGPU). My WX 2100 is about 2 years old now (well, I bought it "open box" on eBay, so there is no way to know for sure if it was actually new in 2020), so replacing it in 2 years would be reasonable. If I can still sell it for at least $30 or so by then and get the A310 for less than $100, I think it would be a decent deal. Performance will improve over time, especially on Linux (performance improvements were made recently even for the driver of the Northern Islands iGPUs in my laptops).

I do think that having AV1 encoding/decoding is very valuable - maybe not $70 worth - but the A310 will undoubtedly have some other advantages too (I think it will perform better for compute than the WX 2100, at least). AV1 is becoming the new standard and will almost certainly stick around for a long, long time. It does not have the patent mess of H.264 and H.265, and it saves a ton of space. I could let the A310 re-encode lots of H.264 videos where ultimate quality is not important and save a ton of space (and therefore $).

Now, I could get the A380 instead, but I don't like that many of those cards need a power connector (and I don't want the factory OC anyway), and it is more expensive. Nvidia is not an option for me on Linux and is too expensive, and the RX 6400 would be considerably more expensive than the A310 too, and it completely lacks any kind of encode.

I just hope that they will at least keep selling the two low-end cards (A310 and A380) for a long time (like the GT 1030), so us cash-strapped consumers can get them new at a good price, regardless of the broader success of the Arc series.
This is precisely why I said it seems like a poor plan - a performance sidegrade with encoding being the only real gain, which you yourself say isn't worth $70, plus theoretical future performance gains? That doesn't add up to something that justifies a purchase to me, when you have something that seems to be working perfectly fine for what you need. It should indeed be better for compute, assuming Intel can get their drivers even moderately usable - it has ~2x the FP32 resources, after all. But knowing the state of Intel's Windows drivers, I wouldn't trust them to deliver decent compute support even there, let alone in Linux.
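That ~2x figure is roughly what a naive theoretical FP32 estimate gives. A quick sketch, assuming 768 FP32 lanes (96 EUs x 8) at around 2.0 GHz for the A310 and the WX 2100's 512 stream processors at around 1.22 GHz - both clock figures are assumptions taken from commonly listed specs:

```python
# Rough theoretical FP32 throughput estimate (shader lanes x 2 ops/clock x clock).
# Shader counts and clocks are assumptions from commonly listed specs, so treat
# the ratio as a ballpark, not a benchmark.
def fp32_tflops(lanes, ghz):
    return lanes * 2 * ghz / 1000

a310   = fp32_tflops(768, 2.0)    # 96 EUs x 8 lanes, ~2.0 GHz assumed
wx2100 = fp32_tflops(512, 1.22)   # 512 stream processors, ~1.22 GHz peak

print(f"A310   : ~{a310:.1f} TFLOPS")
print(f"WX 2100: ~{wx2100:.1f} TFLOPS")
print(f"ratio  : ~{a310 / wx2100:.1f}x")   # lands in the ~2-2.5x ballpark
```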
 
Joined
Jun 2, 2022
Messages
349 (0.39/day)
System Name HP EliteBook 725 G3
Processor AMD PRO A10-8700B (1.8 GHz CMT dual module with 3.2 GHz boost)
Motherboard HP proprietary
Cooling pretty good
Memory 8 GB SK Hynix DDR3 SODIMM
Video Card(s) Radeon R6 (Carrizo/GCNv3)
Storage internal Kioxia XG6 1 TB NVMe SSD (aftermarket)
Display(s) HP P22h G4 21.5" 1080p (& 768p internal LCD)
Case HP proprietary metal case
Audio Device(s) built-in Conexant CX20724 HDA chipset -> Roland RH-200S
Power Supply HP-branded AC adapter
Mouse Steelseries Rival 310
Keyboard Cherry G84-5200
Software Alma Linux 9.1
Benchmark Scores Broadcom BCM94356 11ac M.2 WiFi card (aftermarket)
This is precisely why I said it seems like a poor plan - a performance sidegrade with encoding being the only real gain, which you yourself say isn't worth $70, plus theoretical future performance gains? That doesn't add up to something that justifies a purchase to me, when you have something that seems to be working perfectly fine for what you need. It should indeed be better for compute, assuming Intel can get their drivers even moderately usable - it has ~2x the FP32 resources, after all. But knowing the state of Intel's Windows drivers, I wouldn't trust them to deliver decent compute support even there, let alone in Linux.
Well, I guess as long as the WX 2100 is working, it won't be a huge priority, but if it dies, then it will be a good replacement IMO. I think it is more attractive than an RX 6400 for me, though. Intel's compute support on Linux will almost certainly be better than AMD's. AMD has never fulfilled their promises (dating back to Richland - I actually have a Richland laptop, if you check my sig) of OpenCL acceleration basically making up for a Bulldozer-derivative module only having one FPU... But first I have to replace my 8-year-old monitor and HDD anyway, and I want a new case because I am very dissatisfied with my current one.
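For what it's worth, checking what a driver actually exposes for OpenCL compute on Linux is quick to do. A minimal sketch using pyopencl, assuming the pyopencl package and at least one OpenCL ICD are installed:

```python
# Minimal sketch: list the OpenCL platforms and devices the installed drivers
# expose. Requires the pyopencl package and at least one OpenCL ICD installed.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    for device in platform.get_devices():
        print(f"  Device : {device.name}")
        print(f"    Compute units: {device.max_compute_units}")
        print(f"    Global memory: {device.global_mem_size / 1024**3:.1f} GiB")
```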
 