Monday, August 8th 2022
![Intel](https://tpucdn.com/images/news/intel-v1721205152158.png)
Intel Unveils Arc Pro Graphics Cards for Workstations and Professional Software
Intel has today unveiled another addition to its discrete Arc Alchemist graphics card lineup, this time aimed at the professional market. Intel has prepared three models for creators and entry-level pro-vis solutions, called Intel Arc Pro graphics cards. All GPUs feature AV1 hardware acceleration, support ray tracing, and are designed to handle AI acceleration inside applications like Adobe Premiere Pro. At the start, we have the small A30M mobile GPU aimed at laptop designs. It offers 3.5 TeraFLOPs of FP32 compute inside a configurable 35-50 Watt TDP envelope, has eight ray tracing cores, and carries 4 GB of GDDR6 memory. Its display output connectors depend on the OEM's laptop design.
Next, we have the Arc A40 Pro, a discrete single-slot GPU. It delivers 3.5 TeraFLOPs of FP32 single-precision performance and comes with eight ray tracing cores and 6 GB of GDDR6 memory. The listed maximum TDP for this model is 50 Watts. It has four Mini DisplayPort connectors for video output and can drive two monitors at 8K 60 Hz, one at 5K 240 Hz, two at 5K 120 Hz, or four at 4K 60 Hz. Its bigger brother, the Arc A50 Pro, is a dual-slot design with 4.8 TeraFLOPs of single-precision FP32 compute, eight ray tracing cores, and 6 GB of GDDR6 memory as well. It has the same video output capability as the Arc A40 Pro, with a beefier cooling setup to handle the 75 Watt TDP. All software developed using the oneAPI toolkit can be accelerated on these GPUs, and Intel is working with the industry to adapt professional software for Arc Pro graphics.
47 Comments on Intel Unveils Arc Pro Graphics Cards for Workstations and Professional Software
*Forgot to mention, I mostly play World of Tanks, King's Bounty, and Divinity: Original Sin 2. If and when driver crashes occur, it's with World of Tanks or any of the King's Bounty series. I haven't experienced any crashes with Divinity: OS2.
we need a 3rd player for consumers ... imho, we will not get one this year ... :p (that's sad, but not unexpected)
although in the "Pro" domain ... there are more than two players; iirc, Matrox is still around (and does not do consumer cards)
The king of pro cards is the Quadro. It's partly the drivers that do it, but it's also that it's simply certified for more applications. There are situations where AMD is the better raw compute card, but if you are in, say, Autodesk or Solidworks, the Quadro is going to spank it silly.
In the consumer market you can look at gaming benchmarks, and while there are outliers, for the most part you have a good idea where on the performance tier you're going to land, and then you cross your fingers and pray it gets better through drivers. In the pro market this is not true at all. You pick the purpose you're going for and the application(s) you're going to be using, and then buy accordingly. Software companies usually target the best hardware, as this software costs $$$$$. It will not get better.
One of the keys to the Apple M1 taking off the way it did was that, while it's not ideal at all for gaming graphics, it is very, very good at the types of workloads it focuses on, and companies then went out and optimized around it.
I'd expect Intel will do much better in the pro market than in the consumer market. If the compute is at all remotely good and the decoder is good, companies will tweak for it. I highly suspect Intel intends these units to be smashed into boxes like their high-end pro-series NUCs that either sit headless on someone's desk for compute work or do video work. It's not meant to compete with someone running a fatter-socket Xeon with gobs of RAM and a Quadro smashing Solidworks. VIA tried in the consumer market and got crushed. The pro market is different: cards go up for specific tasks. Gaming is just "plays games".
And yeah, VIA.... I'm going to die laughing if I give my honest opinion of their sub-par iGPUs.
Sitting around high-fiving over master race, graphics, and other nonsense is how we all got stuck with two companies.
But after all, my point is mostly that I hope Intel brings some competition to the GPU market. For a true budget gamer who buys the cheapest Ryzen or Intel's F SKU, a capable low-end card is welcome. The GT 1030 was initially a somewhat good product, and IMO it was the last (de facto) low-end card that was usable in a budget gamer's setup. (Not the DDR4 scam version; we should totally forget that it even exists.)
For what it's worth, their Gurlz of Destruction gamer house didn't do very well in Quake 4. The whole thing was dumb. First, VIA's GPU landed between an 8500 GT and a 9400 GT despite costing more and being less power efficient... none of the three could run Quake 4 at the FPS needed for competition. Next, Quake 4, well... it sucked. It ran like shit, played like shit, and it wasn't until a ton of modding that it was "good". The house was obviously a cash-in on sex appeal, and while the "gurlz" (why do that? just girls, or gals; who does this shit?) were good on the women's circuit, the goal was to place them on the boys' circuit. This resulted in a parade of fail and bad. Not only could the gurlz not compete on any real level, the guys who paid to train with them suddenly got crushed afterwards by other guys they used to beat. Cue finger pointing. Was it the events, was it VIA, was it sexism, was it lack of training and mostly branding, was it just that the guys' circuit innately had more top-talent players, was it that Quake 4 sucked? Or was it just that not a damn person involved thought a damn thing through beyond $$$$$? I'm going with the last. I've been involved in similar fiascos, even running gaming stuff, where it all blew up into a glorious mess.
The funniest part is that this took out VIA's dreams of being a graphics card company. Maybe they should have put anime porn stickers on the graphics cards? Seems to work in Asian markets. Or, I dunno, rip off ASUS and just Gundam that shit up.
Back to the main topic. The issue is we are all going to integrated SoCs sooner or later. The Steam Deck, scaled up or down, is the future.
But yeah, it already sucks that when you buy a laptop, you're stuck with its GPU unless you have tools and rework skills. And with this climate bullshit going on, I'd think the opposite would be better to minimize waste, i.e., let users clean, maintain, and upgrade their laptops if they want. But no, it's the exact opposite.
Not sure which way this graphics card line-up will take Intel ... up the hills or down the valley :)
In many professional applications like AutoCAD, Solidworks, Maya, Premiere, Photoshop, various simulation tools, etc. these cards may actually perform pretty well. Combined with good video encoding, these might end up as popular products in certain segments.
yes, I totally agree with your point that these could become popular products, as long as they don't mess up the drivers :)