Wednesday, July 13th 2022
Intel's Arc A750 Graphics Card Makes an Appearance
Remember the Limited Edition card that Intel was teasing at the end of March? It turns out it could very well be the Arc A750, at least based on a brief appearance in Gamers Nexus' review of the Gunnir Arc A380. For a few seconds in the review video, Gamers Nexus showed off a card that looked nigh on identical to the renders Intel released back in March. No specs or other details were mentioned, except that Gamers Nexus has already tested the card and that it will presumably get its own video in the near future, based on what was said in the review.
Based on leaked information, the Arc A750 GPU should feature 24 Xe cores and 3072 FP32 cores, and it's expected to be paired with 12 GB of GDDR6 memory on a 192-bit bus. For reference, the Arc A380 features eight Xe cores and 1024 FP32 cores, and those cards ship with 6 GB of GDDR6 memory on a 96-bit bus. In related news, Intel is said to be touring gaming events in the US to promote its as-yet unavailable Arc graphics cards. LANFest Colorado is said to be the first stop, so if you're planning on attending, this could be your first chance to get some hands-on time with an Arc graphics card.
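As a quick sanity check on those leaked figures, both parts work out to 128 FP32 lanes per Xe core, which lines up with Alchemist's Xe core layout of 16 vector engines at eight FP32 lanes each. A rough back-of-the-envelope sketch is below; the clock speed used is a placeholder for illustration, not a confirmed A750 spec.

```python
# Sanity-check the leaked core counts: both parts divide out to
# 128 FP32 lanes per Xe core (16 vector engines x 8 FP32 lanes).
def fp32_lanes(xe_cores, lanes_per_core=128):
    return xe_cores * lanes_per_core

def theoretical_tflops(fp32_cores, clock_ghz):
    # 2 ops per lane per clock (fused multiply-add), result in TFLOPS
    return fp32_cores * 2 * clock_ghz / 1000

print(fp32_lanes(8))    # 1024 -> matches the Arc A380
print(fp32_lanes(24))   # 3072 -> matches the rumoured Arc A750

# Placeholder clock of 2.0 GHz, purely for illustration
print(f"{theoretical_tflops(3072, 2.0):.1f} TFLOPS")  # ~12.3
```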
Sources:
Gamers Nexus, @TheMalcore, via Videocardz
61 Comments on Intel's Arc A750 Graphics Card Makes an Appearance
Regarding Intel's biased "theoretical" performance: 3DMark is so manipulated and useless that there are no words to describe it; according to it, the A380 is faster than the RX 6500 XT.
3DMark is the same bullshit as Geekbench is for Apple, and the sad thing is that many people like you are fooled into believing this is theoretical or even actual performance. Of course, that performance never turns out to be real. This is how you win over deluded customers: not with technology, but with lies.
Intel is sticking them in its NUC enthusiast and laptop products. Those products have always been costly (though you've always gotten your money's worth for what they are, and in the case of the enthusiast models they killed everything else in their form factor), and they have never been crap or performed poorly. Speaking as someone who owned Skull Canyon, Hades Canyon and Phantom Canyon, along with some of their laptop kits, as Linux machines.
IMHO, this is going to end up a little under RTX 3060 performance.
But an in-depth review will probably devalue these cards significantly, not just for lacking performance but for drawing roughly 50% more power than the competition.
They have cheated with drivers. They released unofficial scores suggesting the cards would be equal to or better than counterparts from NVIDIA or AMD, but the truth is these things are pretty much 15% behind, and the power requirement is quite a bit higher.
The reason these are so delayed is obvious too: the first batch just didn't perform, and they had to spin it back with performance fixes. These are just compute-based tiles, or GPUs that didn't meet quality standards, shipped as "gamer" GPUs.
It's the same thing we saw with Vega, Fiji and Polaris. They all do great in computational work but lack horsepower in games. If @W1zzard gets his hands on one, I'd like to see the rumoured leaks versus the actual scores and performance.
Writing a driver for a small GPU is not any different from writing one for a big GPU. NVIDIA and AMD have no problem scaling theirs from tiny low-end (or even integrated) GPUs to massive high-end GPUs, and they can do so because the scaling happens in the GPU, not the driver.
Scaling hardware, however, is a challenge, and for Intel the hardware is the problem. These cards perform well in synthetic benchmarks because those tend to create different workloads compared to games, typically designed to generate a high load rather than render efficiently. If the A750 has a similar resource balance to the A380, then we can expect similar performance characteristics. While driver updates can fix the reported bugs, they will not change the performance characteristics of these products. The underlying problem is the effectiveness of the hardware scheduling, which would require new generations of hardware to solve.
FTFY :D
I just checked their wiki and it looks like it will do it for sure. It says: "Version 9 (Intel Arc Alchemist, Meteor Lake, Arrow Lake): Intel Arc Alchemist GPUs add 8K 10-bit AV1 hardware encoding."
I also just saw another article that writes: "Intel's ACM-G11 has world-class media processing capabilities (hardware decoding/encoding of 4K H.264/H.265/AV1 streams)".
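For anyone wanting to try that AV1 encoder once the cards are out, here's a rough sketch of driving it through FFmpeg's Quick Sync path from Python. The encoder name (av1_qsv) and the flags follow the pattern of FFmpeg's existing QSV encoders (h264_qsv, hevc_qsv), so treat them as assumptions and verify them against your FFmpeg build.

```python
# Hedged sketch: transcode a file to AV1 using Intel's hardware encoder
# via FFmpeg's Quick Sync (QSV) path. Requires an FFmpeg build with QSV
# support and an Arc/DG2-class GPU exposing the AV1 encode engine.
import subprocess

def encode_av1_qsv(src: str, dst: str, bitrate: str = "8M") -> None:
    cmd = [
        "ffmpeg",
        "-hwaccel", "qsv",      # decode on the GPU where possible
        "-i", src,
        "-c:v", "av1_qsv",      # Quick Sync AV1 encoder
        "-b:v", bitrate,
        "-c:a", "copy",         # leave the audio stream untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

encode_av1_qsv("input.mp4", "output_av1.mkv")
```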
As for the ecosystem, beyond having a good profiling tool, the ecosystem is the API of choice along with its tools and accompanying documentation, which for PCs today usually means DirectX, Vulkan or OpenGL.
Intel's responsibility is to ensure their driver works according to the API specs. PC games are not written for specific hardware.