Friday, January 21st 2022
Intel Arc Alchemist Xe-HPG Graphics Card with 512 EUs Outperforms NVIDIA GeForce RTX 3070 Ti
Intel's Arc Alchemist discrete lineup of graphics cards is scheduled for launch this quarter. We are now getting some performance benchmarks of the DG2-512EU silicon, which represents the top-end Xe-HPG configuration. Thanks to a discovery by famous hardware leaker TUM_APISAK, we have a measurement from the SiSoftware database showing Intel's Arc Alchemist GPU with 4096 cores and, according to the benchmark report, just 12.8 GB of GDDR6 VRAM. This is simply an error in the report, as this GPU SKU should be paired with 16 GB of GDDR6 VRAM. The card was reportedly running at a 2.1 GHz frequency; however, we don't know whether this represents base or boost speeds.
When it comes to actual performance, the DG2-512EU GPU managed to score 9017.52 Mpix/s, while NVIDIA's GeForce RTX 3070 Ti scored 8369.51 Mpix/s in the same test group. Comparing the two cards in floating-point operations, Intel has the advantage in the half-float, double-float, and quad-float tests, while NVIDIA holds the single-float crown. Overall, this represents a roughly 7% advantage for Intel's GPU, meaning that Arc Alchemist has the potential to stand up to NVIDIA's offerings.
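The quoted overall advantage follows directly from the two aggregate scores; a quick sanity check (using only the numbers reported above):

```python
# Verify the headline advantage from the two SiSoftware aggregate scores (Mpix/s).
intel_dg2 = 9017.52    # DG2-512EU
rtx_3070_ti = 8369.51  # GeForce RTX 3070 Ti

advantage_pct = (intel_dg2 / rtx_3070_ti - 1) * 100
print(f"Intel advantage: {advantage_pct:.1f}%")  # ~7.7%, rounded to "7%" in the text
```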
Sources:
SiSoftware Benchmark Database, @TUM_APISAK (Twitter), via VideoCardz
95 Comments on Intel Arc Alchemist Xe-HPG Graphics Card with 512 EUs Outperforms NVIDIA GeForce RTX 3070 Ti
It's not in the dozens,
It's not in the hundreds,
It's in the tens of thousands.
Intel literally produces crates of platform components and ships them all across the world before the launch of such a platform.
Are you aware of how NDAs work, and how companies sue anyone selling or leaking info about an unreleased product obtained via evaluation channels?
While on the topic, this is yet another of the many reasons why enthusiasts should be rather displeased with Intel using TSMC for any products, CPU or GPU; it's just going to make things worse in both markets. Don't even get me started on CPUs, but nobody should be happy about Intel taking capacity away from AMD. It's just another instance of Intel leaning on its financial power to beat AMD instead of out-innovating them. I never miss a chance to mention that Intel's annual R&D budget is over 650% larger, and its annual revenue over 800% larger, than AMD's. This is also why nobody should be impressed by Alder Lake outperforming an over-a-year-old Zen 3 CPU by 10% or less on average, why everybody should be impressed with AMD beating Intel over the last few years, and why AMD's previous process-node advantage is absolutely no excuse for Intel's mediocre performance, given that Intel holds every single financial and resource advantage over AMD.

In a just world, Intel would be forced to use its own fabs, at least until AMD achieves around 50% of the mobile (laptop) and enterprise x86 markets, the most lucrative x86 segments, which it is light-years away from. This is why, with respect to ensuring long-term competition in the market, nobody should be claiming "it's great that Intel is competing again", especially considering that, with the current shortages, nothing is lowering prices in the short term, and based on financial realities, AMD's current position is extremely precarious compared to Intel's.
All it would take is a few years, a couple of generations, of supply-shortage woes for AMD, even if it reaches performance parity or slightly beats Intel, to stagnate AMD's revenue (something Intel can easily weather), erase its market gains, and return us to the pre-Ryzen dark ages of Intel hegemony: 4% generation-over-generation performance "gains" and overpriced CPUs that, for all intents and purposes, offered nothing over their predecessors.
And the same can be said for the dGPU market: Intel doesn't truly represent a new competitor in the way an entirely new entity would. Intel can easily throw around its financial weight in ways even worse than NVIDIA has been guilty of, squeeze AMD out (and AMD, I'm sure, would have no problem abandoning the market and consolidating down to exclusively semi-custom and IP development for other companies), and leave us in an even worse position. Can you imagine the cartels and the monopolistic, anti-competitive practices Intel and NVIDIA would engage in if left to their own devices in the video-card market?
Sorry for the rant, but I felt it necessary to counter the seemingly universal praise I see for Intel entering the GPU market and stretching TSMC's supply even thinner.
AMD and NVIDIA are moving to 5 nm, so Intel can have 6 nm and 3 nm. Better not release this GPU if it can mine crypto at more than $1/day.
The Intel-TSMC question is just Intel utilizing resources to the max. They are building new fabs at a fast pace, with Ireland coming online now and Ohio next, so production, and most importantly stock, should start improving across the board.

Both AMD and NVIDIA have been absolutely terrible with their GPU launches this time around, and no one can tell me they haven't jumped on the misery of gamers by upping prices, controlling production, and using terrible tactics to drive up prices, even with the current supply-chain issues. Without a doubt they could have produced more once they realized what was happening; sadly, the $$$ in supply and demand lit them up and they took advantage. Of course, both AMD and NVIDIA could have launched even more mining-gimped GPUs, or simply ramped production. The RTX 3080 is a great example: one of the best GPUs ever produced at $700, only for NVIDIA to realize it could have got so much more money, hence the launch of the 3080 12 GB. And AMD with the RX 6500 XT, which I personally cannot believe they launched on an x4 bus with no encoding and 4 GB, even after they said 4 GB is not good for gamers! One only needs to look at the profits at AMD and NVIDIA to realize they are focused completely on their shareholders and certainly not on their customers. They are businesses, but boy have they gone off the deep end widening their profit margins.
No, I do not think we are going back to the old days, as Alder Lake not only showed a significant increase in performance, but Intel also did not whack the prices up. I personally cannot understand why AMD gave up on the low end of the CPU market and started upping its CPU prices. The 12400 at $180 is something AMD has nothing to compete against.
Maybe I am just an optimist, but we are living in an age of immense CPU performance for not a lot of money, and thank God for AMD bringing much-needed competition! The GPU market has obviously gone off a cliff, in part due to crypto and in part due to greed from AMD, NVIDIA, some AIBs, and scalpers, but ultimately production will fix this, and having a third GPU player in Intel will hopefully improve things.
AMD has become such a poor company since Ryzen; I will buy a DG2 from Intel with similar performance to a 6500 XT for a normal price.
NVIDIA will be the king for sure, but AMD would have a problem, and then we'd need a new price war. :)
It's good for everyone who uses a GPU for more than displaying something static.
Come on, a 6500 XT at a $199 MSRP is a joke with a 64-bit bus and x4 lanes;
a GT 1030 for $100 has a 64-bit bus and x8 lanes.
Even an old GPU like the GT 710 with GDDR5 has a 64-bit bus and x8 lanes. (Mine is only x1, since it's an x1 card and fits in every slot.)
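To put the lane comparison above in perspective, here is a minimal sketch of one-direction PCIe link bandwidth, assuming the nominal per-lane rates after 128b/130b encoding (roughly 0.985 GB/s for Gen3 and 1.969 GB/s for Gen4); the exact card-to-slot pairings are illustrative, not taken from the comment:

```python
# Approximate one-direction PCIe link bandwidth in GB/s,
# assuming nominal per-lane rates after 128b/130b encoding.
PER_LANE_GBPS = {"gen3": 0.985, "gen4": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total one-direction bandwidth for a link of `lanes` lanes."""
    return PER_LANE_GBPS[gen] * lanes

print(f"Gen4 x4: {link_bandwidth('gen4', 4):.1f} GB/s")  # ~7.9 GB/s
print(f"Gen3 x4: {link_bandwidth('gen3', 4):.1f} GB/s")  # ~3.9 GB/s
print(f"Gen3 x8: {link_bandwidth('gen3', 8):.1f} GB/s")  # ~7.9 GB/s
```

This is why an x4 card hurts most on older Gen3 boards: the narrow link halves again, while an x8 Gen3 link keeps roughly Gen4 x4 throughput.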
That being said, it would be great if Intel can effectively compete against a 3070 Ti, especially knowing Intel's GPUs will come with a temporal upscaling solution from the get-go, unlike AMD's, and can thus compete not only on rasterization (like AMD) but also against game-changing features such as DLSS.
Overall this will push both NVIDIA and AMD (AMD especially) to be less conservative with their future architectures, and will break the historic duopoly, which can only benefit the end user.
Will the current supply/demand dynamics dramatically change with an Intel dGPU? Unlikely, but there's a lot more to it than "they'll only strain TSMC even more!"
It's not about money, it's about a viable product. I do think Intel would rather not repeat what AMD has been doing with its GPUs since it bought ATI: a break-even year was a good year in their books. Also, this is obviously not confirmed 3070 Ti performance; it could easily be 3060-level, we just don't know. Let alone RT performance, or how they compare or fall off when both rasterization and RT are in use.
We've already seen Raja swing his 4P chips around like he's compensating for something, but really, size matters. A lot. We've seen this all before: huge chips don't fly in consumer markets. So far, there are a lot of boxes Intel still needs to tick for any semblance of success.
NVIDIA did OK elsewhere; their bottom end didn't fall out. In reality, it's taken four years for AMD to poke Intel into doing something, because Intel's bottom line kept growing anyway despite losing market share.
It'll come down to driver support, where Intel is significantly behind, oh, and to actually f£#@£& releasing something.
But it's not watertight. Intel doesn't have a full grip on these things, and as outlets like Gamers Nexus have shown, you can often go on eBay and buy ES CPUs well before products hit the shelves, and those sellers mostly don't get caught. Documentation on this stuff is often a bit loose; a medium-sized manufacturer can end up with samples it no longer needs, or ones that got superseded by newer revisions.
You should be aware of how awful it was when we saw "new leaks" before the 12th-gen launch day.
The "leaks" did not last a few days, or a week; they lasted a whole month.
Every single working day within that month, a "new leak" popped up.
We all know NDAs aren't watertight.
But this?
This is Swiss cheese.
As I've mentioned before, this is either intentional, or Intel should fire the whole PR team over it.
Let Intel play the GPU game and see where it leads them. For now I can smirk at it.
So CUDA will be second best in the Adobe suite.
Yeah, no. Just no. You must not have looked around much over the last decade to say the above... Leaks are 70% marketing, if not more, and they happen at every company and product release these days.
Even in politics, 'the leak' is a tried and tested tool to gauge public response before actually finalizing an idea.
If Intel makes a good product, people will buy it.