Thursday, August 11th 2022
Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill
Jon Peddie Research (JPR) provides some of the most authoritative and informative market research into the PC graphics hardware industry. The firm just published a scathing editorial on the future of Intel AXG (Accelerated Computing Systems and Graphics), the business unit tasked with developing competitive discrete GPUs and HPC compute accelerators for Intel. Founded to much fanfare and led by Raja Koduri since his arrival at Intel in late 2017, AXG has been in the news for the development of the Xe graphics and compute architecture, particularly the Xe-HPC "Ponte Vecchio" HPC accelerator and the Arc brand of consumer discrete graphics solutions. JPR reports that Intel has invested several billion dollars into AXG to little avail, with none of its product lines bringing in notable revenue for the company. Xe-LP based iGPUs do not count, as they are integrated with client processors and their revenues are booked under CCG (Client Computing Group).
Intel began reporting revenues from the AXG business in Q1 2021, around the time it started selling its first discrete GPU, the Intel DG1 Xe MAX, based on the same Xe-LP architecture powering its iGPUs. The company's Xe-HPG architecture, designed for high-performance gaming, was marketed as its first definitive answer to NVIDIA GeForce and AMD Radeon. Since Q1 2021, Intel has lost $2.1 billion to AXG, with not much to show for it. The JPR article suggests that Intel missed the bus on both time-to-market and scale. A sizable launch of Arc "Alchemist" in 2021 or early 2022, in the thick of the GPU supply crisis, would have let Intel cash in on sales to anyone in the market for a graphics card. With the supply crisis over in the wake of collapsing crypto-currency mining demand for dGPUs, Intel now finds Arc "Alchemist" competing with GeForce and Radeon products purely on gaming performance, where its fastest "Alchemist" part only matches their mid-range products. Both NVIDIA and AMD are ready to ship their next-generation GPUs, which is bound to widen the performance gap with Intel even further. Beyond graphics, NVIDIA and AMD are also ready with their next-generation scalar compute products, NVIDIA's Hopper and AMD's CDNA3, widening the performance gap with "Ponte Vecchio."
With the recent axing of the Optane Memory business, which rode on the promise of the pioneering 3D XPoint memory technology that Intel invented, it's open season on non-performing Intel businesses, especially with CEO Pat Gelsinger seeing favorable outcomes in Washington, D.C. on legislation that improves Intel's business environment and increases government subsidies for the company. In light of AXG's losses, JPR recommends that Intel consider selling off the entire division and exiting this market.
The JPR editorial can be read from the source link below.
Source:
Jon Peddie Research
112 Comments on Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill
You are right, all tech companies try things all the time, but this is a big one, and it's something that requires a long breath. If the results so far look like this, it's a fair question whether they should continue. Think from a shareholder's perspective: if they've lost $3.5B already, and we all guesstimate they'll need a generation or two to reach parity (and even that is questionable, especially at the high end where they also have to compete), what is the ROI here? Years? Decades?
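A minimal back-of-envelope sketch of that payback question, assuming purely illustrative numbers: only the $3.5B sunk-cost figure comes from the article, while the annual-loss, time-to-parity, and future-profit values are hypothetical placeholders, not anything from JPR or Intel.

```python
# Back-of-envelope payback estimate for Intel AXG.
# Only the $3.5B sunk cost is taken from the article; every other
# figure here is a hypothetical assumption for illustration.

sunk_cost_busd = 3.5       # cumulative losses so far (from the article)
annual_loss_busd = 1.5     # assumed further loss per year until parity
years_to_parity = 3        # assumed generations (~1 per year) to catch up
annual_profit_busd = 1.0   # assumed operating profit per year after parity

total_hole = sunk_cost_busd + annual_loss_busd * years_to_parity
payback_years = total_hole / annual_profit_busd

print(f"Total hole by the time parity is reached: ${total_hole:.1f}B")
print(f"Years after parity to break even: {payback_years:.1f}")
```

Under those made-up assumptions the break-even horizon is roughly a decade from now, which is the shareholder-patience question the comment is raising.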
They make far more different stuff than people realise, plus they've bought many companies, including an FPGA one. I think they'll be fine, personally.
I don't agree with Jon Peddie Research here either: stick with it, but do better, Intel.
Are they going to sit that out? It's easily 2025, if not later, before you'll see more than half of new releases be DX12-only.
DX9? Sure. DX11? Suicide.
"Raja at AMD uncompetitive product and somewhat "bad" product"
"Raja goes to Intel, AMD GPU Div. got better and better"
"Raja at Intel overhyped and oversold promises and then fiasco launch of a somewhat "bad" product"
Intel indicated they need time to improve DX11 driver performance as it requires a lot of optimization work for each title.
Option three: do what Intel has already stated they are going to do, and raise CPU prices.
This means there's less reason to commit to a specific architecture. It could be RISC-V, it could be ARM, it could be x64; a lot of that market will be up for grabs, and that's where the market will go. ARM has a better chance, but that doesn't mean it's going to win by default. Some big cloud providers develop their own CPUs, but that doesn't mean they will be viable in the long term.
But back to the topic: I think Intel needs to keep suffering through this to get good at GPUs, or else it will just become a relic of the past. They need to be good at this because GPUs are only going to become more important in the future. That, or a merger with NVIDIA.
There's a reason why Raja was let go by AMD, now Intel is finding out why.
The drivers continue to have bad stability in a number of smaller-market games, so why would you bother buying this thing when the 6600 already matches the 3060 at a discounted price?
When you already have two well-established competitors, it's surprisingly hard to make the value proposition for more than a handful of folks, so it's likely to take 5-10 more years before Intel turns a profit on this mess!
Anyone remember S3 Chrome? Yeah, having competitive entry-level tech did not make up for the highly variable performance! If you earn very little money, then you have little to spend on drivers.
www.extremetech.com/computing/76978-s3-chrome-s27-review
Intel has the money to make up the ground, but are they willing to fork out that amount of change? The problem folks have here is that you already know how bad the iGPU drivers are today, so why would you suddenly think they're "good enough" for you to pay discrete-GPU prices?
Wow. JPR analysts are PAID to write this garbage?
So who exactly is this customer?
Either Intel does this or they don't, end of story.
I could have saved them an entire day of editing and a week of typing.
0!
Sure, there is the possibility to win the lottery with your initial launch, but on a more realistic note, it takes a few generations to get it right.
Say that three times fast
Admittedly, it's not typical and most of the other offerings run in some sort of interpreter/JIT.
Intel, pretty please: stop PR showing numbers of your GPU being on the level of some XXXX concurrent model ("disclaimer: will not do that in a real-life scenario; keep in mind our tests are run in a totally selective and controlled environment"), only for it to be completely bogged down by driver issues later. Offer Raja a good spot (in the janitor's room maybe? Coffee courier? Well, anything but GPU-related) and "same player, try again" (for the 3rd time since Larrabee, or 4th since the i740; but I liked my Real3D StarFighter i740 :laugh: )
Intel is still trying to get Xe stable in mainstream games two years later, which is quite a long time.
It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact their first product is bad is that surprising.