Thursday, August 11th 2022

Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill

Jon Peddie Research (JPR) provides some of the most authoritative and informative market research into the PC graphics hardware industry. The firm just published a scathing editorial on the future of Intel AXG (Accelerated Computing Systems and Graphics), the business tasked with developing competitive discrete GPUs and HPC compute accelerators for Intel. Founded to much fanfare in 2016 and led by Raja Koduri since 2017, AXG has been in the news for the development of the Xe graphics and compute architecture, particularly the Xe-HPC "Ponte Vecchio" HPC accelerator and the Arc brand of consumer discrete graphics solutions. JPR reports that Intel has invested several billion dollars into AXG to little avail, with none of its product lines bringing in notable revenue for the company. Xe-LP based iGPUs do not count, as they are integrated with client processors and their revenues are folded into CCG (Client Computing Group).

Intel has been reporting revenues for the AXG business since Q1 2021, around which time it started selling its first discrete GPU, the Iris Xe MAX (DG1), based on the same Xe-LP architecture powering its iGPUs. The company's Xe-HPG architecture, designed for high-performance gaming, was marketed as its first definitive answer to NVIDIA GeForce and AMD Radeon. Since Q1 2021, Intel has lost $2.1 billion to AXG, with not much to show for it. The JPR article suggests that Intel missed the bus on both time-to-market and scale.
A sizable launch of Arc "Alchemist" in 2021 or early 2022, in the thick of the GPU supply crisis, would have let Intel cash in on sales to anyone in the market for a graphics card. With the supply crisis over following the collapse of crypto-currency mining demand for dGPUs, Intel finds Arc "Alchemist" competing with GeForce and Radeon products purely on gaming performance, where its fastest "Alchemist" part matches their mid-range offerings. Both NVIDIA and AMD are ready to ship their next-generation products, which is bound to widen the performance gap with Intel even further. Beyond graphics, NVIDIA and AMD are also ready with their next-generation compute accelerators, NVIDIA's "Hopper" and AMD's CDNA3, which will increase the performance gap with "Ponte Vecchio."

With the recent axing of the Optane Memory business, which rode on the promise of the pioneering 3D XPoint memory technology Intel co-developed, it's open season on non-performing Intel businesses, especially with CEO Pat Gelsinger seeing favorable outcomes in Washington, D.C.: legislation that improves the business environment for Intel and increases government subsidies for the company. In light of the losses racked up by AXG, JPR recommends that Intel consider selling off the entire division and exiting this market.

The JPR editorial can be read at the source link below.
Source: Jon Peddie Research

112 Comments on Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill

#51
trparky
GarrusIt took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact their first product is bad is that surprising.
Yeah well, I don't think Intel is used to losing. They're used to having a great product straight out of the gate. Internal company morale must have taken a major hit with this.
Posted on Reply
#52
iO
Not surprising, as a lot of stuff that Intel touches turns to shit if it's not a traditional x86 design.
And if Battlemage turns out to be the same disaster as Alchemist, I'd say they will kill off their gaming GPU endeavours and refocus purely on HPC accelerators...
Posted on Reply
#53
Arkz
Bomby569What is the relevance in separating AMD from ATI when they bought the company, the IPs, the engineers, the know-how, the buildings, the cleaning personnel? So AMD started the business from zero in 2006, is that what you are saying?
No, and well done on ignoring the rest.
Posted on Reply
#54
Unregistered
I agree, better to sell the IPs to someone else (maybe AMD, to help them in streaming and encoding). Intel has been doing GPUs for ages, no excuse for their failure with Arc. Maybe they could've focused on the low end, launching something better than a 1030 for cheap, and gone up from there; targeting the 3070 was very ambitious.
With ATI/AMD and nVidia we had a lot of competition, with graphics cards launched nearly every six months. It's only in the last few generations that AMD struggled a bit, but since RDNA, AMD is back (ignoring nVidia fanboys, especially among reviewers), so we will have competition again.
Posted on Reply
#55
trparky
Xex360no excuse for their failure with Arc.
I blame Raja for it. He was a failure at AMD, and now he's a failure at Intel. It's obvious he doesn't belong in an executive position. He couldn't organize a kegger in a brewery or an orgy in a whorehouse, let alone a graphics card division.
Posted on Reply
#56
Crackong
I get it
Raja strikes again
Right?
Posted on Reply
#57
Minus Infinity
Bomby569You cannot look at all that surrounded the ARC launch and say there is even a glimpse of leadership, and I'm not talking about performance issues.
Maybe, though, it's more a case of Raja screwing up royally and Intel upper management giving him too much freedom and not enough oversight. I think he will be in deep shit after this all comes to a head. The rest of the Intel departments are apparently fuming at what's going on in the GPU group. Arc desktop may be cancelled to focus on Battlemage, but I doubt Raja would be leading that.

Now, is this a failing of Gelsinger, or did he put too much trust in Raja and has learned his lesson?

I'm no Intel fan, but we need them to be competitive, and a third GPU player is sorely needed. I hope they don't bail on discrete GPUs, but Arc has been an unmitigated disaster for them.
Posted on Reply
#58
AlwaysHope
A third player launching a dGPU onto global markets for the first time in many years, amid the "post"-pandemic world and the supply chain issues that followed... what could go wrong?? :rolleyes:
Combine that with rising geopolitical tensions in East Asia, where most of the manufacturing is done....
I'm sure Intel had a crystal ball years ago & could EASILY see this mess! :laugh:
Posted on Reply
#59
usiname
Bomby569What is the relevance in separating AMD from ATI when they bought the company, the IPs, the engineers, the know-how, the buildings, the cleaning personnel? So AMD started the business from zero in 2006, is that what you are saying?
So Intel starts their GPU business today, is that what you are saying? You forget decades of iGPUs from Intel, and most importantly, Xe is already 3 years old; they had 3 years to fix their drivers and learn how to do it properly, but they didn't. Do you seriously believe that an iGPU is something different from a GPU in terms of drivers and hardware?
Posted on Reply
#60
DemonicRyzen666
AMD took a giant risk in buying ATI, and everyone claimed AMD would go bankrupt. Here we are 16 years later: AMD has high-end GPUs, and everyone is forgetting that AMD had to settle for making mid-range graphics cards for a very long time. In the long run, Intel had better be OK with more than a $3.5 billion drop, because for AMD it was $5.4 billion to lose, plus the other losses.
Posted on Reply
#61
KV2DERP
Damn, I just want to believe Intel could finally enter the dGPU market. Yes, it's not going to be smooth sailing, especially when your competitors have been in it for, idk, 20+ years?

I was expecting their first product to have flaws here and there, and I could live with that. But killing this before it even started? Damn.

Just my thoughts; hey, who am I anyway? I'm just a random guy on the internet, after all.
Posted on Reply
#62
Bwaze
Who in their right mind would listen to "Jon Peddie Research"? I remember that at the height of the crypto craze, when all the cards, even the cheapest ones, were selling for at least twice their MSRP, they published that "25% of cards were sold to crypto miners".

They're just throwing out the same guesses the rest of us can make on forums.

Yes, Intel would perhaps be better off selling or axing their discrete GPU branch, like they did Optane. At least in the short run. But I don't think they will. Arc, as late, underpowered, and underdeveloped as it is, at least shows that they can enter the discrete GPU business, competitively maybe with the next generation. And even if they fudge that one too and it takes too long to compete with the 2022 Nvidia and AMD generations, I think they are in it for the long run, to learn how to do GPU computing, not just gaming.
Posted on Reply
#63
dragontamer5788
Intel needs to take a risk here. Intel already gave up on Optane and cell-phone chips. Intel also killed Xeon Phi and that entire ecosystem for this path. It seems a bit late to get cold feet over this. Intel must build high-performance compute products just to remain competitive at the supercomputer level. GPUs / SIMD compute are the future, and there's no way to reach exascale unless they go SIMD/GPGPU.

The gaming portion is basically there to help fund the research into the high-end stuff. Much like how NVidia uses lower-end 3060 GPUs to fund the research behind $10,000 A100 server GPUs, Intel needs to get the video-game market hooked on some product that supports the stack. That's the only way this will be economically feasible.

Now, maybe the PC market is too difficult to break into. Then Intel should instead try the console market first, or something else along those lines. Or maybe Intel makes a compute-only chip and finds a new market of buyers (unlikely, though; video gamers have shown they want good chips and will fund this kind of SIMD-compute research). The only other niche is maybe the AI / deep learning fellows. But casting a wide net and getting as many customers as possible is just the most obvious strategy here.
Posted on Reply
#64
Bwaze
Also, "$3.5 billion hole" is the estimate of the whole project's cost since 2016, in 6 years. Intel's yearly revenue is $80 billion!

They are in free fall right now, last quarter was down 22% compared to last year, but so is the rest of the market. We are in a recession, whatever the definitions say.

I'm sure many shareholders would welcome axing such an expensive project for short-term gain, no matter the potential. And that's what might actually happen.
Posted on Reply
#65
jigar2speed
Vayra86This engineer has no business at executive-level jobs.

Overpromising and underdelivering is not where you want to be at this level; it's where you absolutely should not be. That is the reason for 'Poor Vega' and now/soon 'Poor Arc'.
This guy was also responsible for the HDR gaming we have today. Anyway, that's not the point; this was his first attempt at Intel, and at AMD they were budget-strapped, so he has always been taking on jobs against all odds.
Posted on Reply
#66
AusWolf
A bad suggestion, imo. Even if it's unprofitable, selling something is still better than nothing.
Posted on Reply
#67
Steevo
An SoC is a requirement to remain competitive; AMD is eating their lunch, and a competitive GPU is all that's missing.
Posted on Reply
#68
GreiverBlade
defaultluserIf they can keep that up for multiple releases, then I'll put them on the same stability level as Nvidia
I understand your point of view, but for me it's the opposite: Nvidia has been unstable for me, while with ATI/AMD I haven't had a single issue since 1999 with up-to-date drivers :oops:
ATI/AMD: from a Mach64 LT, a few Rages, an X1650 Pro, passing by an X1950 Pro, an HD 3650, 3670, 4870, a 5xxx I don't remember :laugh: 7870, R9 270, 290, and then an RX 6700 XT (and probably a few more I forgot)
Nvidia: from a TNT1, GeForce 256, GeForce 2 GTS, :oops: MX400 :oops:, 460, 480, 560, 580 (even a 580 SLI of Matrix Platinum), 670, 770, 980, and lastly a 1070.

I've spent enough time with both since then :laugh: Nonetheless, it's down to personal point of view and luck in my case, I guess :ohwell:

Edit: as I said, I really loved the Real3D StarFighter i740, even though it was sh!t performance-wise in 2D and utterly weak in 3D (I had an 8 MB version), although drivers were fine with that one :oops:
And I will not talk about the S3, Permedia, PowerVR, and 3dfx cards I owned.
Posted on Reply
#69
medi01
GreiverBladeActually, if the AIBs do not buy them (in the news: www.techpowerup.com/297490/...opping-production-encountering-quality-issues ), then the product is not, indeed, selling, no? :laugh: (well, does the A380 sell well in China? I wonder...)
My point is, it's not about a single product, it's about Intel carving out a discrete GPU market. It is a journey, not a step.

As for that link, it looks more like a crypto hangover than anything else.

Now, how bad is being 13% behind RDNA1? Not great, to put it mildly, but not an unrecoverable catastrophe either. Especially given Mr. Koduri's involvement...

Posted on Reply
#70
Jimmy_
Vayra86This engineer has no business at executive-level jobs.

Overpromising and underdelivering is not where you want to be at this level; it's where you absolutely should not be. That is the reason for 'Poor Vega' and now/soon 'Poor Arc'.
Agreed with your point

I think we need to look at the big picture, long term. For an introductory product, Arc's performance is not that bad tbh, as every new product (with a new arch.) needs time to flourish and become stable. I think Intel can make an impact in the dGPU market in the coming years.

Slowly it will get there!
Posted on Reply
#71
medi01
GarrusIt basically took AMD 7 years between the Radeon 7000 series and RDNA2 (not coincidentally their PS4 and PS5 architecture) to fix their graphics division. Intel needs to be committed to 7 years of products also.

It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact their first product is bad is that surprising.
Context matters.

AMD was a financially starving company with shrinking market share.

At some point (and it certainly wasn't 7 years) they bet on the Zen architecture, and it delivered.

Once the GPU division got proper funding, IN A MERE YEAR AMD was able to hit back with RDNA1, a major leap forward that essentially closed the perf/mm² and perf/watt gaps vs team green. One more year, and RDNA2 disrupted NV's lineup, forcing Huang to drop a tier (that is why we got a 10 GB 3080 and 12 GB on lower-end cards). The only reason we haven't seen that havoc in NV's financials is the crypto bubble.

So, bottom line is, it doesn't take 7 years for a competent GPU manufacturer to roll out serious competition.

Intel, of course, rather lacks the respective experience, but... perhaps 13% worse than RDNA1 for a first product is not that bad.
Posted on Reply
#72
GreiverBlade
medi01My point is, it's not about a single product, it's about Intel carving out a discrete GPU market. It is a journey, not a step.

As for that link, it looks more like a crypto hangover than anything else.

Now, how bad is being 13% behind RDNA1? Not great, to put it mildly, but not an unrecoverable catastrophe either. Especially given Mr. Koduri's involvement...

Oh, do not misread me either, I want Intel to succeed, but because it's Intel, I find this recent news laughable, to say the least... it's not their first graphics card rodeo, yet they are making beginner mistakes :cry:
medi01Once the GPU division got proper funding, IN A MERE YEAR AMD was able to hit back with RDNA1, a major leap forward that essentially closed the perf/mm² and perf/watt gaps vs team green. One more year, and RDNA2 disrupted NV's lineup, forcing Huang to drop a tier (that is why we got a 10 GB 3080 and 12 GB on lower-end cards). The only reason we haven't seen that havoc in NV's financials is the crypto bubble.

So, bottom line is, it doesn't take 7 years for a competent GPU manufacturer to roll out serious competition.

Intel, of course, rather lacks the respective experience, but... perhaps 13% worse than RDNA1 for a first product is not that bad.
The AMD GPU division got better after Raja left... that's also context... (and also why I did not go Vega, though after he left I did not get a GPU since I was stuck with my 1070 due to the crypto craze :laugh: ). Yeah, Arc is not good, certainly not up to what Raja or anyone at Intel promised (remember "Intel is promising to deliver loads of GPUs to gamers"?), and they are just behind the current gen (or previous gen in some cases) while AMD and Nvidia are literally readying next gens with a sizable jump in perf over the current gen (well, I am happy with my RX 6700 XT; her perf will not go away with the RTX 40xx and RX 7x00).

In short, what happened to both enterprises? Raja happened :laugh: (kinda sad tho...)
Posted on Reply
#73
medi01
GreiverBladeThe AMD GPU division got better after Raja left...
Yeah, but to be fair to Raja, the GPU division was severely underfunded in his time.
Posted on Reply
#74
ZoneDymo
medi01Yeah, but to be fair to Raja, the GPU division was severely underfunded in his time.
I generally dislike the hate Raja gets; it's pretty much never one person, especially not at this scale. Sure, it would be easier for the mind to deal with if it were... but it isn't.
Look at the Star Wars prequels; that is what happens when you think it's all one person's doing...
Posted on Reply
#75
medi01
ZoneDymoI generally dislike the hate Raja gets; it's pretty much never one person, especially not at this scale. Sure, it would be easier for the mind to deal with if it were... but it isn't.
In terms of performance of a product developed - agreed.

But in terms of embarrassing statements made, it is absolutely, solely Raja. It stopped at AMD the second he left, and it started popping out of Intel once he got active there.
Posted on Reply