Thursday, August 11th 2022

Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill

Jon Peddie Research (JPR) provides some of the most authoritative and informative market research into the PC graphics hardware industry. The firm just published a scathing editorial on the future of Intel AXG (Accelerated Computing Systems and Graphics), the business tasked with developing competitive discrete GPUs and HPC compute accelerators for Intel. Founded to much fanfare in 2016 and led by Raja Koduri, AXG has been in the news for the development of the Xe graphics and compute architecture, particularly the Xe-HPC "Ponte Vecchio" HPC accelerator, and the Arc brand of consumer discrete graphics solutions. JPR reports that Intel has invested several billion dollars into AXG to little avail, with none of its product lines bringing in notable revenue for the company. Xe-LP based iGPUs do not count, as they are integrated with client processors and their revenues are booked under CCG (Client Computing Group).

Intel began reporting revenue from the AXG business in Q1 2021, around which time it started selling its first discrete GPU, the Iris Xe MAX (DG1), based on the same Xe-LP architecture powering its iGPUs. The company's Xe-HPG architecture, designed for high-performance gaming, was marketed as its first definitive answer to NVIDIA GeForce and AMD Radeon. Since Q1 2021, Intel has lost $2.1 billion to AXG, with little to show for it. The JPR article suggests that Intel missed the bus on both time-to-market and scale.
A sizable launch of Arc "Alchemist" in 2021 or early 2022, in the thick of the GPU supply crisis, would have enabled Intel to cash in on sales to anyone in the market for a graphics card. With the supply crisis over following the collapse of crypto-currency mining demand for dGPUs, Intel finds Arc "Alchemist" competing with GeForce and Radeon products purely on gaming performance, where its fastest "Alchemist" product matches their mid-range parts. Both NVIDIA and AMD are ready to ship their next generation, which is bound to widen the performance gap with Intel even further. Beyond graphics, NVIDIA and AMD are also ready with their next-generation scalar compute products, NVIDIA's "Hopper" and AMD's CDNA3, increasing the performance gap with "Ponte Vecchio."

With the recent axing of the Optane Memory business, which rode on the promise of the pioneering 3D XPoint memory technology Intel invented, it's open season on non-performing Intel businesses, especially as CEO Pat Gelsinger sees favorable outcomes in Washington, D.C. for legislation that improves Intel's business environment and increases government subsidies for the company. In light of the losses AXG faces, JPR recommends that Intel consider selling the entire division off and exiting this market.

The JPR editorial can be read from the source link below.
Source: Jon Peddie Research

112 Comments on Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill

#76
LabRat 891
'dun like Intel.
'dun like Raja.
All that said:
Give them a friggin chance; jeeze.
Intel only 'gave up' on Optane after several years and product lines 'not panning out'.
AFAIK, there is no Intel Alchemist release in the 'States at all, yet. If they hang back out of our market for the first gen, okay. Let's see how Gen 2 goes.
#77
lexluthermiester
medi01: Yeah, but to be fair to Raja, the GPU division was severely underfunded in his time.
This is true. At that time AMD didn't have the money they have now to allocate. Raja is not the incompetent fool some seem to think he is.
#78
Dirt Chip
Seems like people are RIPing Intel's Arc way too soon.
It's like looking at a new factory with $10 billion in investment and deciding on day 1 that it's not profitable. No hardware investment is profitable on day 1 (or 2 months in, in this case); it takes time. Be patient (or don't, and keep babbling about how much Intel sucks; all is good).

They don't have a cash problem, so I see no point in ending something that hasn't really started yet.
Wait a few months and see the driver progress.
#79
auxy
lexluthermiester: This is true. At that time AMD didn't have the money they have now to allocate. Raja is not the incompetent fool some seem to think he is.
Fool me once, shame on you. Fool me twice, well, you won't get fooled again.
#80
Unregistered
DemonicRyzen666: AMD took a giant risk in buying ATI, and everyone claimed AMD would go bankrupt. Here we are 16 years later: AMD has high-end GPUs, and everyone is forgetting that AMD had to settle for making mid-range graphics cards for a very long time. In the long run, Intel had better be okay with more than a $3.5 billion drop, because for AMD it was $5.4 billion plus the other losses.
Thing is, ATI kept AMD afloat; their CPUs couldn't compete with Intel's (except maybe in some special situations), but the GPUs were right up there with nVidia.
I would say buying ATI saved them.
#81
GreiverBlade
medi01: Yeah, but to be fair to Raja, the GPU division was severely underfunded in his time.
Yeah, true, but at Intel the division should have been overfunded, and it still underdelivered (tm?)

TBF, I do not even dislike Raja, and I want Intel to succeed (finally) at discrete graphics... but for now... I am heavily underwhelmed :ohwell:
#82
lexluthermiester
WeeRab: Incredible. Who would've thought that TPU had so many tech/finance experts among their readership? I guess you must all be billionaires, with decades of top-level expertise in semiconductor design and manufacture.
I've learned so much, in such a short time listening to you experts. Thanks /s.
Some of us are successful business owners and actually do have a lot of experience in this industry. So there is that...:rolleyes:
#83
TheinsanegamerN
Xex360: Thing is, ATI kept AMD afloat; their CPUs couldn't compete with Intel's (except maybe in some special situations), but the GPUs were right up there with nVidia.
I would say buying ATI saved them.
Buying ATI put AMD in massive debt, billions of dollars that should have been put into AMD's CPU design instead. Debt that haunted AMD for over 15 years.

You can't take credit for fixing an issue when you directly caused that very issue.
#84
DeathtoGnomes
WeeRab: Incredible. Who would've thought that TPU had so many tech/finance experts among their readership? I guess you must all be billionaires, with decades of top-level expertise in semiconductor design and manufacture.
I've learned so much, in such a short time listening to you experts. Thanks /s.
My armchair has so much top-level expertise it lets me travel at the speed of light. To be fair, it doesn't take a rocket scientist to understand how a rocket works. Some people like to know specific details; not everyone cares about the same details, so they don't bother trying to learn what the next guy knows. We gather here to share what we know. Period.

/ontopic

It was predicted that Intel would take a huge loss before a respectable, competitive card was brought to market. I thought that number would be in the billions, maybe even double digits. It depends on how persistent Intel wants to be and how much abuse Intel will take from shareholders before giving up. This is just the first instance of some entity suggesting Intel jump off a long pier into a deep lake.
#85
Vayra86
WeeRab: Incredible. Who would've thought that TPU had so many tech/finance experts among their readership? I guess you must all be billionaires, with decades of top-level expertise in semiconductor design and manufacture.
I've learned so much, in such a short time listening to you experts. Thanks /s.
Maybe you ought to revisit the definition of what a forum is supposed to be.

;)

And next, consider how your post adds anything to it.
#86
lexluthermiester
DeathtoGnomes: It was predicted that Intel would take a huge loss before a respectable, competitive card was brought to market.
This is only common sense and to be expected.
#87
trparky
lexluthermiester: Raja is not the incompetent fool some seem to think he is.
Well, OK. He may not be an incompetent fool, but as @medi01 said, he doesn't know how to rein in expectations. Hence this quote...
medi01: But in terms of embarrassing statements made, it is absolutely, solely Raja. It stopped at AMD the second he left, and it started to pop out of Intel once he got active there.
The guy just doesn't know how to set expectations. Always underpromise, always. Then when you overdeliver, great; people will think of you as a miracle worker. But if you overpromise and then underdeliver, as we've recently seen, you get this kind of shit.
#88
Unregistered
TheinsanegamerN: Buying ATI put AMD in massive debt, billions of dollars that should have been put into AMD's CPU design instead. Debt that haunted AMD for over 15 years.

You can't take credit for fixing an issue when you directly caused that very issue.
Given Intel's bribery, I don't think it would've helped to have a better CPU; Athlons were way better than the Pentium 4, yet AMD struggled to gain market share.
#89
GhostRyder
Garrus: It basically took AMD 7 years, between the Radeon 7000 series and RDNA2 (not coincidentally their PS4 and PS5 architectures), to fix their graphics division. Intel needs to be committed to 7 years of products also.

It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact their first product is bad is that surprising.
It's definitely true for Bulldozer to Ryzen; they took a huge beating from betting early that more cores = better and making the cores too weak to compete in the high-end market (although I am ignoring some of the blatant anti-competitive stuff that happened in this time frame as well).

As for the GPUs, the first-generation R series (R9 290X, 290, 280X, etc.) was a great series. They really struggled after that and made a poor bet on HBM early, which was expensive and made it hard to put more than 4 GB of HBM on the first generation, leaving those cards inferior in many respects to NVIDIA's offerings. It took years, but they are finally competitive again on the GPU front.

Honestly, Intel needs to double down and invest in their GPUs, not just so consumers have alternative options, but because Intel is going to need some alternative revenue streams. Not to mention that cutting it off at this point would be a waste; it would not only damage their reputation but force them to write off a huge amount of money.
#90
Vayra86
jigar2speed: This guy was also responsible for the HDR gaming that we have today. Anyway, that's not the point; this was his first attempt at Intel, and at AMD they were budget-strapped, so he has always been taking jobs against the odds.
LOL. Yeah. That tech that barely works on 95% of all monitors on the market.

Fan-tas-tic. Against all odds indeed; that's my point as well. Apparently he knows how to present significant risk as a good story, and that results in overpromising and underdelivering. And part of that reality is: he can't deliver. I never said the guy is dumb. He's just not good at planning, timing, and reading the market. His biggest problem is really all the PR preceding a release. Both Vega and Arc could have been small hits in specific segments, but no, it has to be the top GPU for everyone. This also echoes in the design choices in both cases, and in the amount of stuff that has to be in there. If you look at the feature list for Arc, and GamersNexus's coverage of it, it speaks volumes. The poor delivery then really just proves the whole point. This was also the case with Vega: the majority could tweak the card better themselves, and that's something, in an age where competitors' GPUs boost fully automatically.

Every time, the results are unrefined, hardware-focused, blunt instruments. But GPUs have been at a level where high efficiency and refinement are key for a while now. We're only just now entering another era, seemingly, of more blunt instruments that scale performance by adding power and die size, where two players have already done their refinement or have it underway, and Intel is merely beginning at both.
#91
Squared
Maybe I don't understand business, but selling Intel Arc doesn't make a lot of sense to me. Arc is built on the Xe microarchitecture, which is also used in nearly every client CPU Intel makes. I don't see Intel selling the rights to Xe technology when they are profiting from it.

Moreover, Intel already spent the $3.5 billion; selling or killing Arc won't change that. Today Intel has Arc, at least in a lab somewhere. The question is: can Intel get a return on the dollars it invests after today?
#92
Vayra86
Squared: The question is: can Intel get a return on the dollars they invest after today?
I think the fact that we need to ask the question is reason enough for the JPR article.

Thing is, usually when you bring a new tech into your product portfolio and invest big in it, you have a long-term plan for capturing markets with it, and it has a decent chance of success, something calculated from numerous predictable factors. But if Intel can't take a meaningful bite of market share, what's the point? Is the market even big enough for them to recoup that money by the time they get to it?

And that comes down to the question: can Intel catch up from the start they've got? It means they'll have to run faster than two competitors that have been in the race for decades and are very good at it. This is not a disruptive sort of technology like ARM; this is "another GPU on x86." It is forced to play by the very same rules as everyone else making those products.

About Xe: yes, it's in every CPU, but they already had iGPUs before Xe. So they'll just rebrand it back to Intel Iris or whatever.

I honestly don't see a single piece of Arc or Xe that is somehow unique, a design win, or market-redefining in any way, and that's really the biggest problem. There isn't one reason you should really have that Intel GPU.
#93
lexluthermiester
Squared: Maybe I don't understand business, but selling Intel Arc doesn't make a lot of sense to me. Arc is built on the Xe microarchitecture, which is also used in nearly every client CPU Intel makes. I don't see Intel selling the rights to Xe technology when they are profiting from it.
Then sit back and don't worry about it. Intel will do their thing.
Squared: Moreover, Intel already spent the $3.5 billion; selling or killing Arc won't change that. Today Intel has Arc, at least in a lab somewhere. The question is: can Intel get a return on the dollars it invests after today?
Investments do not turn a profit overnight. Ever. After the R&D is done, the building and marketing begins.



I love all the people who fail to understand that a product launch of this scale takes time, money, and effort. The Core Duo line of CPUs was amazing, but it didn't take the world by storm. It was a process that took over 3 years before launch, and even then it took over 6 months to build up. Ryzen was no different, and neither was the RTX line of GPUs. It ALWAYS goes this way.
#94
eidairaman1
The Exiled Airman
Vayra86: Thing is, usually when you bring a new tech into your product portfolio and you invest big on it, you have a long term plan of capturing markets with it... I don't honestly see a single piece of Arc or Xe that is somehow unique, a design win or redefining the market in any way, and that's really the biggest problem.
Reminds me of Optane.
#95
DeathtoGnomes
lexluthermiester: This is only common sense and to be expected.
The optimists didn't expect it. /truestorybro
:D
lexluthermiester: Then sit back and don't worry about it. Intel will do their thing.
Yes they will, and they won't sell popcorn for the fireworks that may be upcoming.
trparky: Well, OK. He may not be an incompetent fool, but as @medi01 said, he doesn't know how to rein in expectations.
I don't think anyone claimed Raja was any good at management; IIRC he was more of a doer than a leader.
lexluthermiester: I love all the people who fail to understand that a product launch of this scale takes time, money, and effort. The Core Duo line of CPUs was amazing, but it didn't take the world by storm. It was a process that took over 3 years before launch, and even then it took over 6 months to build up. Ryzen was no different, and neither was the RTX line of GPUs. It ALWAYS goes this way.
Every new product introduction has its share of naysayers and shills; there is no getting around the fact that people just need to wait and see whether something is accepted once it has proven worthy of opening the wallet.
#96
lexluthermiester
DeathtoGnomes: The optimists didn't expect it. /truestorybro
I did. I am actually surprised the figure wasn't higher. To me, this says Intel is doing things carefully and with a well-thought-out plan. Less than $5 billion is nothing in the long term for Intel. If the losses had been $7 or $8 billion, then yes, there would be reason for concern. Less than $5 billion tells me they're on the right track and making solid progress.
DeathtoGnomes: Every new product introduction has its share of naysayers and shills; there is no getting around the fact that people just need to wait and see whether something is accepted once it has proven worthy of opening the wallet.
100% agree! The proof is in the pudding, as the saying goes. Intel does have an uphill battle. They also seem to have a plan of action. My theory is the same now as it was before the Core2 line was released: they have something good cooking, and that something is going to be a success. It will be very interesting to see how it all plays out. Time will tell.
#97
Palladium
lexluthermiester: Then sit back and don't worry about it. Intel will do their thing.

Investments do not turn a profit overnight. Ever. After the R&D is done, the building and marketing begins.

I love all the people who fail to understand that a product launch of this scale takes time, money, and effort. The Core Duo line of CPUs was amazing, but it didn't take the world by storm. It was a process that took over 3 years before launch, and even then it took over 6 months to build up. Ryzen was no different, and neither was the RTX line of GPUs. It ALWAYS goes this way.
Surprise: capitalists with short-termist "ANYTHING FOR A PROFIT NOW" brains are bad economists.
#98
Vayra86
eidairaman1: Reminds me of Optane.
Yeah, and Optane even had a unique selling point.
#99
Dirt Chip
Vayra86: Yeah, and Optane even had a unique selling point.
You treat Intel's investment in pioneering tech as a bad thing, and that's odd because, well, this is their business...
They have the capital to do all those experiments, and if you are in favor of tech advancement there is no actual reason not to support them (not by buying stuff you don't need, just by word) for trying, even if they fail.
#100
Vayra86
Dirt Chip: You treat Intel's investment in pioneering tech as a bad thing, and that's odd because, well, this is their business...
They have the capital to do all those experiments, and if you are in favor of tech advancement there is no actual reason not to support them (not by buying stuff you don't need, just by word) for trying, even if they fail.
If Intel's success or failure relies on moral/popular support and not its own merit, we're doing it wrong as customers and as tech enthusiasts, IMHO.