
Intel GPU Business in a $3.5 Billion Hole, Jon Peddie Recommends Sell or Kill

Yeah, but to be fair to Raja, the GPU division was severely underfunded in his time.

I generally dislike the hate Raja gets; it's pretty much never one person, especially not at this scale. Sure, it's easier for the mind to deal with if it were, but it isn't.
Look at the Star Wars prequels; that is what happens when you think it's all one person responsible.....
 
I generally dislike the hate Raja gets; it's pretty much never one person, especially not at this scale. Sure, it's easier for the mind to deal with if it were, but it isn't.
In terms of performance of a product developed - agreed.

But in terms of embarrassing statements made, it is absolutely solely Raja. They stopped at AMD the second he left, and they started popping out of Intel once he got active there.
 
'dun like Intel.
'dun like Raja.
All that said:
Give them a friggin chance; jeeze.
Intel only 'gave up' on Optane after several years and product lines 'not panning out'.
AFAIK, there is no Intel Alchemist release in the 'States at all, yet. If they hang back out of our market for the first gen, okay. Let's see how Gen 2 goes.
 
Seems like RIPing Intel's Arc way too soon.
It's like looking at a new factory with 10 billion in investment and deciding on day 1 that it's not profitable. No HW investment is profitable on day 1 (or 2 months in, in this case); it takes time. Be patient (or don't, and keep babbling about how much Intel sucks; all is good).

They don't have a cash problem, so I see no point in ending something that hasn't really started yet.
Wait a few months and see the driver progress.
 
AMD took a giant risk in buying ATI & everyone claimed AMD would go bankrupt. Here we are 16 years later: AMD has high-end GPUs, & everyone is forgetting that AMD had to settle for making mid-range graphics cards for a very long time. In the long run Intel had better be OK with more than a 3.5 billion dollar drop, because for AMD it was 5.4 billion dollars spent, plus the other losses.
Thing is, ATI kept AMD afloat; their CPUs couldn't compete with Intel's (except maybe in some special situations), but the GPUs were right up there with nVidia.
I would say buying ATI saved them.
 
Yeah, but to be fair to Raja, the GPU division was severely underfunded in his time.
Yeah, true, but at Intel the division should have been overfunded, and it still underdelivered (tm?)

tbf, I do not even dislike Raja and I want Intel to succeed (finally) at discrete graphics ... but for now ... I am heavily underwhelmed :ohwell:
 
Incredible. Who would've thought that TPU had so many tech/finance experts among their readership? I guess you must all be billionaires, with decades of top-level expertise in semiconductor design and manufacture.
I've learned so much, in such a short time listening to you experts. Thanks /s.
 
Incredible. Who would've thought that TPU had so many tech/finance experts among their readership? I guess you must all be billionaires, with decades of top-level expertise in semiconductor design and manufacture.
I've learned so much, in such a short time listening to you experts. Thanks /s.
Some of us are successful business owners and actually do have a lot of experience in this industry. So there is that...:rolleyes:
 
Thing is, ATI kept AMD afloat; their CPUs couldn't compete with Intel's (except maybe in some special situations), but the GPUs were right up there with nVidia.
I would say buying ATI saved them.
Buying ATI put AMD in massive debt, billions of dollars that should have been put into AMD CPU design instead. Debt that haunted AMD for over 15 years.

You can't take credit for fixing an issue when you've directly caused that very issue.
 
Incredible. Who would've thought that TPU had so many tech/finance experts among their readership? I guess you must all be billionaires, with decades of top-level expertise in semiconductor design and manufacture.
I've learned so much, in such a short time listening to you experts. Thanks /s.
My armchair has so much top-level expertise it lets me travel at the speed of light. To be fair, it doesn't take a rocket scientist to understand how a rocket works. Some people like to know specific details; others don't care about those same details, so they don't bother trying to learn them like the next guy does. We gather here to share what we know. Period.

/ontopic

It was predicted that Intel would take a huge loss before a respectable, competitive card was brought to market. I thought that number would be in the billions, maybe even double digits. That depends on how persistent Intel wants to be and how much abuse Intel will take from shareholders before giving up. This is just the first instance of some entity suggesting Intel jump off a long pier into a deep lake.
 
Incredible. Who would've thought that TPU had so many tech/finance experts among their readership? I guess you must all be billionaires, with decades of top-level expertise in semiconductor design and manufacture.
I've learned so much, in such a short time listening to you experts. Thanks /s.
Maybe you ought to revisit the definition of what a forum is supposed to be.

;)

And next, consider how your post adds anything to it.
 
I agree; better to sell the IP to someone else (maybe AMD, to help them with streaming and encoding). Intel has been doing GPUs for ages; there's no excuse for their failure with Arc. Maybe they could've focused on the low end, launching something better than a 1030 for cheap and going up from there; targeting the 3070 was very ambitious.
With ATI/AMD and nVidia we had a lot of competition, with graphics cards launched nearly every 6 months. It's only in the last few generations that AMD struggled a bit, but since RDNA, AMD is back (ignoring nVidia fanboys, especially among reviewers), so we will have competition again.
Wtf would AMD want something Raja Koduri's hands were on!? He was fired after the Vega fiasco.

Incredible. Who would've thought that TPU had so many tech/finance experts among their readership? I guess you must all be billionaires, with decades of top-level expertise in semiconductor design and manufacture.
I've learned so much, in such a short time listening to you experts. Thanks /s.
And you proved to be a donkey
 
Raja is not the incompetent fool some seem to think he is.
Well, OK. He may not be that incompetent fool, but, as @medi01 said, he doesn't know how to rein in expectations. Hence this quote...
But in terms of embarrassing statements made, it is absolutely solely Raja. They stopped at AMD the second he left, and they started popping out of Intel once he got active there.
The guy just doesn't know how to set expectations. Always underpromise, always. And then when it overdelivers, great; people will think of you as a miracle worker. But if you overpromise and then underdeliver as we've recently seen, then you get this kind of shit.
 
Buying ATI put AMD in massive debt, billions of dollars that should have been put into AMD CPU design instead. Debt that haunted AMD for over 15 years.

You can't take credit for fixing an issue when you've directly caused that very issue.
Given the bribing by Intel, I don't think it would've helped to have a better CPU; Athlons were way better than the Pentium 4, yet AMD struggled to gain market share.
 
It basically took AMD 7 years between the Radeon 7000 series and RDNA2 (not coincidentally their PS4 and PS5 architecture) to fix their graphics division. Intel needs to be committed to 7 years of products also.

It took more than 7 years from Bulldozer to Ryzen. Intel needs to be in it for the long haul. I don't think the fact their first product is bad is that surprising.
It's definitely true for Bulldozer to Ryzen; they took a huge beating from betting early that more cores = better, and they made the cores too weak to compete in the high-end market (although I am ignoring some of the blatant anti-competitive stuff that happened in this time frame as well).

As for the GPUs, the first-generation R series (R9 290X, 290, 280X, etc.) was a great series. They really struggled after that and made poor early investment choices in HBM, which was expensive and made it hard to put more than 4 GB of HBM on the first generation, leaving those cards inferior in many aspects to Nvidia's offerings. It took years, but they are finally competitive again on the GPU front.

Honestly, Intel needs to double down and invest in their GPUs. Not just so consumers have alternate options, but because Intel is going to need some alternative revenue streams. Not to mention that at this point cutting it off would be a waste; it would not only damage their reputation but also have them write off a huge amount of money.
 
This guy was also responsible for the HDR gaming that we have today. Anyway, that's not the point: firstly, this was his first attempt at Intel, and secondly, at AMD they were budget-strapped, so he has always been taking on jobs against all odds.
LOL. Yeah. That tech that barely works on 95% of all monitors on the market.

Fan-tas-tic. Against all odds indeed - now, that's my point as well. Apparently he knows how to present significant risk as a good story, and that results in overpromising and underdelivering. And part of that reality is: he can't deliver. I never said the guy is dumb. He's just not good at planning and at timing/reading the market. His biggest problem is all the PR preceding a release, really. Both Vega and Arc could have been small hits in specific segments, but no, it has to be the top GPU for everyone. This also echoes in the design choices, in both cases, and in the amount of stuff that has to be in there. If you look at the feature list for Arc, and the GamersNexus coverage on it, it speaks volumes about that. And the poor delivery then really just proves the whole point. This was also the case with Vega - most owners could tweak the card better themselves, and that says something in an age where competitors' GPUs boost fully automatically.

Every time, the results are unrefined, hardware-focused, blunt instruments. But GPUs have been at a level where high efficiency and refinement are key for a while now. We're only just now entering another era, seemingly, of more blunt instruments scaling performance by adding power and die size - one where the two other players already have their refinement done or underway - and Intel is merely beginning at both.
 
Maybe I don't understand business, but selling Intel Arc doesn't make a lot of sense to me. Arc is built on the Xe microarchitecture, which is also used in nearly every client CPU Intel makes. I don't see Intel selling the rights to use Xe technology since they are profiting from it.

Moreover, Intel already spent the $3.5 billion; selling or killing Arc won't change that. Today Intel has Arc, at least in a lab somewhere. The question is: can Intel get a return on the dollars they invest after today?
 
The question is: can Intel get a return on the dollars they invest after today?
I think the fact we need to ask the question is enough reason for the JP article.

Thing is, usually when you bring a new tech into your product portfolio and you invest big in it, you have a long-term plan of capturing markets with it, and it has a decent chance of success - something calculated from numerous factors that are predictable. But if Intel can't take a meaningful bite of market share, what's the point? Is the market even big enough for them to recoup that money by the time they get to it?

And that comes down to the question: can Intel catch up from the start they've got? It means they'll have to run faster than two competitors that have been in the race for decades and are very good at it. This is not a disruptive sort of technology like ARM. This is 'another GPU on x86'. It is forced to play by the very same rules as everyone else making those products.

About Xe: yes, it's in every CPU, but they already had integrated graphics in every CPU before Xe, so they'll just rebrand it back to Intel Iris or whatever.

I don't honestly see a single piece of Arc or Xe that is somehow unique, a design win or redefining the market in any way, and that's really the biggest problem. There isn't one reason you should really have that Intel GPU.
 
Maybe I don't understand business, but selling Intel Arc doesn't make a lot of sense to me. Arc is built on the Xe microarchitecture, which is also used in nearly every client CPU Intel makes. I don't see Intel selling the rights to use Xe technology since they are profiting from it.
Then sit back and don't worry about it. Intel will do their thing.
Moreover, Intel already spent the $3.5 billion; selling or killing Arc won't change that. Today Intel has Arc, at least in a lab somewhere. The question is: can Intel get a return on the dollars they invest after today?
Investments do not turn a profit overnight. Ever. After the R&D is done, the building and marketing begins.



I love all the people who fail to understand that a product launch of this scale takes time, money, and effort. The Core Duo line of CPUs was amazing, but it didn't take the world by storm. It was a process that took over 3 years before launch, and even then it took over 6 months to build up. Ryzen was no different, and neither was the RTX line of GPUs. It ALWAYS goes this way.
 
I think the fact we need to ask the question is enough reason for the JP article.

Thing is, usually when you bring a new tech into your product portfolio and you invest big in it, you have a long-term plan of capturing markets with it, and it has a decent chance of success - something calculated from numerous factors that are predictable. But if Intel can't take a meaningful bite of market share, what's the point? Is the market even big enough for them to recoup that money by the time they get to it?

And that comes down to the question: can Intel catch up from the start they've got? It means they'll have to run faster than two competitors that have been in the race for decades and are very good at it. This is not a disruptive sort of technology like ARM. This is 'another GPU on x86'. It is forced to play by the very same rules as everyone else making those products.

About Xe: yes, it's in every CPU, but they already had integrated graphics in every CPU before Xe, so they'll just rebrand it back to Intel Iris or whatever.

I don't honestly see a single piece of Arc or Xe that is somehow unique, a design win or redefining the market in any way, and that's really the biggest problem. There isn't one reason you should really have that Intel GPU.
Reminds me of Optane.
 
This is only common sense and to be expected.
The optimists didn't expect it. /truestorybro.
:D

Then sit back and don't worry about it. Intel will do their thing.
Yes they will, and they won't sell popcorn for the fireworks that may be upcoming.

Well, OK. He may not be that incompetent fool, but, as @medi01 said, he doesn't know how to rein in expectations.
I don't think anyone claimed Raja was any good at management; IIRC he was more of a doer than a leader.

I love all the people who fail to understand that a product launch of this scale takes time, money, and effort. The Core Duo line of CPUs was amazing, but it didn't take the world by storm. It was a process that took over 3 years before launch, and even then it took over 6 months to build up. Ryzen was no different, and neither was the RTX line of GPUs. It ALWAYS goes this way.
Every new product introduction had/has its share of naysayers and shills; there is no getting around the fact that people just need to wait and see whether something is accepted once it has proven worthy of opening the wallet.
 
The optimists didn't expect it. /truestorybro.
I did. I am actually surprised the figure wasn't higher. To me, this says Intel is doing things carefully and with a well-thought-out plan. Less than 5 billion is nothing in the long term for Intel. If the expenses (loss) had been 7 or 8 billion, then yes, there would be reason for concern. Less than $5bn tells me they're on the right track and making solid progress.
Every new product introduction had/has its share of naysayers and shills; there is no getting around the fact that people just need to wait and see whether something is accepted once it has proven worthy of opening the wallet.
100% agree! The proof is in the pudding, as the saying goes. Intel does have an uphill battle. They also seem to have a plan of action. My theory is the same now as it was before the Core 2 line was released: they have something good cooking, and that something is going to be a success. It will be very interesting to see how it all plays out. Time will tell.
 