Friday, September 9th 2022

Intel Expects to Lose More Market Share, to Reconsider Exiting Other Businesses

During the Evercore ISI TMT conference, Intel acknowledged that the company would continue to lose market share, with a possible bounce back in the coming years. According to the latest report, Intel's CEO Pat Gelsinger announced that he expects the company to continue to lose market share to AMD as the competition has "too much momentum" going for it. AMD's Ryzen and EPYC processors continue to deliver strong performance and efficiency figures, which drives customers towards the company. On the other hand, Intel expects to field a competitive product, especially in the data center business, with its Sapphire Rapids Xeon processors set to arrive in 2023. Pat Gelsinger noted, "Competition just has too much momentum, and we haven't executed well enough. So we expect that bottoming. The business will be growing, but we do expect that there continues to be some share losses. We're not keeping up with the overall TAM growth until we get later into '25 and '26 when we start regaining share, material share gains."

The only down years, expected to show the toll of strong competition, are 2022 and 2023. As for a bounceback, Intel targets 2025 and 2026. "Now, obviously, in 2024, we think we're competitive. 2025, we think we're back to unquestioned leadership with our transistors and process technology," noted CEO Gelsinger. Additionally, he commented on the emerging Arm CPUs competing for the same server market share as Intel and AMD, stating that "Well, when we deliver the Forest product line, we deliver power performance leadership versus all Arm alternatives, as well. So now you go to a cloud service provider, and you say, 'Well, why would I go through that butt ugly, heavy software lift to an ARM architecture versus continuing on the x86 family?'"
Finally, Pat Gelsinger emphasized that the company will continue to exit businesses where it doesn't thrive. Just as it did with Optane memory, we could see Intel pulling out of other markets that don't necessarily align with the leadership's vision. Given that the CEO has appointed a new leadership group and carried out major restructuring, it will be interesting to see what comes of this. Just a few days ago, we saw the appointment of Shlomit Weiss as senior vice president and Co-GM of the Design Engineering Group.
Source: via Tom's Hardware

68 Comments on Intel Expects to Lose More Market Share, to Reconsider Exiting Other Businesses

#26
Vayra86
andreiga76Pat said last year that Intel is back as the leader and that AMD had a good run but it's over; a year later, AMD has too much momentum to be caught?
I'm guessing the new Intel CPUs won't beat Ryzen 7000 series.
big.LITTLE is basically good for two generations of 'stretching Core further to keep up'.

Dramatic TDP bumps still required, core count only increases on E-cores, they're already pushed way beyond the efficiency curve... All I see here is a continuation of what Intel has been doing since Sandy Bridge. It won't last and Pat knows it. And honestly, I don't think they're building better CPUs at all. They're different, more bursty, harder to cool, and effectively pushing laptop power technology to desktops, because that's all they really have to make baby steps.

What I do know is that a 3570K took no more than 77W total package power (the TDP on the spec sheet) to OC it to 4.4 GHz all-core on 4 'P' cores.
Then I got an 8700K that would eat 120W to push 6 cores to 4.7 GHz and had no OC headroom to speak of.
Now we're looking at triple the 3570K's power budget for 8 cores at 5.0~5.4 GHz. That's 10+ years' worth of Intel Core development for you... How much longer will Intel push the monolithic button before they admit defeat? And how silly are we as users for chasing those higher power bills?
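As a back-of-the-envelope check on the numbers above, here's a tiny sketch comparing "all-core GHz per watt" across the three chips mentioned. The 3570K and 8700K figures are the poster's own; the ~231 W / 8 P-cores @ 5.2 GHz figure is an assumption derived from "triple that power budget", and the metric deliberately ignores IPC gains, so it only illustrates the power trend, not real efficiency.

```python
def core_ghz_per_watt(cores, ghz, watts):
    """Crude throughput-per-watt proxy: all-core clock * core count / package power.
    Ignores IPC, so it only tracks the raw power trend across generations."""
    return cores * ghz / watts

i3570k = core_ghz_per_watt(4, 4.4, 77)    # poster's figures: 4 cores @ 4.4 GHz, 77 W
i8700k = core_ghz_per_watt(6, 4.7, 120)   # poster's figures: 6 cores @ 4.7 GHz, 120 W
modern = core_ghz_per_watt(8, 5.2, 231)   # assumed: 8 P-cores @ 5.2 GHz, 3x the 3570K's 77 W

print(f"3570K: {i3570k:.3f}  8700K: {i8700k:.3f}  modern: {modern:.3f}")
```

By this crude yardstick, clock-per-watt has actually gone backwards at the top end, which is the poster's point; real per-core performance still improved thanks to IPC, which this metric ignores.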
Posted on Reply
#27
R0H1T
oxrufiioxoIt's kinda shocking a company as successful as intel has been over the last decade seems to be on the ropes.
I'd argue it's the exact opposite; they did virtually nothing last decade! In the 2000s, post-Conroe, they had a good run up till Sandy Bridge, & then they thought milking users forever on 4 cores & changing their stupid motherboards each year was a winning strategy. So unless you're looking purely at the financials, they failed hard for almost a decade!
NanochipIntel’s major problem is execution.
Their biggest problem was greed ~ they were too greedy for way too long!
Posted on Reply
#28
watzupken
NanochipIntel’s major problem is execution. Sapphire Rapids may be an interesting product from an engineering standpoint, but it is extremely late. Its launch is anything but rapid.

Late delivery erodes customer trust. And even if Intel technically does deliver a superior product, customers who switched to EPYC won’t instantaneously switch back. Gelsinger shouldn’t be so quick to assume this. For Intel to start to gain market share again, it must deliver superior products on time and in volume. Genoa is looking mighty massive… when will Intel catch up?
I agree. The delays in their key products hurt them more than the competition did. Now that AMD and Arm have become viable alternatives, it just adds to Intel’s pain. In my opinion, if they continue on this path of failing to deliver on time, they may never recover, unless the competition trips up as badly as they did.
Posted on Reply
#29
R0H1T
Recover to what, the 90~95% of the server market, similar numbers for desktop, & possibly even higher for laptops they had at their peak? That'll never happen, so Intel should plan their future accordingly!
Posted on Reply
#30
Vayra86
RogueSixI would also disagree with a previous poster who said that their GPU efforts are "rubbish". No. It actually makes sense because computing has long evolved beyond the pure CPU space. AMD and nVidia make a lot of money with GPUs outside of gaming. The datacenter revenue of nVidia last quarter surpassed gaming and was up +61% so massive growth in that sector. And mirroring Intel's GPU efforts, nVidia is working on a CPU (nVidia Grace) to complete their eco system.
Intel should have stuck with Larrabee and fixed it. They would be in a much better position in the datacenter space right now with a strong CPU + GPU computing package. Now they have to play catch-up game again but it is certainly not "rubbish". They need a strong GPU segment in order to compete in the long run (5+ years).
The one key consideration every time in semiconductor is that you need a long breath and dedication.

Larrabee, but also the 'one off' CPU designs that improved on the GPU aspect, are all brainfarts with no dedication attached. That's the problem. And then we get Arc: a massive heap of misfires in project planning, a totally bad assessment of what the market would want in a GPU and what to prioritize in its development, and just overall a completely failed launch. Not only that, it is also a GPU stack that currently holds absolutely nothing in terms of potential design wins. It's just another GPU trying to get all the metrics right, but behind on each and every one. It's not power efficient, it doesn't do certain things better, and it doesn't have shortcuts to becoming better.

If that's not 'rubbish', what is? Where is the snippet of Intel's history that should provide the confidence there is dedication this time? I'm not seeing it, other than more promises.

It's not like the competition is going to stop making faster GPUs either.
Posted on Reply
#31
oxrufiioxo
R0H1TI'd argue it's the exact opposite, they did virtually nothing last decade! In the 2k's post Conroe they had a good run till Sandy Bridge & then they thought milking users forever on 4 cores & changing their stupid motherboards each year was a winning strategy, so unless you're looking purely on the financials front they failed hard for almost a decade!
They still made some good products.

I personally enjoyed all the Intel CPUs I owned in that time span: 2500K, 4790K, 5820K, 6700K, 9900K. I did briefly own a 10700K and an 11500, but those were given away after I messed around with them for a couple of weeks... My 6700K system just recently got decommissioned after 7 years of service; not bad, really, for a quad core.

The socket thing never bothered me back then; a decent mobo wasn't nearly as expensive as it is today. Even my X99 board was cheap by today's standards. I typically swap motherboards when upgrading the CPU anyway, unless I'm using a CPU as a placeholder like I did with my 5800X (that I paid 50 USD for) till the 5950X dropped to MSRP...

The 8700K and some of their lower-end HEDT options were also pretty great, and even if 10th gen and 11th gen were meh, it was still a miracle they could somewhat compete at all on an obviously inferior node.

I tend to buy whatever I find interesting; I nearly dropped 1500 USD on an Alder Lake platform (CPU/mobo/RAM) just for fun. Glad I grabbed an OLED to use as a secondary media monitor instead...
Posted on Reply
#32
ratirt
It makes me think: all the performance came from silly frequency bumps, which boost power consumption too. E-cores are being added, but this can't go on forever. It would seem Intel is running out of options for what to do next. If the next two years are going to bring leadership for Intel, it won't be due to a monolithic die and growing E-core counts. This can't go on forever, and I'm sure Intel knows it. I can only hope they figure out something better for the future of CPUs, because I think E-cores and ramping up power consumption with frequency bumps is not going to cut it.
If AMD decides to go the E-core route as well, Intel won't stand a chance. They need something big very soon.
Posted on Reply
#33
Dirt Chip
Vayra86How much longer will Intel push the monolithic button before they admit defeat? And how silly are we as users for chasing those higher power bills?
Nothing is stopping Intel from making their own version of chiplets, and I don't see how choosing to do so would count as 'defeat'.
That word doesn't belong in describing tech strategies...

Anyway, they can keep the big.LITTLE thing plus chiplets, if they choose so, combined with the tiles that come after RL.

And what does it say about AMD's chiplet design if it can only match (give or take) and not totally beat Intel's big.LITTLE?
Posted on Reply
#34
Nanochip
R0H1TI'd argue it's the exact opposite, they did virtually nothing last decade! In the 2k's post Conroe they had a good run till Sandy Bridge & then they thought milking users forever on 4 cores & changing their stupid motherboards each year was a winning strategy, so unless you're looking purely on the financials front they failed hard for almost a decade!

Their biggest problem was greed ~ they were too greedy for way too long!
Capitalism is inherently greedy… the capitalist’s greed entices him or her to produce goods and services in exchange for $$$.
Posted on Reply
#35
Wirko
This is also a signal to the other members of the cartel, who find it easy to decrypt: go ahead, up your prices now. (and don't worry, we'll follow when we can)
Posted on Reply
#36
R0H1T
NanochipCapitalism is inherently greedy… the capitalist’s greed entices him or her to produce goods and services in exchange for $$$.
Yes, but it doesn't really stop them from innovating. Do you remember one major innovation from Intel this last decade? OK, I'll probably concede AVX & its various 'flavors' ~ what else?
Posted on Reply
#37
stimpy88
Did Intel run out of glue? I'm sure if they just keep adding E-cores, they will keep up with AMD's midrange products...

But in all seriousness, this is a sad day for the CPU market. I'm not a fan of Intel, but I'm also not going to enjoy AMD competing only with themselves; we have all seen how that will end!
Posted on Reply
#38
Wirko
Dirt ChipAnd what does it say about AMD's chiplet design if it can only match (give or take) and not totally beat Intel's big.LITTLE?
It tells us that all manufacturers have come close to the limits imposed by physics, and even closer to the limits imposed by the combination of physics and economics.
Posted on Reply
#39
Daven
WirkoIt tells us that all manufacturers have come close to the limits imposed by physics, and even closer to the limits imposed by the combination of physics and economics.
This is the best post so far. Semiconductor technology, pushing electrons through copper wires and transistor gaps, is at the end of its life.

Studies for my job in the laser industry have shown a huge increase in nanophotonics research. It's time for the age of the photon!
Posted on Reply
#40
R0H1T
Good luck with that, and no: as AMD, Nvidia, and Apple have shown, there's at least a decade left before we bid adieu to Si ~ Intel just needs to work harder! It's funny how people so easily forget that Intel led AMD on the node front for more than a decade, from Conroe to Zen 3 (2?) IIRC. Sure, they had a superior uarch, but clearly the physics played a big part in their success.
Posted on Reply
#41
1d10t
Where's all of that triple budget and twice as many employees that Intel has over AMD?
Posted on Reply
#43
Ravenas
Intel should exit the Arc business. Shareholders are frustrated over Arc and believe it will lose a tremendous amount of money before becoming profitable.

The United States of America needs Intel to be successful. The tables have turned, and they need to focus on core businesses: server, laptop, ARM, and desktop.
Posted on Reply
#45
Vayra86
Dirt ChipNothing is stopping Intel from making their own version of chiplets, and I don't see how choosing to do so would count as 'defeat'.
That word doesn't belong in describing tech strategies...

Anyway, they can keep the big.LITTLE thing plus chiplets, if they choose so, combined with the tiles that come after RL.

And what does it say about AMD's chiplet design if it can only match (give or take) and not totally beat Intel's big.LITTLE?
What all of it says is that you indeed need the combination of chiplets and big.LITTLE to even make a dent in the CPU performance cap at any kind of decent TDP.

The low-hanging fruit is gone; or, one could say, you need all of it in one basket to make something that truly excels. Progress on CPUs now is about stretching the ceiling ever further to make a longer bar of bench points.

Defeat, however, it shall be. Intel stuck with its old strategy for far too long, and they've been forced to adapt several times already. That's defeat: they had a strategic outlook and it completely misfired. The same thing happened with Optane. It happened with first-gen Arc and Xe. And it happened by sitting on quad cores for far too long, and then by sitting on 10nm for far too long. Had enough? Hang on! They also missed the ARM train, and still are missing it - the biggest problem going forward for them.

I know tech companies don't like to call this defeat, because they're always winning in their boardrooms, but really, some well-paid big egos made some pretty gruesome and long-lasting mistakes here. The fact that an underdog could come out swinging like AMD did is a testament to that.

I certainly hope Pat's got some pretty big aces up his sleeve for 2025.


If we want a live example of a tech company that seems to have learned the lesson in tech that 'you can't ever stop moving ahead of the pack', you need to look at Nvidia. Whatever we think of their methods, the fact is, every year they have something that raises eyebrows. Every time, they seem to push and stretch things a bit further, and that's not just in their architectural improvements. They're on the ball in every aspect of their product portfolio, software most importantly. They know they're not selling a GPU; they're selling the things people do with it, and they're actively contributing to those worlds.

Intel historically has done very little of that, and wherever they DID show such dedication, they gained and kept market share. For CPUs, that would be things like AVX, but also the close relationship with their supply chain, making sure product gets Intel Inside. Again, whatever you think of their methods, this kind of dedication is what makes a piece of hardware shine; their laptop dedication, for example, is also the reason we've got those bursty though pretty efficient consumer CPUs. A trait AMD hasn't been able to push out on Ryzen quite as well just yet.

To circle back to the opening of this post: I'm looking forward to Intel and AMD stealing each other's tech and making a real step forward. Chiplet-based core complexes of different kinds? Go. Then, and only then, can software be tailored to make the best use of big.LITTLE and the properties of chiplets.
Posted on Reply
#46
ThrashZone
Dirt ChipAnd what does it say about AMD's chiplet design if it can only match (give or take) and not totally beat Intel's big.LITTLE?
Hi,
Not much more to say; the article says exactly why nobody is impressed anymore by simply adding more voltage :cool:
Intel's CEO Pat Gelsinger announced that he expects the company to continue to lose its market share to AMD as the competition has "too much momentum" going for it. AMD's Ryzen and EPYC processors continue to deliver power and efficiency performance figures, which drives customers towards the company.
Posted on Reply
#47
AM4isGOD
PapaTaipeiSo many experts here! ;)
Everyone is a microarchitecture expert here.

I'd love to see what some of them could come up with.
Posted on Reply
#48
R0H1T
Certainly better than those dumba**** who blew such a tremendous lead over AMD in less than half a decade! Just 3 years, in fact ~ they had it in them but deliberately chose to suck the life out of their long-paying customers. How do you explain the 4c-10c top-end MSDT on the same 14nm node then, 7700K-10900K, in the time since Ryzen launched :rolleyes:

And why the heck do you need 4 motherboards to go from 4c to 10c on the same uarch :slap:
Posted on Reply
#49
MarsM4N
"Intel's CEO Pat Gelsinger announced that he expects the company to continue to lose its market share to AMD as the competition has "too much momentum" going for it. AMD's Ryzen and EPYC processors continue to deliver power and efficiency performance figures, which drives customers towards the company."

Damn, that's like admitting defeat! Not even blaming the COVID pandemic issues, the Russian war, or inflation/recession for it. o_O He must be drinking himself into a coma after the AMD Zen 4 reveal event, lol.

Well, we saw the impacts coming closer & closer: 3D V-Cache, continuing process node enhancements, & better power/performance. And the Zen 4 3D V-Cache models aren't even out yet.
It absolutely sounds like Intel has to watch from the sidelines as Zen 4 dominates the gaming desktop market.
Posted on Reply
#50
TheoneandonlyMrK
Vayra86big.LITTLE is basically good for two generations of 'stretching Core further to keep up'.

Dramatic TDP bumps still required, core count only increases on E cores, they're already pushed way beyond efficient curve... All I see here is a continuation of what Intel was doing since Sandy Bridge. It won't last and Pat knows it.
Exactly this.

But also, I think the chip shortage has exacerbated everyone's issues. A little-known thing that's happening is chip brokerage firms; they have exploded in size and scope and are now fully exploitative.

In one case I know of, a reel of chips that used to cost £300 literally costs a million now.

These aren't modern chips, so they have a scarcity to be fair, but still.

I think the fact that stuff costs what it does has directly driven many to make do, and has affected the bottom line of Intel and others.

Hopefully they can hold out and deliver on their fluff, because their PR is starting to wear thin on some lines (Arc).
Posted on Reply