
Intel Expects to Lose More Market Share, to Reconsider Exiting Other Businesses

What an inept CEO, and he continues to earn an indecent salary


:)
Intel's current position isn't Gelsinger's fault. It'll take years for his changes in strategy to have any effect, just as it took Lisa Su years to turn AMD around. Brian Krzanich is the one that allowed the rot to set in at Intel. Of course, he was also paid an extortionate amount whilst doing so, including a $50m golden handshake when he stepped down.
 
Pat said last year that Intel was back as the leader and that AMD's good run was over. A year later, AMD has too much momentum to be caught?
I'm guessing the new Intel CPUs won't beat Ryzen 7000 series.
big.LITTLE is basically good for two generations of 'stretching Core further to keep up'.

Dramatic TDP bumps are still required, core counts only increase on the E-cores, and they're already pushed way beyond the efficiency curve... All I see here is a continuation of what Intel has been doing since Sandy Bridge. It won't last, and Pat knows it. And honestly, I don't think they're building better CPUs at all. They're different: more bursty, harder to cool, and effectively pushing laptop power technology to desktops, because that's all they really have to make baby steps.

What I do know is that the 3570K took no more than 77W total package power (the TDP on the spec sheet) to OC it to 4.4 GHz all-core on 4 'P' cores.
Then I got an 8700K that would eat 120W to push 6 cores to 4.7 GHz, with no OC headroom to speak of.
Now we're looking at triple that 3570k's power budget for 8 cores at 5.0~5.4 GHz. That's 10+ years worth of Intel Core development for you... How much longer will Intel push the monolithic button before they admit defeat? And how silly are we as users for chasing those higher power bills?
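Taking the figures in that post at face value, here's a rough back-of-envelope sketch. The metric (core-GHz per watt) deliberately ignores IPC gains, and the modern chip's clock and power are assumptions read off the "5.0~5.4 GHz" and "triple that 3570K's budget" remarks, so treat it as an illustration of the power-scaling point, not a benchmark:

```python
# Crude efficiency comparison using the quoted power/clock figures.
# core-GHz per watt ignores IPC, so this only illustrates raw power scaling.
chips = {
    "i5-3570K (4C @ 4.4 GHz OC)":   (4, 4.4, 77),       # 77W package power, per the post
    "i7-8700K (6C @ 4.7 GHz)":      (6, 4.7, 120),      # 120W, per the post
    "modern 8P-core (~5.2 GHz)":    (8, 5.2, 3 * 77),   # "triple that 3570K's budget" (assumed)
}

for name, (cores, ghz, watts) in chips.items():
    print(f"{name}: {cores * ghz / watts:.3f} core-GHz/W")
```

By this crude yardstick the modern part actually delivers fewer core-GHz per watt than the 3570K did, which is the poster's complaint: clock and core-count gains have been bought almost entirely with power.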
 
It's kinda shocking that a company as successful as Intel has been over the last decade seems to be on the ropes.
I'd argue it's the exact opposite ~ they did virtually nothing last decade! In the 2000s, post-Conroe, they had a good run till Sandy Bridge, & then they thought milking users forever on 4 cores & changing their stupid motherboards every year was a winning strategy. So unless you're looking purely at the financials, they failed hard for almost a decade!
Intel’s major problem is execution.
Their biggest problem was greed ~ they were too greedy for way too long!
 
Intel’s major problem is execution. Sapphire Rapids may be an interesting product from an engineering standpoint, but it is extremely late. Its launch is anything but rapid.

Late delivery erodes customer trust. And even if Intel technically delivers a superior product, customers who switched to Epyc won't instantaneously switch back. Gelsinger shouldn't be so quick to assume they will. For Intel to start gaining market share again, it must deliver superior products on time and in volume. Genoa is looking mighty massive… when will Intel catch up?
I agree. The delays in their key products hurt them more than the competition did. Now that AMD and ARM have become viable alternatives, it just adds to Intel's pain. In my opinion, if they continue down this path of failing to deliver on time, they may never recover, unless the competition trips as badly as they did.
 
Recover how ~ to like 90~95% of the server market, similar numbers for desktop, & possibly even higher for laptops, like at their peak? That'll never happen, so Intel should plan their future accordingly!
 
I would also disagree with a previous poster who said that their GPU efforts are "rubbish". No. It actually makes sense, because computing has long since evolved beyond the pure CPU space. AMD and Nvidia make a lot of money with GPUs outside of gaming. Nvidia's datacenter revenue last quarter surpassed gaming and was up 61%, so that's massive growth in that sector. And mirroring Intel's GPU efforts, Nvidia is working on a CPU (Grace) to complete their ecosystem.
Intel should have stuck with Larrabee and fixed it. They would be in a much better position in the datacenter space right now, with a strong CPU + GPU computing package. Now they have to play catch-up again, but it is certainly not "rubbish". They need a strong GPU segment in order to compete in the long run (5+ years).
The one key consideration, every time, in semiconductors is that you need staying power and dedication.

Larrabee, but also the 'one off' CPU designs that improved on the GPU aspect, are all brainfarts with no dedication attached. That's the problem. And then we get Arc: a massive heap of misfires in project planning, a totally bad assessment of what the market would want in a GPU and what to prioritize in its development, and overall a completely failed launch. But not only that, it is also a GPU stack that currently holds absolutely nothing in terms of potential design wins. It's just another GPU trying to get all the metrics right, but behind on each and every one. It's not power efficient, it doesn't do certain things better, and it doesn't have shortcuts to becoming better.

If that's not 'rubbish', what is? Where is the snippet of Intel's history that should provide the confidence there is dedication this time? I'm not seeing it, other than more promises.

It's not like the competition is going to stop making faster GPUs either.
 
I'd argue it's the exact opposite ~ they did virtually nothing last decade! In the 2000s, post-Conroe, they had a good run till Sandy Bridge, & then they thought milking users forever on 4 cores & changing their stupid motherboards every year was a winning strategy. So unless you're looking purely at the financials, they failed hard for almost a decade!

They still made some good products.

I personally enjoyed all the Intel CPUs I owned in that time span: 2500K, 4790K, 5820K, 6700K, 9900K. I did briefly own a 10700K and an 11500, but those were given away after I messed around with them for a couple of weeks... My 6700K system just recently got decommissioned after 7 years of service; not bad, really, for a quad core.

The socket thing never bothered me back then. A decent mobo wasn't nearly as expensive as it is today; even my X99 board was cheap by today's standards. I typically swap the motherboard when upgrading the CPU regardless, unless I'm using a CPU as a placeholder, like I did with my 5800X (that I paid 50 USD for) till the 5950X dropped to MSRP...

The 8700K and some of their lower-end HEDT options were also pretty great, and even if 10th gen and 11th gen were meh, it was still a miracle they could compete at all on an obviously inferior node.

I tend to buy whatever I find interesting; I nearly dropped 1,500 USD on an Alder Lake platform (CPU/mobo/RAM) just for fun. Glad I grabbed an OLED to use as a secondary media monitor instead...
 
It makes me think: all the performance came from silly frequency bumps, which boost power consumption too. E-cores are being added, but that can't go on forever. It seems Intel is running out of options for what to do next. If the next two years are going to bring leadership for Intel, it won't be due to a monolithic die and increasing E-core counts. This can't go on forever, and I'm sure Intel knows it. I can only hope they figure out something better for the future of CPUs, because I don't think E-cores and ramping up power consumption with frequency bumps are going to cut it.
If AMD decides to go with E-cores as well, Intel won't stand a chance. They need something big very soon.
 
How much longer will Intel push the monolithic button before they admit defeat? And how silly are we as users for chasing those higher power bills?
Nothing is stopping Intel from making their own version of chiplets, and I don't see how choosing to would count as 'defeat'.
That word doesn't belong in descriptions of tech strategies...

Anyway, they can keep the big.LITTLE thing plus chiplets, if they choose to, combined with the tiles that come after RL.

And what does it say about AMD's chiplet design if it can only match (give or take), and not totally beat, Intel's big.LITTLE?
 
I'd argue it's the exact opposite ~ they did virtually nothing last decade! In the 2000s, post-Conroe, they had a good run till Sandy Bridge, & then they thought milking users forever on 4 cores & changing their stupid motherboards every year was a winning strategy. So unless you're looking purely at the financials, they failed hard for almost a decade!

Their biggest problem was greed ~ they were too greedy for way too long!
Capitalism is inherently greedy… the capitalist’s greed entices him or her to produce goods and services in exchange for $$$.
 
This is also a signal to the other members of the cartel, who find it easy to decrypt: go ahead, up your prices now. (and don't worry, we'll follow when we can)
 
Capitalism is inherently greedy… the capitalist’s greed entices him or her to produce goods and services in exchange for $$$.
Yes, but it doesn't really stop them from innovating. Do you remember one major innovation from Intel this last decade? Ok, I'll concede AVX & its various 'flavors' ~ what else?
 
Did Intel run out of glue? I'm sure if they just keep adding E-cores, they will keep up with AMD's midrange products...

But in all seriousness, this is a sad day for the CPU market. I'm not a fan of Intel, but I'm also not going to enjoy AMD competing only with themselves; we have all seen how that ends!
 
And what does it say about AMD's chiplet design if it can only match (give or take), and not totally beat, Intel's big.LITTLE?
It tells us that all manufacturers have come close to the limits imposed by physics, and even closer to the limits imposed by the combination of physics and economics.
 
It tells us that all manufacturers have come close to the limits imposed by physics, and even closer to the limits imposed by the combination of physics and economics.
This is the best post so far. Semiconductor technology, pushing electrons through copper wires and transistor gaps, is at the end of its technological life.

Studies for my job in the laser industry have shown a huge increase in nanophotonics research. It's time for the age of the photon!
 
Good luck with that, & no: as AMD, Nvidia, and Apple have shown, there's at least a decade left before we bid adieu to Si ~ Intel just needs to work harder! It's funny how easily people forget that Intel led AMD on the node front for more than a decade, from Conroe to Zen 3 (2?) IIRC. Sure, they had a superior uarch, but clearly the physics played a big part in their success.
 
And where are the results of all that, with triple the budget and twice as many employees as AMD has?
 
Intel should exit the Arc business. Shareholders are frustrated over Arc and believe it will lose a tremendous amount of money before becoming profitable.

The United States of America needs Intel to be successful. The tables have turned, and they need to focus on their core businesses: server, laptop, ARM, and desktop.
 
Hi,
Too many Lakes is why.
 
Nothing is stopping Intel from making their own version of chiplets, and I don't see how choosing to would count as 'defeat'.
That word doesn't belong in descriptions of tech strategies...

Anyway, they can keep the big.LITTLE thing plus chiplets, if they choose to, combined with the tiles that come after RL.

And what does it say about AMD's chiplet design if it can only match (give or take), and not totally beat, Intel's big.LITTLE?
What all of it says is that you do indeed need the combination of chiplets and big.LITTLE to even make a dent in the CPU performance ceiling at any kind of decent TDP.

The low-hanging fruit is gone; or, one could say, you need all of it in one basket to make something that truly excels. Progress on CPUs now is about stretching the ceiling ever further to make a longer bar of bench points.

Defeat it shall be, however. Intel stuck with its old strategy for far too long, and they've been forced to adapt several times already. That's defeat: they had a strategic outlook and it completely misfired. The same thing happened with Optane. It happened with first-gen Arc and Xe. And it happened by sitting on quad cores for far too long, and then by sitting on 10nm for far too long. Had enough? Hang on! They also missed the ARM train, and still are missing it ~ the biggest problem going forward for them.

I know tech companies don't like to call this defeat, because they're always winning in their boardrooms, but really, some well-paid big egos made some pretty gruesome and long-lasting mistakes here. The fact that an underdog like AMD could come out swinging is a testament to that.

I certainly hope Pat's got some pretty big aces up his sleeve for 2025.


If we want a live example of a tech company that seems to have learned the lesson in tech 'you can't ever stop moving ahead of the pack', you need to be looking at Nvidia. Whatever we think of their methods, the fact is, every year they have something that raises eyebrows. Every time they seem to push and stretch things a bit further, and that's not just regarding their architectural improvements. They're on the ball in every aspect of their product portfolio: software most importantly. They know they're not selling a GPU, they're selling a thing people do with it, and they're actively contributing to those worlds.

Intel historically has done very little of that, and wherever they DID show such dedication, they gained and kept market share. For CPU that would be things like AVX, but also the close relationship with their supply chain and making sure product gets Intel inside. Again, whatever you think of their methods, this kind of dedication is what makes any piece of hardware shine - their laptop dedication for example is also the reason we've got those bursty though pretty efficient consumer CPUs. A trait AMD hasn't been able to push out on Ryzen quite as well just yet.

To circle back to the opening of this post: I'm looking forward to Intel and AMD stealing each other's tech and making a real step forward. Chiplet-based core complexes of different kinds? Go. It is then, and only then, that software can be tailored to make the best use of big.LITTLE and the properties of chiplets.
 
And what does it say about AMD's chiplet design if it can only match (give or take), and not totally beat, Intel's big.LITTLE?
Hi,
Not much more to say; the article explains exactly why nobody is impressed anymore by simply adding more voltage :cool:
Intel's CEO Pat Gelsinger announced that he expects the company to continue to lose its market share to AMD as the competition has "too much momentum" going for it. AMD's Ryzen and EPYC processors continue to deliver power and efficiency performance figures, which drives customers towards the company.
 
Certainly better than those dumba**** who blew such a tremendous lead over AMD in less than half a decade! Just 3 years, in fact ~ they had it in them, but deliberately chose to suck the life out of their long-paying customers. How else do you explain the 4c-10c top-end MSDT on the same 14nm node, 7700K to 10900K, in the time since Ryzen launched :rolleyes:

And why the heck do you need 4 motherboards to go from 4c to 10c on the same uarch :slap:
 
"Intel's CEO Pat Gelsinger announced that he expects the company to continue to lose its market share to AMD as the competition has "too much momentum" going for it. AMD's Ryzen and EPYC processors continue to deliver power and efficiency performance figures, which drives customers towards the company."

Damn, that's like admitting defeat! Not even blaming the Covid pandemic, the war in Russia, or inflation/recession for it. o_O He must be drinking himself into a coma after the AMD Zen 4 reveal event, lol.

Well, we saw the impacts coming closer & closer: 3D V-Cache, continuing process node enhancements, & better power/performance. And the Zen 4 3D V-Cache models aren't even out yet.
It absolutely sounds like Intel has to watch from the sidelines as Zen 4 dominates the gaming desktop market.
 