Tuesday, January 14th 2025

Intel 12th Gen "Alder Lake" Mobile CPUs Face Retirement, HX-series Spared

Intel product change notification documents—published on January 6—have revealed the planned "End of Life" (EOL) phase-out of 12th Generation "Alder Lake" mobile processor models. Tom's Hardware has pored over the listed products/SKUs and concluded that the vast majority of Team Blue's mobile-oriented Alder Lake selection is destined for retirement, while the HX series is being kept alive for a little while longer. Two documents show differing "discontinuance timelines" for their respective inventories—covering lower-end Celeron and Pentium Gold SKUs as well as the familiar Core i3, i5, i7, and i9 families. U, P, H, and HK-suffixed models are lined up for the chopping block.

Intel's 13th Generation "Raptor Lake" mobile processor selection—comprising Core 100 (Series 1) and Core 200 (Series 2)—offers a similar silicon makeup. Many equivalent alternatives to older generation "Alder Lake" chips reside here—Tom's Hardware presented a key example: "i5-1235U, which is designated for thin and lightweight laptops. OEMs can instead opt for the i5-1335U, the Core 5 120U, or the Core 5 220U, as they're just better bins of the 1235U on the same FCBGA1744 socket." A significant number of Alder Lake mobile SKUs will be available to OEMs for ordering up until April 26, with final shipments heading out on October 25. The rest have been assigned a July 25 order cut-off date, with final shipments scheduled for January 26, 2026.
Source: Tom's Hardware

19 Comments on Intel 12th Gen "Alder Lake" Mobile CPUs Face Retirement, HX-series Spared

#1
user556
Meanwhile, the desktop retail 12400F (no functioning E-cores) is still the top-selling Intel part, and the only Intel model to consistently land somewhere in the top ten of sales charts, as it has been for multiple years now.
#2
freeagent
I was just thinking, I mean it makes sense for them to do it, but at the same time, why take away the good chips and leave the defective ones on the market?

Unless not all of them were defective, but I was under the impression that they all were, 13th and 14th gen both.
#3
JcRabbit
freeagent: I was just thinking, I mean it makes sense for them to do it, but at the same time, why take away the good chips and leave the defective ones on the market?

Unless not all of them were defective, but I was under the impression that they all were, 13th and 14th gen both.
You answered your own question - it makes sense for them to do it. Take away all the good chips and people are left with no option but to buy the bad chips (or none at all, and go AMD). And yes, all 13th and 14th gen chips are prone to degradation - I would not be surprised if, years from now, we learn that the latest microcode updates only delayed the inevitable rather than prevented it.
#4
bug
Intel 12th Gen "Alder Lake" Mobile CPUs Face Retirement
A significant number of Alder Lake mobile SKUs will be available to OEMs for ordering up until April 26, with final shipments heading out on October 25. The rest have been assigned a July 25 order cut-off date, with final shipments scheduled for January 26, 2026.
"Face retirement" now means being available for at least another year. Noted.
#5
freeagent
JcRabbit: You answered your own question - it makes sense for them to do it.
I actually didn't, I misread the thread title.
#6
Vayra86
JcRabbit: You answered your own question - it makes sense for them to do it. Take away all the good chips and people are left with no option but to buy the bad chips (or none at all, and go AMD). And yes, all 13th and 14th gen chips are prone to degradation - I would not be surprised if, years from now, we learn that the latest microcode updates only delayed the inevitable rather than prevented it.
Yep. I'm not touching any Intel CPU for the next five years at least. They're all suspect, and there is no conclusive proof that the problem is fixed. Not even a new node and product is real proof. The only possible proof is simply building trust that this never happens again, an affair that simply takes time. The market, the node, and the situation for Intel aren't getting easier either, so that will certainly bring new dilemmas to their table.
#7
Wirko
Vayra86: Yep. I'm not touching any Intel CPU for the next five years at least. They're all suspect, and there is no conclusive proof that the problem is fixed. Not even a new node and product is real proof. The only possible proof is simply building trust that this never happens again, an affair that simply takes time. The market, the node, and the situation for Intel aren't getting easier either, so that will certainly bring new dilemmas to their table.
At least Intel put a lot of effort into fixing what can be fixed, I genuinely believe that. I'd only avoid buying their highest-clocked, highest-volted processors right now; whatever else I bought, I'd rather not overclock.

But there's a certain probability that all chip manufacturers, not just Intel, are about to encounter chip degradation issues at 3 nm and below. We'll see (but not soon enough).
#8
trparky
Vayra86: Yep. I'm not touching any Intel CPU for the next five years at least. They're all suspect, and there is no conclusive proof that the problem is fixed. Not even a new node and product is real proof. The only possible proof is simply building trust that this never happens again, an affair that simply takes time. The market, the node, and the situation for Intel aren't getting easier either, so that will certainly bring new dilemmas to their table.
Me too. I'm going strictly AMD for processors for the foreseeable future.
#9
NoLoihi
freeagent: I was just thinking, I mean it makes sense for them to do it, but at the same time, why take away the good chips and leave the defective ones on the market?

Unless not all of them were defective, but I was under the impression that they all were, 13th and 14th gen both.
Uhm, there were multiple problems and all of them have been brought in line. If failure rates were still above base rates for any product in that market, we’d probably have heard about it by now and … I guess they would also open themselves up to a lawsuit?
JcRabbit: You answered your own question - it makes sense for them to do it. Take away all the good chips and people are left with no option but to buy the bad chips (or none at all, and go AMD). And yes, all 13th and 14th gen chips are prone to degradation - I would not be surprised if, years from now, we learn that the latest microcode updates only delayed the inevitable rather than prevented it.
Microchips will experience electromigration when used. As far as I know, this applies to all of them: it applies to GlobalFoundries' latest, it applies to all the chips TSMC makes for AMD and Nvidia, and it certainly applies to Intel Foundry's products as well. It's inevitable, only a question of how long until your nice EPYC or Threadripper or Snapdragon or SSD controller fails. That time may well be dozens of years past the general public's use for the product, but (again, as I understand it), it's never going to be infinite, no matter how much you baby the chips, and failure with use is inevitable.

I don’t think this thread will attract such a person, but it’d be nice to have someone with in-depth knowledge confirm this.
#10
trparky
NoLoihi: Microchips will experience electromigration when used. As far as I know, this applies to all of them: it applies to GlobalFoundries' latest, it applies to all the chips TSMC makes for AMD and Nvidia, and it certainly applies to Intel Foundry's products as well. It's inevitable, only a question of how long until your nice EPYC or Threadripper or Snapdragon or SSD controller fails. That time may well be dozens of years past the general public's use for the product, but (again, as I understand it), it's never going to be infinite, no matter how much you baby the chips, and failure with use is inevitable.

I don't think this thread will attract such a person, but it'd be nice to have someone with in-depth knowledge confirm this.
True, all chips will eventually succumb to electromigration. However, certain factors can exacerbate the issue and cause it to happen prematurely. In the case of Intel's 13th and 14th gen chips, running them at extreme voltages and excessive temperatures caused them to die prematurely.
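For a rough reference, the standard first-order model here is Black's equation, which relates the median time-to-failure from electromigration to current density and temperature. The constants are process-specific and not something Intel publishes, so treat this as a back-of-the-napkin sketch of why voltage and heat compound each other rather than anything vendor-confirmed:

```latex
% Black's equation for electromigration-limited lifetime:
%   MTTF - median time to failure
%   A    - process-dependent constant
%   J    - current density (rises with voltage and clocks)
%   n    - empirical exponent, commonly around 1-2
%   E_a  - activation energy; k - Boltzmann constant; T - absolute temperature
% The exp(E_a / kT) term shrinks quickly as T climbs, so higher voltage
% (larger J) and higher temperature (smaller exponential) multiply together.
\[
  \mathrm{MTTF} = A \, J^{-n} \exp\!\left(\frac{E_a}{k\,T}\right)
\]
```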
#11
NoLoihi
trparky: True, all chips will eventually succumb to electromigration. However, certain factors can exacerbate the issue and cause it to happen prematurely. In the case of Intel's 13th and 14th gen chips, running them at extreme voltages and excessive temperatures caused them to die prematurely.
First off, if you read how people speak about this issue, that is not how they see it: they believe only some Intel chips suffer from this failure. That said, you're right (I was literally pondering editing in a sentence on which factors the intensity of electromigration generally depends when I got notified of your reply :D ), though it should be pointed out that one source of the extremity was a mistake in the power-management firmware, through which cores got run at very high voltage during C-state transitions (hope I'm remembering this right), i.e. when idling cores and putting them back to work again. That can happen very, very often each session, depending on OS and usage patterns, so the damage accumulated just as if you had applied too much voltage while OC-ing and left it running that way.

Like, I should add, that voltage excursion very likely did nothing for performance, nothing at all, it was just a genuine screw-up; yet people insistently frame it as "Oh, look how karma is punishing greed by the minute! Those evil cretins, it's gonna serve them so well. …" And yeah, uhh, no. The best you could do is argue about how quality control didn't catch it.
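To put numbers on how that accumulates, here's a purely illustrative back-of-the-envelope sketch; every figure in it is hypothetical, not a measured or published value:

```python
# Toy estimate of cumulative time spent at an elevated voltage when each
# C-state transition briefly spikes the core voltage. All numbers below
# are made-up illustrations, not measured or published Intel figures.

transitions_per_second = 50   # hypothetical idle/wake churn under a bursty load
spike_duration_s = 200e-6     # assumed length of each voltage excursion
hours_on_per_day = 8          # assumed daily uptime

seconds_per_day = transitions_per_second * spike_duration_s * hours_on_per_day * 3600
print(f"~{seconds_per_day:.0f} s/day at elevated voltage")  # ~288 s/day
print(f"~{seconds_per_day * 365 / 3600:.0f} h/year")        # ~29 h/year
```

Under those assumptions, the chip quietly logs dozens of hours per year at overclock-like voltage without the user ever touching a slider.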
#12
Wasteland
NoLoihi: Microchips will experience electromigration when used. As far as I know, this applies to all of them: it applies to GlobalFoundries' latest, it applies to all the chips TSMC makes for AMD and Nvidia, and it certainly applies to Intel Foundry's products as well. It's inevitable, only a question of how long until your nice EPYC or Threadripper or Snapdragon or SSD controller fails. That time may well be dozens of years past the general public's use for the product, but (again, as I understand it), it's never going to be infinite, no matter how much you baby the chips, and failure with use is inevitable.
Sure, eventually everything fails, but that's a truism. CPUs traditionally last much, much, much longer than they are relevant. They're widely considered the most reliable part in any given computer, and for good reason. Their failure rate, once they've passed initial validation, is minuscule. If you're troubleshooting, it's possible that the CPU is to blame, but there are almost always half a dozen other things to check off the list first.

If one particular CPU model cuts the expected lifespan from multiple decades down to a small handful of years, that's a capital-B Big Deal. I agree with @Vayra86. The problem here is trust. With a Raptor Lake CPU, the troubleshooting calculus changes. If I had one, I'd be jumping at shadows. Ultimately, a working chip that I can't trust is just as bad as a broken one. From a certain point of view, it may even be worse.
#13
NoLoihi
Wasteland: Sure, eventually everything fails, but that's a truism. CPUs traditionally last much, much, much longer than they are relevant. They're widely considered the most reliable part in any given computer, and for good reason. Their failure rate, once they've passed initial validation, is minuscule. […]
If one particular CPU model cuts the expected lifespan from multiple decades down to a small handful of years, that's a capital-B Big Deal. […]
You see, I don't think any major CPU vendor, not even Loongson, has published any commitment to the public about how long they actually expect (and engineer) their chips to last. The oft-cited figure of decades beyond usefulness is just anecdotal (I'm sure), and possibly overstated (even if you were to power on one of AMD's original "x86" clone chips*, 1975's Am9080, it presumably wouldn't be one that had been turned on and doing calculations for the past fifty years); there's just literally no data on how long recent, small-lithography chips will last through actual use. They're too new for that.
Let's see how that turns out. AMD's stuff since Zen 1 or 2 allegedly self-tunes for age (I remember reading about that, might have been on AnandTech), which, surely, might only make it run slower at first; but since they already felt they needed to engineer in some compensatory mechanism for aging, there must have been good indications it would become noticeable within customers' expected usage regime.
*It's a clone of the 8080, so not really x86, and yes, I've had to look that bit up. ;)

It'd be nice to have external verification for sure; customers shouldn't have to rely on companies' word, given how often (it feels like) lies are caught. I, personally, wouldn't mind buying Intel again, yet maybe, when the time comes, it'll be AMD due to price. (I have been burned by hyped AMD before, so I do hold a bit of a grudge. I'm also annoyed they always drag their feet on ancillary support like codecs and stuff.) If I'm buying for family and it's a laptop, it'll be price and battery life. We don't game, and performance from any vendor is basically plenty.
#14
bug
@NoLoihi If anything, older CPUs, built with much bigger transistors, are more likely to last longer.
#15
NoLoihi
bug: @NoLoihi If anything, older CPUs, built with much bigger transistors, are more likely to last longer.
Yes, indeed. Conversely, this means that TSMC's (future AMD's, Nvidia's, Apple's) finest, N2, or Intel's best, 18A, are gonna be a liability for wear all on their own. :toast:
#16
bug
NoLoihi: Yes, indeed. Conversely, this means that TSMC's (future AMD's, Nvidia's, Apple's) finest, N2, or Intel's best, 18A, are gonna be a liability for wear all on their own. :toast:
It's not going to help, but part of migrating to smaller nodes is mitigating some of these losses.
#17
trparky
It doesn't help things that Intel has been running their latest chips hotter and hotter to be able to beat benchmarks.
#18
Wirko
NoLoihi: Yes, indeed. Conversely, this means that TSMC's (future AMD's, Nvidia's, Apple's) finest, N2, or Intel's best, 18A, are gonna be a liability for wear all on their own. :toast:
"Compared to N2, the new N1 node offers either 8% lower power consumption at the same performance and lifetime, or 4% more performance and 7% shorter lifetime at the same power. That's in addition to 10% higher transistor density (0.3% for SRAM)." :cheers: At the same time, Intel is busy developing new nodes in PowerPoint, while Samsung is just a distant memory. :cheersagain: But AI chips will still pay for themselves three times over before they die.
#19
NoLoihi
trparky: It doesn't help things that Intel has been running their latest chips hotter and hotter to be able to beat benchmarks.
Whereas AMD’s been running theirs hotter for the joy and unbridled fun it brings alone? :roll: