Tuesday, July 26th 2022

Apple Removes Remaining Intel Components from M2 MacBooks

Apple has removed the final remaining Intel components from its latest M2 MacBooks: the Intel JHL8040R USB4 retimer chips have been replaced with a pair of custom U09PY3 chips. The change was discovered by iFixit during a recent teardown and documented by Twitter user SkyJuice; the exact reason for the switch is unknown. The move away from Intel USB4 retimers is also visible in AMD's latest Rembrandt laptops, which are switching to parts such as the KB8001 "Matterhorn" from Swiss startup Kandou, which claims to supply five of the six largest PC OEMs with such chips.
Source: iFixit (via @SkyJuice60)

35 Comments on Apple Removes Remaining Intel Components from M2 MacBooks

#1
MarsM4N
Guess they have to print new stickers then. :cool:

Apple sold 28,958,000 iMacs in 2021, so even if they save only 50 cents per chip, it's still a whopping $14,479,000.
But I don't think they did it for the money. They mostly wanted to get more security.
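As a quick sanity check on the back-of-the-envelope figure above (unit sales and per-chip savings as quoted in the comment; the product actually works out to about $14.5 million, not $1.4 million):

```python
# Back-of-the-envelope check of the per-chip savings figure quoted above.
units_sold = 28_958_000   # Macs sold in 2021, as quoted in the comment
savings_per_chip = 0.50   # assumed savings per chip, in USD

total_savings = units_sold * savings_per_chip
print(f"${total_savings:,.0f}")  # → $14,479,000
```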
Posted on Reply
#2
R-T-B
MarsM4NBut I don't think they did it for the money. They mostly wanted to get more security.
I doubt it has anything to do with either of those. This is (like many apple things) all about control and bringing more silicon under their umbrella of control is in line with that.
Posted on Reply
#3
dgianstefani
TPU Proofreader
R-T-BI doubt it has anything to do with either of those. This is (like many apple things) all about control and bringing more silicon under their umbrella of control is in line with that.
Walled garden bullshit. Same reason they refuse to use USB-C.
Posted on Reply
#4
Bomby569
MarsM4NGuess they have to print new stickers then. :cool:

Apple sold 28,958,000 iMacs in 2021, so even if they save only 50 cents per chip, it's still a whopping $14,479,000.
But I don't think they did it for the money. They mostly wanted to get more security.
I doubt there were any savings given the money they spent developing their own chips; I bet they lost money, and a lot of it.
Posted on Reply
#5
Chaitanya
So where are they getting their Thunderbolt controllers from?
Posted on Reply
#6
Fourstaff
I am impressed with their dedication. Wonder what they are trying to achieve here.
Posted on Reply
#7
Bomby569
FourstaffI am impressed with their dedication. Wonder what they are trying to achieve here.
I think this all started because of the increasing power consumption and heat of x86 CPUs.
Posted on Reply
#8
MarsM4N
R-T-BI doubt it has anything to do with either of those. This is (like many apple things) all about control and bringing more silicon under their umbrella of control is in line with that.
Agreed, they're focusing more on in-house production. Full independence. :)

Maybe they're trying to avoid future supply chain disruptions. That was quite a problem for Apple during the pandemic.

Posted on Reply
#9
Aquinus
Resident Wat-man
FourstaffI am impressed with their dedication. Wonder what they are trying to achieve here.
Probably the same thing they've pursued in the past: vertical integration. Just look at some of the bigger corporations in Japan that have done the same; they're among the most successful as well.
Posted on Reply
#11
Fourstaff
Bomby569I think this all started because of the increasing power consumption and heat of x86 CPUs.
That's fair, but why take it to this level?
AquinusProbably the same thing that they have in the past, vertical integration. Just look at some of the bigger corporations in Japan that have done the same thing, they're some of the most successful as well.
At some point they are going to hit diminishing returns with this strategy though. They probably think they haven't hit that point yet.
Posted on Reply
#12
AnarchoPrimitiv
Bomby569I think this all started because of the increasing power consumption and heat of x86 CPUs.
Not so much on AMD's side; their Rembrandt chips are very efficient, their desktop chips are more efficient than Intel's too, and there's no doubt their GPUs will win on efficiency in the upcoming generation as well. I'd be willing to bet that AMD could push efficiency even further if it weren't for Intel and Nvidia producing seriously power-hungry chips, basically requiring AMD to follow suit to compete on total performance. From AMD's point of view, the fact that consumers are buying Intel's Alder Lake and Nvidia GPUs, and will continue to buy Raptor Lake and Lovelace despite atrocious power consumption, signals that efficiency doesn't sell, so why bother going down that path?

BTW, this is a PERFECT example of the rebound effect (the observation that technological gains in efficiency have never resulted in a net decrease in energy consumption), and of how the "free market" fails at delivering what is needed now more than ever: lower NET power consumption. I know most people would scream bloody murder, but perhaps there should be greater regulation of power consumption for consumer CPUs and GPUs. In the current environment this would hand AMD a huge advantage (reviews have shown that when Rembrandt and Alder Lake mobile chips are both limited to 15 W, for example, the Rembrandt chip outperforms Alder Lake; and by all reports Nvidia is pushing Lovelace power consumption to the absolute limit because they think they need to in order to beat RDNA3, so power-limited GPUs would probably hurt Nvidia). So it would undoubtedly result in companies like Nvidia and Intel, and probably even AMD, just to keep higher power limits as an option, lobbying and legally bribing (i.e. campaign contributions) government officials to prevent such a necessary measure.

Anyway, x86 does NOT have to be inefficient, and can handle most of what most people require at low power consumption. Maybe not as well as ARM in some applications, but perhaps something in x86 can be altered in the future to address that.
Posted on Reply
#13
LFaWolf
dgianstefaniWalled garden bullshit. Same reason they refuse to use USB-C.
My M1 MacBook Air has two USB-C ports and my 2017 MacBook Pro has three or four.
Posted on Reply
#14
dgianstefani
TPU Proofreader
LFaWolfMy M1 MacBook Air has two USB-C ports and my 2017 MacBook Pro has three or four.
Apple sells over 228 million iPhones a year, which use Lightning, compared to around 20 million Macs per year.
Posted on Reply
#17
medi01
Bomby569I think this all started because of the increasing power consumption and heat of x86 CPUs.
Even the Ryzen 4000 mobile series beat the M1 in perf/watt in a number of tasks, native vs. native, despite being on an inferior process.

This all apparently started because of the wish to increase margins even further.
Posted on Reply
#18
TheUn4seen
FourstaffI am impressed with their dedication. Wonder what they are trying to achieve here.
My guess is money and control over the supply chain, leading to more money in the long term. Obsession with controlling their suppliers seems to be a theme with Apple.
Posted on Reply
#19
JustBenching
AnarchoPrimitivNot so much on AMD's side; their Rembrandt chips are very efficient... Anyway, x86 does NOT have to be inefficient, and can handle most of what most people require at low power consumption.
If people understood that efficiency needs to be compared at the same wattage, maybe these myths about AMD being more efficient would stop being repeated all the time.
Posted on Reply
#20
LFaWolf
dgianstefaniApple sells over 228 million iPhones a year, which use Lightning, compared to around 20 million Macs per year.
But the news article is about MacBooks, no? Why talk about iPhones?
Posted on Reply
#21
dgianstefani
TPU Proofreader
LFaWolfBut the news article is about MacBooks, no? Why talk about iPhones?
Please consider the concept of related and common topics before making such a comment...
Posted on Reply
#22
medi01
fevgatosefficiency needs to be compared at same wattage
That is not how products are used.
Posted on Reply
#23
LFaWolf
dgianstefaniPlease consider the concept of related and common topics before making such a comment...
Well, even though they are made by the same company, the products are vastly different and are run by different divisions of the company. Design decisions and component sourcing are made by different people, who may or may not even talk to each other.

It is nearly like comparing apples and oranges. It is like me ranting about how much I dislike Windows Server 2016 because my Xbox died of the red ring of death. They are unrelated.
Posted on Reply
#24
JustBenching
medi01That is not how products are used.
It is how products are used. CPUs have a one-click setting for the maximum wattage you want them to draw. The only people who don't use it are the people who want to complain that a product isn't efficient.

That could be the start of a good joke:
- I have my CPU pulling 900 watts.
- Why?
- So I can complain on the forums that it's not efficient.
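For what it's worth, the wattage cap described above can also be set from the OS. A minimal sketch for Linux using the kernel's powercap (RAPL) sysfs interface, assuming a typical Intel system (paths vary by machine; requires root, and the interface is Intel-specific):

```shell
# Inspect and cap package power via the Linux powercap (intel_rapl) interface.
RAPL=/sys/class/powercap/intel-rapl:0

cat "$RAPL/name"                          # usually "package-0"
cat "$RAPL/constraint_0_power_limit_uw"   # current long-term limit, microwatts

# Cap the package at 65 W (65,000,000 microwatts); needs root:
echo 65000000 > "$RAPL/constraint_0_power_limit_uw"
```

AMD systems and vendor tools expose similar knobs under different names, but the principle is the same: the power limit is a configurable setting, not a fixed property of the chip.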
Posted on Reply
#25
Lew Zealand
dgianstefaniApple sells over 228 million iPhones a year, which use Lightning, compared to around 20 million Macs per year.
Apple settled on the Lightning connector before USB-C existed, and then the Android industry squabbled over a number of competing USB-based connectors for years before eventually arriving at the decent USB-C standard. It seems like the best business move: stay with the design that works best for you, wait until everyone else gets their act together, and then ride out your design until required to change.
Posted on Reply