Friday, March 27th 2020

Apple ARM-Based MacBooks and iMacs to come in 2021

Apple has been working on replacing Intel CPUs in its product lineup for a while now, and the first batch of products featuring the new Arm-based CPUs should be coming soon. Having a completely custom CPU inside its MacBook and iMac devices will allow Apple to take full control of the performance and security of those machines, just as it did with its iPhone models. Apple has proved that its custom-built CPUs based on the Arm Instruction Set Architecture (ISA) can be very powerful and match Intel's best offerings, all while being much more efficient, with a TDP of only a few Watts.

According to analyst Ming-Chi Kuo, Apple has started an "aggressive processor replacement strategy", which should show results around Q4 2020 or Q1 2021. According to Kuo, the in-house design approach will bring not only tighter control of the system but also a financial benefit, as the custom processor will be 40% to 60% cheaper than current Intel CPU prices.
[Image: Apple 16-inch MacBook Pro]
Source: AppleInsider

98 Comments on Apple ARM-Based MacBooks and iMacs to come in 2021

#76
watzupken
While there have been rumors about this switch for a long time, I feel Apple's A-series SoCs are indeed ready to take on the low-power Intel chips. I suspect this switch will happen in the MacBook Air space first, since the existing A12X/Z is capable of taking on Intel's 15W chips. In terms of software, I am sure developers are more than happy to optimize apps for the mobile SoCs for better performance, rather than running an emulated version.

I feel Intel's real threat has always been the mobile SoCs, rather than AMD. Sure, the mobile chips are nowhere near as fast in, say, Windows 10, but as long as they perform well enough and offer significantly better battery life and slimmer form factors, they will start to chip away at Intel's dominance in the laptop space.

Honestly, I am not sure why Crysis got pulled into this discussion specifically, but I do feel mobile SoCs are actually up to the task of light gaming. Think of the Nintendo Switch, which uses an aged Tegra SoC and runs quite a number of graphically intensive games. Sure, there are a lot of compromises and a lot of serious optimization involved in getting them to run, but it is proof of a concept that works. Also, if we are comparing integrated graphics, even Intel's graphics are rubbish and can barely play any modern games. Yet they sell very well in laptops, because most people just don't care about GPUs that can run Crysis or any modern game smoothly. And if I apply this to the MacBook Air, I feel most people who use it are not really gamers, since the Air was never meant to be a gaming laptop. Which is why I feel that if Apple is to transition to ARM, this is a good opening point.
#77
Frick
Fishfaced Nincompoop
ARFIt's not only Geekbench. I can compare the responsiveness of my phone, with its MediaTek MT6750T (an 8-core ARM phone CPU), against my desktop, and to be honest the desktop feels less responsive, yet it consumes a heck of a lot more power.
This may have been addressed, but there is much to that discussion, especially when compared to Windows. Android doesn't carry legacy baggage from several decades of operating systems, there is no such thing as a universal Android image (they have to be compiled for your specific circuits), and you don't use them the same way (though that is getting muddier)... There is no direct comparison, and the same goes for ARM. Comparing performance directly between the two is very hard.
#78
ARF
FrickThis may have been addressed, but there is much to that discussion, especially when compared to Windows. Android doesn't carry legacy baggage from several decades of operating systems, there is no such thing as a universal Android image (they have to be compiled for your specific circuits), and you don't use them the same way (though that is getting muddier)... There is no direct comparison, and the same goes for ARM. Comparing performance directly between the two is very hard.
Having to compile for the specific circuit is an advantage, an opportunity, and a strength; it means there is a good chance you will get the maximum potential performance.
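To make that concrete, here is a minimal sketch (the loop is a generic example; the flags are standard clang/gcc options, and the specific target names are only examples):

```c
/* saxpy.c - a trivial loop whose generated code depends heavily on the
 * compilation target.
 *
 * Generic build (runs on any x86-64, conservative code):
 *   cc -O2 saxpy.c -c
 * Tuned for the exact machine doing the build (vectorizes with whatever
 * SIMD units that CPU actually has):
 *   cc -O3 -march=native saxpy.c -c
 * Cross-compiling for one specific ARM core, the way a phone vendor can:
 *   clang -O3 --target=aarch64-linux-gnu -mcpu=cortex-a76 saxpy.c -c
 */
#include <stddef.h>

void saxpy(float *restrict y, const float *restrict x, float a, size_t n)
{
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];   /* auto-vectorizes when the target is known */
}
```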
Smartphones are very responsive, fast, and pleasant to use.
x86 PCs are not responsive; very often they are laggy, cause stuttering, and are in general not in any way pleasant to use.
This might be my subjective opinion, but it is my feedback and you have to consider it.

Software developers are among the members of our society who take the largest salaries, and yet their products have bugs, backdoors, and security vulnerabilities, along with being awfully optimised in the case of PC software, and developers always seek to do the least work and take the shortest, easiest path to the goal.

Tell me, is it normal that in RDR2 an overclocked Core i9-9900K at 5 GHz with a Radeon RX 590 gets 37 FPS on average at 1080p? www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html

Is it normal that 14 years after Crysis' release, the game is still an example of a badly optimised title and is still used as a benchmark for modern PC hardware?

I don't think it's normal.
#79
Aquinus
Resident Wat-man
ARFx86 PCs are not responsive; very often they are laggy, cause stuttering, and are in general not in any way pleasant to use.
I wouldn't have been on skt2011 for 9 years if that were the case. Even mobile x86 machines are pretty fast today considering their TDP. I guess that dynamic changes if you're talking about the skt939 machine in your specs, which is 15 years old, but if you're comparing something like that to a new iPhone 11 Pro, then yeah, it's going to feel slow, because it's a relic in comparison.
ARFSoftware developers are among the members of our society who take the largest salaries, and yet their products have bugs, backdoors, and security vulnerabilities, along with being awfully optimised in the case of PC software, and developers always seek to do the least work and take the shortest, easiest path to the goal.
You speak about this like you've never worked in the field of software development. As a software engineer, I take particular offense at this. No matter how good a developer you are, you're likely going to write software that has bugs at one point or another. You're also assuming that it's the developer's fault most of the time. There have been plenty of cases where I've written code, written tests for it, and was over 99% certain that it would do what it's supposed to, but what it was supposed to do was inaccurately conveyed to me, so the "bug" was a misunderstanding of how a feature was supposed to operate. I'll admit that I'll be one of the first people to say "work smarter, not harder," but not all bugs are the developer's fault. There are plenty of times where "what the application is supposed to do" is communicated between people in something other than programming languages. Remember, when I'm given a task, I'm being told what to do. If someone can't tell me what they need or why they need it and I have to fill in the gaps because I can't get a good answer, then this is going to happen. That isn't to say there aren't bad devs; it's just not as clear-cut as you would imagine.
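A contrived illustration of that last point (the function and the requirement are hypothetical, not from any real project): the code and its test agree with each other perfectly, and the "defect" lives entirely in the ambiguous requirement.

```c
/* Hypothetical example: the ticket said "split the bill evenly".
 * The developer read that as integer division; the customer meant
 * "nobody underpays". Code and test agree with each other, so the
 * test suite passes: the defect lives in the requirement, not here. */
#include <assert.h>

int share_per_person(int total_cents, int people)
{
    return total_cents / people;          /* truncates: 100/3 -> 33 */
}

int main(void)
{
    /* This test encodes the developer's reading of the ticket and passes. */
    assert(share_per_person(100, 3) == 33);
    /* The customer expected 34 (round up so the bill is covered);
     * that expectation was never written down anywhere. */
    return 0;
}
```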

With that said, if you don't like how developers do their jobs, I urge you to try to do it yourself, and to do it better. Software engineers get paid decently for a reason: it's not because everyone can do it, and of those who can, not all do it well, and salary usually reflects that if you don't get canned.
#80
Frick
Fishfaced Nincompoop
ARFHaving to compile for the specific circuit is an advantage, an opportunity, and a strength; it means there is a good chance you will get the maximum potential performance.
Sure. But it makes comparing hard, which was the point.
Smartphones are very responsive, fast, and pleasant to use.
I don't think I have ever encountered a smartphone that is actually pleasant. Not even the proper high-end ones. Everything has a very slight lag to it. But this could just be me.
x86 PCs are not responsive; very often they are laggy, cause stuttering, and are in general not in any way pleasant to use.
Disagree, which is fine.
This might be my subjective opinion, but it is my feedback and you have to consider it.
One can have opinions about facts, and I know the difference between facts and opinions is getting blurry to people (for many reasons), but one can still be wrong. The argument was that it is really hard to compare ARM directly to x86 (especially on Windows) because they are massively different. Me not thinking smartphones are pleasant is subjective and an opinion, the same as what you say about PCs, and that is fine. The problem is when Geekbench or whatever says "ARM is faster than x86, look at this graph", because it isn't that simple. Apple going for ARM in MacBooks and iMacs definitely tells us that ARM is screaming forward, which is a good thing ... but it's still silly to compare phones to computers in any scenario. You can do it of course, but it really should be responded to.

I won't even try to take on the thing about devs, partly because Aquinus already did and also because wow.
Tell me, is it normal that in RDR2 an overclocked Core i9-9900K at 5 GHz with a Radeon RX 590 gets 37 FPS on average at 1080p? www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html
I haven't played the game, so I have no idea how it looks or whether that performance makes sense, but on the whole, yes, it's normal. The downside of gaming on PC with a need or want to play at maxed-out settings is that it's a constant game of catch-up. That's how it works.
Is it normal that 14 years after Crysis' release, the game is still an example of a badly optimised title and is still used as a benchmark for modern PC hardware?
Also normal, because it's a meme at this point and no other game has that joke attached to it, so Crysis is the baseline in a way. And Crysis was actually very well optimised once you started fiddling with settings; you didn't need a beast of a machine for it to still look good. Maxed-out everything at 4K is probably handled weirdly, as it's still pretty hard to hit good FPS numbers there. But Crysis is an outlier. Some games are just weirdly made: the swamp levels in Jedi Outcast lagged pretty badly when I last played them, which was on an i3 machine, and that is very bad for a game based on the Quake 3 engine.
#81
Vya Domus
FrickMaxed-out everything at 4K is probably handled weirdly, as it's still pretty hard to hit good FPS numbers there. But Crysis is an outlier.
I played Crysis and Crysis Warhead at 4K60 maxed just fine.
ARFSoftware developers are among the members of our society who take the largest salaries, and yet their products have bugs, backdoors, and security vulnerabilities, along with being awfully optimised in the case of PC software, and developers always seek to do the least work and take the shortest, easiest path to the goal.
You're ignorant beyond any reasonable point. I changed my mind, I sincerely hope you're just trolling.
#82
THANATOS
ARFThese results are pretty interesting. Intel Atoms have comparable TDP, but their multi-core performance is only 15% of the ARM-based counterpart's, and their single-core performance is only 18% of an ARM-based counterpart's.

That means Intel violates the EU standards for energy efficiency.
Are you seriously comparing the Atom x5-Z8350, which was introduced in Q1 2016, against ARM-based SoCs introduced in Q3-Q4 2019? That's a 3.5-year difference!
BTW, 2W for the Atom x5-Z8350 is not TDP but SDP.
I don't know where you got the 2.5-3W TDP for those ARM SoCs.
#83
Vya Domus
THANATOSI don't know where you got the 2.5-3W TDP for those ARM SoCs.
Most high-performance SoCs draw around 10W under burst loads.
#85
notb
ARF:kookoo: 10 watts in your smartphone will mean that you won't be able to touch your phone.
Why? You're definitely overestimating these 10W.

Mainstream smartphone SoCs are around 5W; those for tablets reach 7W. I bet they boost higher.
That's roughly the same power draw you see in ULV x86 chips, like Intel's -Y lineup.

Seriously, it's just transistors inside. You have to change the state of some number of them to perform a calculation. There's no magic.
Some optimizations are possible, but architectures made on a similar node won't differ in efficiency by 10 times, as you suggested earlier.
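A back-of-envelope version of that argument, using the textbook CMOS dynamic-power relation P = a*C*V^2*f. Every constant below is an illustrative assumption, not data for any real chip:

```c
/* Back-of-envelope CMOS dynamic power: P = alpha * C * V^2 * f.
 * All constants here are illustrative assumptions, not chip data. */
#include <stdio.h>

int main(void)
{
    double alpha = 0.2;       /* activity factor: fraction of gates switching */
    double C     = 1.0e-9;    /* effective switched capacitance, farads */
    double f     = 2.5e9;     /* clock, Hz */

    double v_low = 0.80, v_high = 1.00;  /* assumed operating voltages */

    double p_low  = alpha * C * v_low  * v_low  * f;
    double p_high = alpha * C * v_high * v_high * f;

    printf("P(0.80 V) = %.2f W\n", p_low);          /* ~0.32 W per such block */
    printf("P(1.00 V) = %.2f W\n", p_high);         /* ~0.50 W per such block */
    printf("ratio     = %.2fx\n", p_high / p_low);  /* ~1.56x, nowhere near 10x */
    return 0;
}
```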
#86
Vya Domus
ARF10 watts in your smartphone will mean that you won't be able to touch your phone.
Oh really? Guess some of us are superhuman when we pick up something like a 200W gaming laptop.
ARFQualcomm aims at 2.5 to 3W TDP for phones
fudzilla.com/31532-qualcomm-aims-at-25-to-3w-tdp-for-phones


www.notebookcheck.net/Qualcomm-Snapdragon-865-SoC-Benchmarks-and-Specs.448194.0.html
You don't need to use colossal fonts; it won't make any of your claims any more true than they are.

It clearly says right there that it's a 5W TDP chip. Moreover, take a look at this:



www.anandtech.com/show/15609/samsung-galaxy-s20-ultra-snapdragon-865-quick-performance-preview/2

How about that: the iPhone averages well over 6 Watts. Mind you, that is an average; common sense should make it obvious that these chips boost well above that for short periods, like any other chip, mobile or desktop.
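The average-versus-peak distinction is just duty-cycle arithmetic. A minimal sketch with made-up numbers (assumptions for illustration, not measurements of any real SoC):

```c
/* Duty-cycle arithmetic: a chip can boost far above its "TDP" and still
 * average a modest figure. All numbers below are illustrative assumptions. */
#include <stdio.h>

int main(void)
{
    double p_boost = 10.0;   /* W, short bursts                 */
    double p_base  =  4.0;   /* W, sustained                    */
    double duty    =  0.35;  /* fraction of time spent boosting */

    double p_avg = duty * p_boost + (1.0 - duty) * p_base;
    printf("average draw = %.1f W\n", p_avg);   /* prints 6.1 W */
    return 0;
}
```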
#87
Assimilator
claesA lot of the speculation here seems pretty far out of left field.

Apple just released a MacBook Air refresh, based on the design from 2018 + new keyboard. The previous redesign was in 2010. That's a life-cycle of 8 years.

MB Pro redesigns: 2012, 2016, 2019. 3-4 years for their most popular computer.

Mac Pro just saw its first redesign since 2012...

They just released the new iPad Pro -- an Arm-based tablet with keyboard and trackpad support.

There's no way they're going to trash the MBP, MP, and Air for ARM. A new product line to replace the MB and act as an AIO upgrade for the iPad Pro? Sure.

Ditching x86 entirely two years after a bunch of content creators just spent $6k+ on a new Mac Pro? Not a chance.

(Imagine the likes of Pixar holding onto macOS while everyone else moved to Windows after the trashcan and Final Cut fiascos, their sigh of relief when the new Pro came out, only to find that their product is legacy three years after purchase. Imagine the developers of video editing software... Seriously, lol stuff here).
You're ignoring a few things here:

1. All of Apple's x86 offerings are built around Intel CPUs. You know, the Intel that's having massive problems delivering those CPUs right now? CPUs that Apple doesn't have are CPUs that they can't put into their shiny MacBooks and charge a 300% markup on.
2. Apple's end goal absolutely is replacing x86 with Arm, for many reasons: they have the best Arm CPUs, they wouldn't have to pay Intel for CPUs, they wouldn't suffer when Intel has supply issues, and they can unify their OS and applications.
3. You really think Apple gives a s**t about what anyone paid last year for their overpriced junk? They don't, because they know they have a captive market. People who are dumb enough to buy Apple machines over PCs for any sort of task are the same people who are going to buy Apple's latest and greatest every year, simply because Apple tells them to. Steve Jobs did a fantastic job of marketing to the "more money than sense" crowd. The same goes for the companies writing software for Apple machines.
#88
Aquinus
Resident Wat-man
ARF:kookoo: 10 watts in your smartphone will mean that you won't be able to touch your phone.
You do realize that these CPUs are designed to boost under thermally advantageous conditions and to throttle when conditions aren't, right? They're not going to run at full tilt until they explode.
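As a toy sketch of that boost/throttle behaviour as a control loop (purely illustrative; every threshold and step size below is a made-up assumption, and real SoC governors are far more elaborate):

```c
/* Toy thermal governor: boost while cool, back off when hot.
 * All thresholds and step sizes are made-up illustrative values. */
#include <stdio.h>

#define T_THROTTLE 85.0   /* degC: start backing off */
#define F_MAX      2800   /* MHz: peak boost clock   */
#define F_MIN      1000   /* MHz: floor clock        */

static double read_temp_fake(int t) {           /* stand-in for a sensor */
    return 60.0 + t * 1.5;                      /* pretend the die heats up */
}

int main(void)
{
    int freq = F_MAX;                           /* start at full boost */
    for (int tick = 0; tick < 30; tick++) {
        double temp = read_temp_fake(tick);
        if (temp > T_THROTTLE && freq > F_MIN)
            freq -= 100;                        /* too hot: step down  */
        else if (temp < T_THROTTLE - 5.0 && freq < F_MAX)
            freq += 100;                        /* headroom: step up   */
        printf("t=%2d  %.1f degC  %4d MHz\n", tick, temp, freq);
    }
    return 0;
}
```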
#89
TheoneandonlyMrK
Vya DomusI played Crysis and Crysis Warhead at 4K60 maxed just fine.
Same; I might go for a third full run through them all, I have the time.
THANATOSAre you seriously comparing the Atom x5-Z8350, which was introduced in Q1 2016, against ARM-based SoCs introduced in Q3-Q4 2019? That's a 3.5-year difference!
BTW, 2W for the Atom x5-Z8350 is not TDP but SDP.
I don't know where you got the 2.5-3W TDP for those ARM SoCs.
Epic processors. I have a phone and a laptop with them in, and they totally annihilate my main listed rig in exactly one benchmark: making a phone call. My PC sucks at that. ;p:D So does that laptop, though.
#90
claes
AssimilatorYou're ignoring a few things here:
You basically ignored my entire post and seem to be acting more out of hatred for Apple than genuinely furthering this discussion, but isolation is boring, so I'll take the bait.
1. All of Apple's x86 offerings are built around Intel CPUs. You know, the Intel that's having massive problems delivering those CPUs right now?
“Now,” lol, where have you been?
CPUs that Apple doesn't have are CPUs that they can't put into their shiny Macbooks
Except they do, and still maintain exclusivity agreements — I think you just haven’t been paying attention.
and charge a 300% markup on.
lol, classic, good one
2. Apple's end goal absolutely is replacing x86 with Arm,
Ah, you’re on the board then? What else are they up to? What’s changed since 2014, when these rumors first started popping up?
for many reasons: they have the best Arm CPUs,
This is not a reason, and it sidesteps the significant task of rewriting the OS, drivers, apps, etc., as well as the transition itself and the public and developer blowback. Do you know how many pro graphics apps stopped supporting Apple when the 2013 Mac Pro started showing its age, and how many professionals switched to Windows?
they wouldn't have to pay Intel for CPUs, they wouldn't suffer when Intel has supply issues,
This is a good and totally plausible reason that I agree with.
and they can unify their OS and applications.
Apple has repeatedly stated they have no interest in this — could you imagine selling the "it just works" campaign when people are trying to run Premiere Pro on their iPad? There's a reason they didn't do this years ago...
3. You really think Apple gives a s**t about what anyone paid last year
Yes, I do, as indicated by their 2013 Mac Pro sales and the redesign...
for their overpriced junk?
Imagine rewriting the drivers for their Afterburner card.
They don't, because they know they have a captive market. People who are dumb enough to buy Apple machines over PCs for any sort of task are the same people who are going to buy Apple's latest and greatest every year, simply because Apple tells them to. Steve Jobs did a fantastic job of marketing to the "more money than sense" crowd. The same goes for the companies writing software for Apple machines.
In the case of consumers, sure, absolutely, and it makes total sense for them (thus my iPad Pro comments). Professionals? Definitely not, which was what my last three paragraphs were meant to emphasize.
#91
Master Tom
GoldenXEven though 7nm is awesome, Zen2 still has some horrible power spikes, so I don't see Apple using them for now, or ever if they want to ditch x86/AMD64.
OTOH, RIP Intel.
What power spikes?
I have never heard of Ryzen power spikes before.
#92
notb
Assimilatorthey can unify their OS and applications.
That makes very little sense as noted by @claes.
Apple's Mac lineup flourished after 2005 because x86 made it much easier to provide ports of popular Windows software.

It's a complete ecosystem for photo/video editing, for software development, for science, for engineering.
And the *nix roots make it even more interesting for many.
With virtualization (be it Docker, VirtualBox or VMware) pretty much every professional or scientific workflow can be easily migrated.
Even gaming becomes possible thanks to cloud platforms.

Going ARM without full and efficient x86 emulation will mean that software companies have to rewrite everything they want to sell to Apple customers. The cost of all that work would be enormous.
And whenever a large software company - e.g. Adobe, MathWorks, Microsoft, Autodesk, Oracle, VMware or Salesforce - decided not to provide an ARM version of what it offers for macOS today, Apple would sell a few hundred thousand fewer Macs (in the case of Adobe and Microsoft, probably millions).
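Part of why those ports are so expensive: performance-critical code written against x86 intrinsics has no direct ARM equivalent and has to be rewritten per architecture. A minimal illustration (the SSE and NEON intrinsics shown are real; the tiny function itself is hypothetical):

```c
/* One tiny hot loop, written three times: once per ISA plus a fallback.
 * Multiply this by every SIMD routine in a large codebase to get a sense
 * of the porting bill. */
#if defined(__x86_64__)
#include <xmmintrin.h>                 /* SSE */
void add4(float *dst, const float *a, const float *b) {
    _mm_storeu_ps(dst, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
}
#elif defined(__aarch64__)
#include <arm_neon.h>                  /* NEON */
void add4(float *dst, const float *a, const float *b) {
    vst1q_f32(dst, vaddq_f32(vld1q_f32(a), vld1q_f32(b)));
}
#else
void add4(float *dst, const float *a, const float *b) {
    for (int i = 0; i < 4; i++) dst[i] = a[i] + b[i];  /* scalar fallback */
}
#endif
```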

That makes absolutely no sense - unless of course Apple's goal is to phase out Macs for whatever reason.


#93
Master Tom
TurmaniaI mocked Apple when they released the iPhone. I thought it was such a stupid idea, like many of us did. How wrong we turned out to be! So I learned the hard way: do not judge before it comes out.
That is funny.
I was excited when I first saw the iPhone.
#94
watzupken
ARFHaving to compile for the specific circuit is an advantage, an opportunity, and a strength; it means there is a good chance you will get the maximum potential performance.
Smartphones are very responsive, fast, and pleasant to use.
x86 PCs are not responsive; very often they are laggy, cause stuttering, and are in general not in any way pleasant to use.
This might be my subjective opinion, but it is my feedback and you have to consider it.

Software developers are among the members of our society who take the largest salaries, and yet their products have bugs, backdoors, and security vulnerabilities, along with being awfully optimised in the case of PC software, and developers always seek to do the least work and take the shortest, easiest path to the goal.

Tell me, is it normal that in RDR2 an overclocked Core i9-9900K at 5 GHz with a Radeon RX 590 gets 37 FPS on average at 1080p? www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html

Is it normal that 14 years after Crysis' release, the game is still an example of a badly optimised title and is still used as a benchmark for modern PC hardware?

I don't think it's normal.
I am no software developer, but I feel software is not something easy to optimize. There are many considerations a developer has to juggle: performance, UX, features, security, etc. To make things worse, if you look at the state of Windows and Android devices, there may be 1.01 million different configurations out there, with some people still running ancient hardware. Optimizing software while supporting as wide a range of hardware as possible will certainly cause issues. This is why software and games tend to be a lot more optimized on hardware + software (drivers) that are tightly controlled by one company; think game consoles and perhaps even iOS devices.

As for bugs, I certainly have not seen any software that is bug-free. When you add some code to, say, introduce a new feature, you can very well break something or introduce a security issue. I believe this is inevitable. Also, different hardware in your computer introducing new drivers/features can result in bugs appearing in software that worked perfectly fine before the update. To sum up, there are too many moving parts for software to be perfectly optimized and bug- and security-free. However, I also don't deny that there may be developers who don't care about optimizing.

On the point of the decade-old Crysis performing poorly on modern hardware, I don't think that is impossible. Game makers may no longer support a game after a number of years. For example, with up to 16 cores available now in the retail space, some older games are still optimized for 2 or 4 cores, with no plans for future updates. The same may well be true on the GPU side. Game makers don't have infinite resources to keep supporting ageing games, especially when those no longer bring in recurring revenue. So it doesn't make sense to bench a 14-year-old game in this case.
#95
THANATOS
ARFQualcomm aims at 2.5 to 3W TDP for phones
fudzilla.com/31532-qualcomm-aims-at-25-to-3w-tdp-for-phones


www.notebookcheck.net/Qualcomm-Snapdragon-865-SoC-Benchmarks-and-Specs.448194.0.html
That article from Fudzilla was published on 30 May 2013, and there is no mention of a 2.5-3W TDP for the Snapdragon 865 or Apple A13 Bionic, which is not surprising considering they were introduced in 2H 2019.
I don't understand why you even linked it when, right under it, your next link says 5W TDP for the Snapdragon 865.
#96
ARF
THANATOSThat article from Fudzilla was published on 30 May 2013 and there is no mention of a 2.5-3W TDP for the Snapdragon 865 or Apple A13 Bionic, so I don't understand why you even linked it when your next link is about a 5W TDP for the Snapdragon 865.
So, as I said, your comparison was simply wrong.
5 watts is when the SoC works at maximum load on both its iCPU and iGPU parts. Its iCPU part works at 2.5-3 watts or lower.
#97
Master Tom
I think that it is Windows' fault that PCs are not very responsive.
Try Linux or macOS. They are much better designed.
#98
Assimilator
notbThat makes very little sense as noted by @claes.
Apple's Mac lineup flourished after 2005 because x86 made it much easier to provide ports of popular Windows software.

It's a complete ecosystem for photo/video editing, for software development, for science, for engineering.
And the *nix roots make it even more interesting for many.
With virtualization (be it Docker, VirtualBox or VMware) pretty much every professional or scientific workflow can be easily migrated.
Even gaming becomes possible thanks to cloud platforms.

Going ARM without full and efficient x86 emulation will mean that software companies have to rewrite everything they want to sell to Apple customers. The cost of all that work would be enormous.
And whenever a large software company - e.g. Adobe, MathWorks, Microsoft, Autodesk, Oracle, VMware or Salesforce - decided not to provide an ARM version of what it offers for macOS today, Apple would sell a few hundred thousand fewer Macs (in the case of Adobe and Microsoft, probably millions).

That makes absolutely no sense - unless of course Apple's goal is to phase out Macs for whatever reason.

Perhaps I've been a little obtuse, so here's a clarification:

Apple's goal is not to phase out Macs. It's to phase out x86 MacBooks.

Its so-called high-end workstations are expensive enough and low-volume enough that Apple can continue to use Intel chips in them. They'll put the price up, though, since they won't be getting as big a volume discount from Intel.

Apple is confident that it already has Arm versions of enough of the software in its MacBook ecosystem to cover enough of its users. The holdouts are either insignificant or basically don't exist anywhere outside the Apple ecosystem, so the onus is on those devs to port their code or lose their revenue stream.
Master TomI think that it is Windows' fault that PCs are not very responsive.
Try Linux or macOS. They are much better designed.
0/10 trolling effort.