Thursday, June 25th 2020
Bad Intel Quality Assurance Responsible for Apple-Intel Split?
Apple's decision to switch its Mac computers from Intel processors to its own chips, based on the Arm architecture, has shaken up the tech world, even though rumors of the transition had been doing the rounds for months. Intel's first official response, coupled with facts such as Intel's CPU technology execution being thrown completely off gear by foundry problems, pointed toward the likelihood of Intel not being able to keep up with Apple's growing performance-per-watt demands. It now turns out that Intel's reasons are a lot more basic, and date back to 2016.
According to a sensational PC Gamer report citing former Intel principal engineer François Piednoël, Apple's dissatisfaction with Intel dates back to some of its first 14 nm chips, based on the "Skylake" microarchitecture. "The quality assurance of Skylake was more than a problem," says Piednoël. "It was abnormally bad. We were getting way too much citing for little things inside Skylake. Basically our buddies at Apple became the number one filer of problems in the architecture. And that went really, really bad. When your customer starts finding almost as much bugs as you found yourself, you're not leading into the right place," he adds.

It was around that time that decisions were taken at the highest levels in Apple to execute a machine architecture switch away from Intel and x86, the second of its kind following Apple's mid-2000s switch from PowerPC to Intel x86. "For me this is the inflection point," says Piednoël. "This is where the Apple guys who were always contemplating to switch, they went and looked at it and said: 'Well, we've probably got to do it.' Basically the bad quality assurance of Skylake is responsible for them to actually go away from the platform."

Apple's decision to dump Intel may only have been further precipitated by the string of cybersecurity flaws affecting Intel microarchitectures that came to light in 2019. The PC Gamer report cautions that Piednoël's comments should be taken with a pinch of salt, as he has been among the more outspoken engineers at Intel.
Source: PC Gamer
81 Comments on Bad Intel Quality Assurance Responsible for Apple-Intel Split?
Fake news so
And Intel won't catch up anymore. The big bang in the CPU race has started, and the main players will be Arm AArch64 vs. AMD x86, or Windows 10 ARM64 vs. Windows 10 x64.
Intel will still sell to people who are fans or don't know any better.
Apple has been preparing this for at least five years.
I really wish people would stop abusing the term "fake news".
www.anandtech.com/show/15881/amd-succeeds-in-its-25x20-goal-renoir-zen2-vega-crosses-the-line-in-2020
Beyond the GPU, you have things like media encoders/decoders (ARM processors aren't great at doing software video decoding and are even worse at encoding), network accelerators, crypto accelerators, etc. I mean, Apple provided a great example of this themselves.
This is sort of the core advantage of x86/x64: the CPU cores are a lot more multi-purpose and can process a lot of different data "better" than ARM cores. Obviously some of this comes down to software optimisation and some to pure raw GHz, as most ARM SoCs are still clocked far slower than the equivalent x86/x64 parts. However, as power efficient as ARM processors are, there are a lot of tasks in which they're unlikely to overtake x86/x64 processors, at least not in the foreseeable future.
Relying on accelerators/co-processors does have some advantages as well, as you can fairly easily swap out one IP block for another and have a slightly different SKU. I'm not sure this fits the Apple business model though. I guess they could also re-purpose a lot of the IP blocks between different SoC SKUs. The downside is, as pointed out above, that if your SoC lacks an accelerator for something, you simply can't do it. Take Google's VP9 for example. It can quite easily be software decoded on an x86/x64 system, whereas on ARM-based systems you simply can't use it, unless you have a built-in decoder specifically for that codec.
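To make the VP9 point concrete: on a desktop you can check whether a software VP9 path even exists by asking ffmpeg which decoders its build ships. This is a minimal sketch — `ffmpeg -decoders` is a real option, but the helper function name is mine, and it only works if ffmpeg is installed on the machine.

```python
# Sketch: probe the local ffmpeg build for a software VP9 decoder.
# Assumes ffmpeg is on PATH; returns None when it isn't, so callers
# can distinguish "no ffmpeg" from "no software VP9 path".
import shutil
import subprocess

def has_sw_vp9_decoder():
    ffmpeg = shutil.which("ffmpeg")
    if ffmpeg is None:
        return None  # ffmpeg not installed; can't tell
    out = subprocess.run(
        [ffmpeg, "-hide_banner", "-decoders"],
        capture_output=True, text=True,
    ).stdout
    # "libvpx-vp9" and the native "vp9" decoder are both software paths;
    # hardware-only builds would list neither.
    return any(name in out for name in ("libvpx-vp9", " vp9 "))
```

On a typical x86/x64 desktop build this reports a software decoder; on an ARM SoC without one, playback has to fall back to whatever fixed-function block the chip provides, or fail.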
This also makes for far more complex SoCs, and if one of these sub-processors fails, you have a dud chip, as you can't bin chips as a lower SKU if, say, the crypto accelerator doesn't work.
It's going to be interesting to see where Apple ends up, but personally I think this will be a slow transition that will take longer than they have said.
It'll also highly depend on Apple's customers, as I can't imagine everyone will be happy about this transition, especially those that dual boot and need access to Windows or another OS at times.
That’s not to say you aren’t right that Apple decided to go their own way anyway, but Apple has always sought good battery life in their mobile products, and building their own architecture was something they could afford to do.
But regardless, the push to ARM meant Apple would never even have given AMD a look. And even if they didn't have ARM at all and AMD was the top dog in performance, they still likely wouldn't have given AMD a look, because they'd have had to figure out how to integrate AMD's CPUs and chipsets into their Macs, which given AMD's terribly lacking support infrastructure for integration, would have been an absolute nightmare.
So no, AMD never had a chance.

Oh hey look, it's another article based on useless synthetic S**tbench 4 results, which means it's idiotic clickbait that should be ignored.
Apple just thought it could do more with less. Intel's x86 chips are not the epitome of efficiency, and the world, especially mobile, is asking for high efficiency. Fixed devices are becoming more mobile, and the push for smaller and more varied form factors is still going on.
Developments on x86 are not moving in that direction - it is even getting harder to keep small devices cool and quiet AND performant within the niche Apple is looking for.
AMD cannot offer that either. This is a long term strategic move, Intel is just a supplier.