Thursday, June 25th 2020

Bad Intel Quality Assurance Responsible for Apple-Intel Split?

Apple's decision to switch its Mac computers from Intel processors to its own chips, based on the Arm architecture, has shaken up the tech world, even though rumors of the transition have been doing the rounds for months. Intel's first official response, coupled with facts such as its CPU technology execution being thrown completely off gear by foundry problems, pointed toward the likelihood of Intel not being able to keep up with Apple's growing performance-per-Watt demands. It now turns out that Intel's reasons are a lot more basic, and date back to 2016.

According to a sensational PC Gamer report citing former Intel principal engineer François Piednoël, Apple's dissatisfaction with Intel dates back to some of its first 14 nm chips, based on the "Skylake" microarchitecture. "The quality assurance of Skylake was more than a problem," says Piednoël. "It was abnormally bad. We were getting way too much citing for little things inside Skylake. Basically our buddies at Apple became the number one filer of problems in the architecture. And that went really, really bad. When your customer starts finding almost as much bugs as you found yourself, you're not leading into the right place," he adds.
It was around that time that decisions were taken at the highest levels in Apple to execute a machine architecture switch away from Intel and x86, the second of its kind following Apple's mid-2000s switch from PowerPC to Intel x86. "For me this is the inflection point," says Piednoël. "This is where the Apple guys who were always contemplating to switch, they went and looked at it and said: 'Well, we've probably got to do it.' Basically the bad quality assurance of Skylake is responsible for them to actually go away from the platform." Apple's decision to dump Intel may only have been hastened by the string of cybersecurity flaws affecting Intel microarchitectures that came to light through 2019. The PC Gamer report cautions that Piednoël's comments should be taken with a pinch of salt, as he has been among the more outspoken engineers at Intel.
Source: PC Gamer

81 Comments on Bad Intel Quality Assurance Responsible for Apple-Intel Split?

#26
R0H1T
Assimilator: That gels well with Apple's China-like method of market domination: make intricate long-term plans, don't tell anyone about them, execute them flawlessly, and only reveal them when they're so far along nothing can be done to stop them.
Let's not pretend anyone in Apple's position hasn't done or tried to pull the same ~ IBM did it, heck Intel does it, MS has done it, and so do Google, FB, Amazon, and Alibaba. For some reason your posts on Apple recently sound churlish, almost like a grownup teenager hating on them for being exorbitantly priced while executing a lot of what others can only dream of! Love it or hate it, Apple has the best perf/W parts under 15 W, on a two-year-old node and a (uarch) design which is likely 3(?) years old.
#27
FordGT90Concept
"I go fast!1!11!1!"
That makes no sense unless the Skylake issues were specific to Apple's implementation. I mean, other than this alleged whistleblower, there's no public documentation about issues with Skylake, is there (beyond the broad security vulnerabilities affecting many Intel architectures)?

On top of that, why Apple ARM over AMD x86? Apple already had a strong relationship with AMD over Radeon so it would make sense to extend that to include CPUs.

The more plausible explanation is that, yes, it's engineering, but it's Apple's engineering, not Intel's. Since the first MacBooks were sold, Apple has had serious engineering and quality-control problems with their products, from inadequate cooling to bad solder joints to bad design. The one product Apple hasn't had any major engineering problems with is the iPad. By eliminating their partnership with Intel, they can expand on the one product they have that isn't fundamentally flawed and increase profit margins in the process.
#28
Makaveli
davideneco"citing former Intel principal engineer François Piednoël"
Fake news so
lol, François Piednoël had a 20-year career at Intel as a principal engineer; check his LinkedIn.

We're supposed to believe you over him?
#29
R0H1T
His credentials don't make him an automatic authority on why the switch happened; there have been rumors floating around about Apple switching to ARM for nearly a decade now. I said this over at AT in the 2012-13 timeframe IIRC. He's an attention seeker for sure & SKL was never a trigger for Apple unless he can cite actual sources!
#30
Makaveli
R0H1T: His credentials don't make him an automatic authority on why the switch happened; there have been rumors floating around about Apple switching to ARM for nearly a decade now. I said this over at AT in the 2012-13 timeframe IIRC. He's an attention seeker for sure & SKL was never a trigger for Apple unless he can cite actual sources!
No, but it means he had access to more internal information at Intel than anyone on this forum.

So it does matter: if you are going to say he is wrong, provide a reason why and how.

Some random on a forum calling him out and saying it's fake news is laughable.
#31
Assimilator
Something I find peculiar: wasn't Piednoël, as principal engineer, responsible for Skylake? If so, is he effectively admitting that he did a bad job, and that, by extension, Apple's split with Intel is his fault?
R0H1T: Let's not pretend anyone in Apple's position hasn't done or tried to pull the same ~ IBM did it, heck Intel does it, MS has done it, and so do Google, FB, Amazon, and Alibaba. For some reason your posts on Apple recently sound churlish, almost like a grownup teenager hating on them for being exorbitantly priced while executing a lot of what others can only dream of! Love it or hate it, Apple has the best perf/W parts under 15 W, on a two-year-old node and a (uarch) design which is likely 3(?) years old.
Having the best perf/W ARM CPU is not something to write home about. It's like having the fastest bicycle at an F1 grand prix.

And I'm sorry that calling out Apple on its BS makes you feel personally attacked... wait, no I'm not. If your ego is so tied to a faceless multinational corporation's products that you take any criticism of said corporation personally, guess what - that's your issue, not mine!
#32
R0H1T
Huh & I'm sorry you're spending so much time attacking a faceless, emotionless corporation just because they ruin the name of fruity loops. Your posts on Apple over the last few days have been hilarious, I'll reiterate ~ love it or hate it they're the best at what they're doing. As for perf/w not being important, what weed are you smoking? This is literally one of the most important metrics in computing ~ why do you think Intel lead IBM, AMD or Nvidia beat AMD, ARM beat Intel in mobiles? Are you living in a parallel universe where power comes free of cost or heat?
Makaveli: Some random on a forum calling him out and saying it's fake news is laughable.
Yes, that's rather unfortunate. I blame the one who popularized this term :shadedshu:
#33
Chomiq
Assimilator: Oh hey look, it's another article based on useless synthetic S**tbench 4 results, which means it's idiotic clickbait that should be ignored.
Better?
Wendell sure isn't full of shit when it comes to knowing what's what.
#34
Ashtr1x
All the drama. Sigh.

Here's my take on this whole APPLE fiasco.

Apple wants full control of everything they make. PERIOD. Look at Dialog Semi: they made the power ICs for the iPhone, responsible for batteries and all, and when Apple got caught red-handed with Throttlegate, the top execs threw Dialog Semi under the bus. Same for GT Advanced, the company which used to make them sapphire glass, now bankrupt. Same for Imagination: after Apple poached key personnel, that company ended up sold to a Chinese firm.

Every single OEM that does business with Apple gets caught in the crossfire and shoved into a black hole by those deep pockets. Apple also poaches a lot from its own supplier companies. Look at the Qualcomm spat: that goes a long way back, to when Apple was threatening to use WiMAX from Intel instead of Qualcomm technology, and Qualcomm's royalty scheme was demanding more money from Apple because its phones were selling at higher profits. Apple didn't like it, and bam, that whole fiasco happened. Even Hock Tan at Broadcom was hand in hand with Apple when the CFIUS intervention happened to stop the hostile takeover, because Apple would benefit from the LTE patents that Hock Tan would sell to them; that history is all over the EE press. And then there's the Intel 5G mobile IP purchase: stupid Intel made a big mistake there, selling it to Apple instead of keeping it. This all happens because the top brass gets sold out; the best example is Stephen Elop, the ex-MS employee at Nokia who sold the company to them, and we all know what happened: with bad management at Nokia and MS's lack of focus, they lost everything in mobile.

This BS corporation that claims privacy comes first was responsible for the extradition of the KickassTorrents site owner from Poland to the USA, via an IP trace linked to Apple. We know black-market devices break the iPhone's crypto too; all this security drama is just the public face of this faceless corporation. A black box is always a black box, and putting trust in one is foolish, especially with Apple, where none of their services can be used unless you sign into their system.

Now back to the present: they pump money into Chinese firms like BOE to make their displays for cheap; LG recently got dethroned from the top LCD rank as BOE closed in, all thanks to the Apple cash influx. Why? Because they want profit margins to be as fat as possible. They do every single damn thing for money and their own gains. All the pandering at WWDC and every single thing their CEO does is just a PR mask and corporate talk. This Intel/Apple split had been in the news for a long time, and since Apple already pays hefty amounts of money to Adobe, MS, and other companies to optimize their software like first-party solutions, the calculus was simple: why keep getting caught using cheap BS VRM components in their piss-poor trash BGA cooling systems, paying money to Intel, and paying lawyers on top for all their sneaky BS practices? Most important of all, they get the majority of their profits from the iPhone (57%), their new Services business is at 17.7%, and the Mac is at sub-10%, so paying Intel such huge amounts doesn't make any sense. Plus, Mac OS X is not the same old Macintosh OS anymore: the 32-bit hammer drop, more iOS crap shoved in, the BGA BS with less I/O, and the T2 guarding everything. Users who used the Mac as a *nix substitute are falling away; in the majority of corporations and enterprises it's just a fancy POS machine. That's the reason. And we know how expensive their A-series cores are; they even pumped a lot of money into TSMC for 7 nm R&D, and the ROI on that money is very much expected, which is why their second-class products (not iPhones) ALWAYS use older processors. So this is a natural step towards their maximum efficiency.

The whole ARM vs x86 debate is overblown, especially AnandTech's SPEC BS scores. Any top flagship Android phone trails badly in SPEC, but when it comes to application performance, including first-party apps like the camera, the iPhone's A-series SoCs don't beat the Android flagships in any form; it's just a BS metric. Until Cinebench, Blender, or OpenGL/Vulkan (or their own Metal BS) performance numbers show up, their A-series BS can be shoved where the sun doesn't shine. ARM is always custom and locked down hard; bootloaders get locked on many Android phones now, unlike the Nexus days, while x86 enjoys decades of software optimization, a massive library to choose from, and the huge Windows Win32 ecosystem (sadly the UWP trash is coming lately). People who say "we want ARM in our PCs" are fools, and the ones who believe Google and MS should chase the same BS are even bigger idiots, because those companies don't have the same business lines, principles, or R&D investments. Qualcomm of all companies invested millions into the ARM server race and canned it entirely despite all the heralding by Cloudflare marketing. Imagine thinking MS and Google will build these processors. Ultimate fools lmao.
#35
Steevo
That is impressive performance from Apple's chips, but I think the real reason is that there are hardware accelerators for everything, as others have said. It's also the planned-obsolescence model: oh, this new update requires hardware accelerators that your old models don't have, so you either get no updates or very poor performance if you don't upgrade.


There are so many reasons this is happening; blaming Intel is just one part. I bet there are all sorts of bugs in their hardware design, but keeping it under wraps and inside their walled garden means few will ever have the full story.
#36
Vya Domus
What intrigues me the most is: why the hell was Apple so involved in the development of Intel's architectures? This doesn't seem like a simple collaboration with a customer that got the end product; it looks to me like they had access to some pretty deep, low-level engineering work early in Intel's development process. I know Apple was an important customer, but it just seems odd they'd have so much access to all of this. I wonder how much know-how "migrated" to Apple over all these years. Maybe that was the goal altogether.
FordGT90Concept: On top of that, why Apple ARM over AMD x86?
Cost. They made it pretty clear that they will use more or less the same SoCs that are present in their phones and tablets, which are cheap to make. It's probably cheaper for them in the long run to make all of their silicon on their own. What many still don't understand is that Apple doesn't need the highest performance; their customers aren't looking for that, otherwise they'd buy something else. What they want is the software. In other words, Apple just needs to provide hardware that is good enough.
#37
SamuelL
I have no reason to think this article is inaccurate, though I don't think Skylake issues had anything to do with Apple's long-term plans. There have been rumors about Apple moving to a unified mobile/desktop architecture and OS since Jobs was still leading the company.

Simply put, Apple earns more on their hardware and retains more control if they use their own chips. It's a business decision, and with Apple it always has been. It just so happened that they had a really good run between (roughly) 2003 and 2013, where each change and new product benefitted both consumers and their bottom line. Now people get up in arms and feel compelled to find reasons when Apple makes decisions like this. Apple makes products to benefit Apple; they don't care which processor architecture they use or about its relative performance, so long as it brings in more money.
#38
Makaveli
Apple loves their walled garden and having full control of their product stack. I think this move was inevitable.
#39
Regenweald
As others have said, it's just revenue and EBITDA, period.
We already design and fab our own chips.
What is the cost of a more powerful Arm design, the production wafers, and the product engineering thereafter?
What is the cost of buying chips from Intel, and the product engineering thereafter?
Whichever one costs less is better for the books. The end.
#40
FordGT90Concept
"I go fast!1!11!1!"
Vya Domus: Cost. They made it pretty clear that they will use more or less the same SoCs that are present in their phones and tablets, which are cheap to make. It's probably cheaper for them in the long run to make all of their silicon on their own. What many still don't understand is that Apple doesn't need the highest performance; their customers aren't looking for that, otherwise they'd buy something else. What they want is the software. In other words, Apple just needs to provide hardware that is good enough.
If that's the case then the Mac Pro's days are numbered. The model available now may be the last.
#41
DemonicRyzen666
I had a run-in with this "François Piednoël" on Xtreme Systems. I had found a PDF mentioning that Intel CPUs could use AVX when SSE2-SSE4 code can be shuffled into shorter AVX code, which increases gaming performance. He denied this, and not only that, he had the PDF removed! There's plenty of proof that games use the AVX instruction set now. No game should actually require AVX to run.
#42
Valantar
Assimilator: Something I find peculiar: wasn't Piednoël, as principal engineer, responsible for Skylake? If so, is he effectively admitting that he did a bad job, and that, by extension, Apple's split with Intel is his fault?
While I agree that something like this is unusual, ultimately I would say it shows some integrity. Owning up to the work you led being worse than anticipated is something that people should do a lot more of.
Assimilator: Having the best perf/W ARM CPU is not something to write home about. It's like having the fastest bicycle at an F1 grand prix.
Perf/W is one thing, but AnandTech's SPEC testing also shows that Apple's current mobile chips are ahead of Skylake and its derivatives in IPC. If that also scales up to 4 GHz+ at reasonable power, those chips will be pretty powerful.
Vya Domus: What intrigues me the most is: why the hell was Apple so involved in the development of Intel's architectures? This doesn't seem like a simple collaboration with a customer that got the end product; it looks to me like they had access to some pretty deep, low-level engineering work early in Intel's development process. I know Apple was an important customer, but it just seems odd they'd have so much access to all of this. I wonder how much know-how "migrated" to Apple over all these years. Maybe that was the goal altogether.
Nothing in this says they were involved in the development of the architecture, only that they were reporting architectural bugs to Intel once they got their hands on sample silicon for their own development purposes. Nothing surprising in that, given the level of access and involvement needed when you are a company making fully integrated products where even the OS is made in-house.
Vya Domus: Cost. They made it pretty clear that they will use more or less the same SoCs that are present in their phones and tablets, which are cheap to make. It's probably cheaper for them in the long run to make all of their silicon on their own. What many still don't understand is that Apple doesn't need the highest performance; their customers aren't looking for that, otherwise they'd buy something else. What they want is the software. In other words, Apple just needs to provide hardware that is good enough.
Where did you get that from? I sincerely doubt they'll be using anything mobile-SoC-like in upcoming iMacs and Mac Pros. Laptops? Sure: current Apple mobile chips are comparable to Intel alternatives in die area and IPC, so as long as they can clock them higher and give them a more PC-like memory interface, that should work perfectly fine, unless there's some unknown issue causing frequency scaling to stall at low speeds. Anything equivalent to the current 16" MBP is bound to get a bigger chip, though.
#43
Vya Domus
FordGT90Concept: If that's the case then the Mac Pro's days are numbered. The model available now may be the last.
I mean, the Mac Pro was always in a rocky state, going for years with no updates or refreshes between releases.
#44
Valantar
DemonicRyzen666: No game should actually require AVX to run.
Being able to use AVX and needing AVX are not the same thing - in fact, most AVX-enabled applications have fallback code paths using older instruction sets. And there is certainly no reason why a game that can use AVX for improved performance shouldn't be using it!
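
(To illustrate: a minimal sketch, not from this thread, assuming GCC or Clang on x86; the dot() function is a made-up example. With the target_clones attribute, the compiler itself emits an AVX variant plus a baseline variant of a function and dispatches between them at runtime, which is one common way such fallback paths are built.)

#include <stddef.h>

/* Hypothetical example: the compiler emits one AVX clone and one
 * baseline ("default") clone of this function, plus a resolver that
 * picks the right clone at load time based on what the CPU supports. */
__attribute__((target_clones("avx", "default")))
float dot(const float *a, const float *b, size_t n)
{
    float acc = 0.0f;
    for (size_t i = 0; i < n; i++)
        acc += a[i] * b[i];  /* auto-vectorized in the AVX clone */
    return acc;
}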
#45
Vya Domus
Valantar: Where did you get that from? I sincerely doubt they'll be using anything mobile-SoC-like in upcoming iMacs and Mac Pros. Laptops? Sure: current Apple mobile chips are comparable to Intel alternatives in die area and IPC, so as long as they can clock them higher and give them a more PC-like memory interface, that should work perfectly fine, unless there's some unknown issue causing frequency scaling to stall at low speeds. Anything equivalent to the current 16" MBP is bound to get a bigger chip, though.
They already sent out development kits which have the same chip that is found in the iPad. Watch the WWDC 2020 presentation; to me it's very clear that this is how they want the future of Mac hardware to look. It's not surprising, because I do think these mobile SoCs are not easily scalable.
#46
the54thvoid
Intoxicated Moderator
Very weird that this has appeared on the upcoming feed....

Same story?

#47
Valantar
Vya Domus: They already sent out development kits which have the same chip that is found in the iPad. Watch the WWDC 2020 presentation; to me it's very clear that this is how they want the future of Mac hardware to look. It's not surprising, because I do think these mobile SoCs are not easily scalable.
You misunderstand. That kit exists so that developers can adapt their software to the architecture in a non-tablet form factor (as nothing of the sort exists right now, and emulating this on an x86 Mac would be silly). The only thing this tells us is that future Macs will be based on the Arm architecture. So yes, they want the future of Mac hardware to "look like" that - like an Arm-based Mac. There is absolutely nothing here saying that they won't scale higher in power, core counts, or other features. It's just what was easiest and cheapest for Apple to cobble together to ensure they have some working third-party software when they launch their first laptops.
#48
DemonicRyzen666
Valantar: Being able to use AVX and needing AVX are not the same thing - in fact, most AVX-enabled applications have fallback code paths using older instruction sets. And there is certainly no reason why a game that can use AVX for improved performance shouldn't be using it!
I think you misunderstood. He denied exactly what you said.
Edit: I mean the "being able to use it" part.
#49
Valantar
FordGT90Concept: If that's the case then the Mac Pro's days are numbered. The model available now may be the last.
Vya Domus: I mean, the Mac Pro was always in a rocky state, going for years with no updates or refreshes between releases.
It might be, though there's nothing stopping them from making an Arm-based SoC with heaps of cores and PCIe, like the server SoCs that are showing up these days. Given that the Mac Pro uses all-custom hardware anyway, they could just redesign the motherboard around such a chip and keep everything else more or less the same. Of course, driver support for PCIe devices would be tricky, but it already is for a lot of things on macOS, so that's not that big of a change.
DemonicRyzen666: I think you misunderstood. He denied exactly what you said.
Denied what? That there are fallbacks? There is nothing a chip designer can do to prevent those (beyond removing older instruction sets, I guess), as that is a pure software thing. Software checks the CPUID to see whether the processor reports support for [high-performance instruction set X]; if yes, it runs code path A, if no, code path B (sketched at the end of this post).

What you were describing in your previous post sounds like the opposite of that - the ability to run AVX code on hardware without AVX support. That will not work, as the CPU doesn't understand the instructions and thus can't process them. Sure, there might exist translation layers, emulation, and similar workarounds in some cases, but they are rare and inevitably tank performance far worse than writing code for a lowest-common-denominator instruction set. The whole point of added instruction sets like AVX is to add the option to run certain specific operations at a higher performance level than could be done with more general-purpose instructions - but you can of course do the same work with general-purpose instructions, just slower and with different code.
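
(For the curious, that check-and-branch is only a few lines of C. A minimal sketch, not from this thread, assuming GCC or Clang on x86; the work_avx/work_sse2 functions are made-up placeholders for the two code paths.)

#include <stdio.h>

/* Hypothetical stand-ins for an AVX-optimized routine and its plain
 * fallback; a real program would put the two implementations here. */
static void work_avx(void)  { puts("running AVX code path"); }
static void work_sse2(void) { puts("running SSE2 fallback"); }

int main(void)
{
    __builtin_cpu_init();               /* read the CPUID feature bits */
    if (__builtin_cpu_supports("avx"))  /* CPU advertises AVX?         */
        work_avx();                     /* code path A                 */
    else
        work_sse2();                    /* code path B                 */
    return 0;
}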
#50
R-T-B
It actually adds up to me.

We had a defective Skylake CPU right here on TechPowerUp; I don't recall the thread. It's darn near the only real, true-blue defective CPU I have seen. I can buy this.
Valantar: ... so it's "fake news" because it is sourced from someone who at the time was perfectly positioned to have access to this information? Yeah, sorry, your logic doesn't hold there. You seem to be implying that former employee = disgruntled former employee, which is nonsense. There is no reason to suspect Piednoël holds any grudge towards his former employer; he quit of his own volition and has no history of criticizing them previously.

I really wish people would stop abusing the term "fake news".
He says that word, but I don't think he knows what it means.