Friday, April 22nd 2022

Apple, Intel to Become Alpha Customers for TSMC's 2 nm Manufacturing Node

Industry reports and sources in the financial community have placed Apple and Intel as the two premier customers for TSMC's upcoming N2 node. N2, which is expected to enter volume production by the end of 2025, will be TSMC's first manufacturing process to use GAAFET (Gate-All-Around Field-Effect Transistor) designs. Barring significant market upheavals or unexpected snags in the technology transition, TSMC will still be late to the GAAFET party, trailing Samsung's 3GAE node in 2023 and Intel's first Angstrom-era process, Intel 20A, in 2024.

While Apple's uptake of TSMC's latest manufacturing technology is practically a given at this point, the fact that Intel too is taking up TSMC's N2 node showcases the company's evolved business tactics after the introduction of its IDM 2.0 strategy (IDM standing for Integrated Device Manufacturer, meaning Intel too will fabricate chips to clients' specs). While pre-Pat Gelsinger Intel seemed afraid to touch other foundries' products - mostly because Intel has significant manufacturing capabilities and R&D of its own, after all - the new Intel is clearly more at peace with driving a competitor's revenues.
As there's a significant cost to adopting a new manufacturing node, Apple is especially primed to take advantage of technological innovations: it sells complete systems, which allows it to increase margins on other hardware elements to offset the significant chip manufacturing costs. While Intel itself doesn't enjoy that advantage, the company is expected to leverage TSMC's N2 node for its own SoCs and Lunar Lake GPU tiles, which it placed on its roadmap with the clear intention of using post-N3 manufacturing tech.
Source: Tom's Hardware

64 Comments on Apple, Intel to Become Alpha Customers for TSMC's 2 nm Manufacturing Node

#52
Vya Domus
Tigger: Time for AMD to stop scratching its arse and spend some of the Ryzen cash on its own fab(s). But I guess even after all that success they still don't have enough money, so they're perpetually stuck relying on TSMC; they didn't even come up with 3D cache themselves, it was TSMC that did.
Trying to make your own cutting-edge manufacturing process these days is complete and utter suicide. There is no way to compete, even if you have billions of dollars to spend. It's not even all about money; the IP alone that would be required makes it an impossible task.

Many of these companies used to have their own fabs, including AMD. Now they don't; doesn't that tell you anything?
#53
ARF
Vayra86: This is pulling wool over your eyes. "If I don't put serious load on it, it might use less than some AMD CPUs" lol. Not very objective, is it?

There is no anti-Intel sentiment here, only people calling things what they are. I remember an Intel quad-core era where most people shat all over inefficient Bulldozers and Piledrivers, and I also remember AMD getting flak for high power usage on GPUs. I was one of those, just like I shit all over ADL's 241 W ridiculousness.

The constant in all of this is: performance per watt appears to be very highly valued by a large group of customers. And I totally get that. Look at the sentiment towards Ampere and incoming Lovelace announcing 450-600W potential TDPs!!

I'd suggest you stop seeing the good old green-red conflict and look at the market instead.
Those Lovelace TDPs are very high. Many users will be pushed towards new PSUs, or put off buying at all.
#54
Unregistered
Vya Domus: Trying to make your own cutting-edge manufacturing process these days is complete and utter suicide. There is no way to compete, even if you have billions of dollars to spend. It's not even all about money; the IP alone that would be required makes it an impossible task.

Many of these companies used to have their own fabs, including AMD. Now they don't; doesn't that tell you anything?
The rest gave up; Intel did not.
#55
unknownk
Nobody can afford cutting-edge fabs except Intel, and Apple if they wanted to (they don't).

AMD went to shit mostly because their fabs were too expensive to keep at the cutting edge. They spun off their fabs into Global Foundries, which even with outside orders couldn't make enough money to keep up with the technology. AMD spent a ton of resources designing their Zen CPUs on Global Foundries' 14 nm (and later 12 nm) processes (which they had to use because of contracts made during the spinoff) just to almost catch up to Intel. It wasn't until Zen 2 and the switch to chiplets, plus the move to TSMC, that things picked up for them.

With Intel's launch of its new GPUs, you will see that even with a million engineers, and using the same process nodes as AMD and NVIDIA, Intel sucks. Intel has always been known for its process prowess, not its designs, which is how small companies like Cyrix and AMD leapfrogged it from time to time on small budgets. Lose the process edge and Intel goes to shit.
#56
ARF
The question is: who will make future cutting-edge chips if nobody can afford it? :confused:
#58
ARF
It won't survive alone, with no one to fight but itself. The regulators will start legal proceedings against a monopoly situation.
#59
Vayra86
ARF: The question is: who will make future cutting-edge chips if nobody can afford it? :confused:
I think it's slowly becoming time for humans to realize, and make policy around the fact, that all good things come to an end; continued growth (and shrinking, in the case of chips) is finite.
#60
unknownk
To be honest, if everybody has to use third-party foundries to make chips, you will probably see more companies joining the CPU and GPU business, since it is much cheaper to design a CPU/GPU than to make it yourself (still expensive, but not nearly as much). Of course, moving from a design monopoly to a manufacturing one has its own problems.
#61
ARF
Vayra86: I think it's slowly becoming time for humans to realize, and make policy around the fact, that all good things come to an end; continued growth (and shrinking, in the case of chips) is finite.
Well, I want to be an optimist and say that the good things haven't even started yet.
To me, the good things will start once everyone and their dog start using 4K 3840x2160 monitors everywhere ;)
#63
R0H1T
Tigger: The rest gave up; Intel did not.
They did, you just won't admit it though! IDM 2.0 or whatever is proof they can't do it alone.

And I'm not even counting the 22nm-7nm missteps o_O
#64
Unregistered
R0H1T: They did, you just won't admit it though! IDM 2.0 or whatever is proof they can't do it alone.

And I'm not even counting the 22nm-7nm missteps o_O
At least they still have them, whether they're working out for them or not. I'm sure they're making more than desktop CPUs with all the fabs they have, so they must be making some profit. Pretty sure they don't just use them to make CPUs for us plebs.