More importantly, there is a very good reason why new bleeding-edge processes usually start out with Apple and their low-power SoCs - initially these processes are just not a good fit for power-hungry desktop parts, and the yields would be abysmal. Apple is essentially both a test run and a stabilizer. Even if NV or AMD COULD buy some 2nm allocation, there is no way they would WANT to. I think ARF thinks that new nodes are magic that by itself makes any chip design better and work out of the box. It's not.
Some time ago, it was AMD that led Apple to new nodes, not the other way around.
AMD released the first 40nm GPU back in Q2 2009 (the RV740, aka Radeon HD 4770), while Apple released the 45nm A4 (fabbed at Samsung) in 2010.
You probably forgot that it was AMD who always used the state-of-the-art manufacturing node first.