Friday, December 6th 2019
TSMC on Track to Deliver 3 nm in 2022
TSMC is delivering record results day after day: its 5 nm manufacturing process enters High Volume Manufacturing (HVM) in Q2 next year, its 7 nm process is getting plenty of orders, and the company has just become the biggest publicly traded company in Asia. Continuing its goal to match or even beat the famous Moore's Law, TSMC is already planning its future 3 nm node, with HVM promised as soon as 2022 arrives, according to JK Wang, TSMC's senior vice president of fab operations. That would deliver 3 nm a whole year ahead of the originally planned 2023 date, and fab construction appears to be progressing well, judging by the news the company has been releasing recently.
We can hope to see the first wave of products built on the 3 nm manufacturing process sometime around the end of 2022, when the holiday season arrives. Regular customers like Apple and HiSilicon will surely adopt the new node and ship smartphones with 3 nm processors inside as soon as the process is ready for HVM.
Source:
DigiTimes
71 Comments on TSMC on Track to Deliver 3 nm in 2022
Just one question: how do you even print those extremely small transistors and interconnects?
In fact, I don't think there's a single technical field out there where ideas or even proof of concepts aren't already ahead of what we can mass-manufacture today.
Keep in mind that when you transition to a new technology, you need to get the same performance as the current product or higher. Maybe a bit lower, but promising a lot better in the near future. I don't think we are there yet. Either way, time will make it happen sooner or later :)
"when you transition to a new technology, you need to get the same performance as the current product or higher. Maybe a bit lower, but promising a lot better in the near future"
LCD had way more to offer in the long term than CRTs, which were basically maxed out.
It is hard for any new tech to surpass the previous one at the start, but the capabilities of the new one are outstanding in comparison to the old. CRT and LCD are a good example.
What I mean is that progress these days always comes with a few drawbacks. I don't expect the transition from Si to whatever comes next to be any different. Sticking to display tech: going from CRT to LCD we gained a lot but lost in response time; going from LCD to plasma we gained a lot again but lost big time in power draw... Sometimes we win, sometimes we lose.
Anyway, when the time comes, the industry will find a way to move forward, and there will be a few promising technologies to choose from. Which one wins is hard to anticipate now. Time will reveal everything.
Plasma TVs had their shortcomings: very power hungry, flicker (from both brightness and burn-in management) and burn-in. All of these prevented plasma from becoming a good monitor technology. OLED is partially following the same track, although for somewhat different reasons. Burn-in as such does not seem to be a huge problem; image retention is. Price and the viability of large-scale manufacturing are the second issue - even with all the success of OLED TVs, LG is making only two panel sizes, 55" and 65" (and a little of 77").
LCDs were woeful for a long while compared to CRTs. Yes, cheap bargain-bin CRTs were just as bad, but even at the top end it took years for LCDs to catch up. CRTs had their flicker, but decent models allowed high refresh rates which made it OK; a CRT at 100-120 Hz was excellent. This took years for LCDs to achieve, especially on anything better than a TN panel. CRTs did not have a native resolution, which is something even I find hard to wrap my head around any more after all this time with LCDs. Contrast and response times of LCDs took years to catch up. If you remember that era, LCDs had huge downsides to overcome. Granted, size and power efficiency boosted their adoption from the get-go. Screen size also started to play a role at around the same time LCDs got good - CRTs topped out at 20-22" (with the visible area often smaller), while good LCDs were of similar size and gradually got cheaper.
[/offtopic] It is not just the ideas; the technology is there for a step or two forward. It is simply not economically viable for mass production.
5 nm tech demos were done in the early 2000s, and produced chips were demoed in 2015.
I do not remember if 3 nm chips have been demoed, but 3 nm chips were claimed to have been manufactured last year or the year before. With newer, smaller semiconductor production processes, the performance gain does not have to come from frequency, and maybe not even from density increases at current levels. Power consumption and efficiency are the big ones here. Granted, there are problems achieving good results there as well, but R&D is ongoing.
And, afaik, GaAs is not more efficient to the point of providing the same performance as Si while using a bigger node.
And to sort of answer your question, I don't know what happened to it.
CRT was primarily replaced due to its weight and power requirements. We no longer needed deep desks for a CRT to fit on. An LCD takes only about a quarter of its space. A TFT was also way more energy efficient.
TFTs are no holy saints either; screens fade over the course of years. Sometimes I think CRTs (and I mean premium ones) still stand their ground today compared to LCDs/TFTs. But because of the space you need to install one, meh.
I've had an LCD (a TV) for 10 years now, and I disagree with your last statement. It depends on the product.