
TSMC on Track to Deliver 3 nm in 2022

Plus, there's this thing where no one team is responsible for all nodes. Whoever worked on 7nm will not be working on 5nm, but on 3nm instead; 5nm is already being worked on by the team that did 14nm. That's how things are usually set up in this industry.
Somewhat unrelated, but it's also what will most likely make things weird for Intel, when their 7nm node is ready shortly after they figure out 10nm.

And it's going to be a blast about 5 years from now when everybody hits the physical limits of Si and we all get stuck on 2 or 3nm, while fabs shrink various parts of the transistor. If you think 14nm+++++ was embarrassing, just wait for 3nm++++++++++++++ :D
Yeah, it should be an interesting road forward with silicon nearing its limits. Still, there's so much tech innovation going on that I'm not worried about it. Even with no die-shrink nodes left, there is still heaps of tech innovation to exploit well into the future for quite some time.
 
So we're going to picometers in the next decade? Bring it on. I thought nanobots were a fantasy, but if they go atomic size, all will be possible.
Just one question: how do you even print those extremely small transistors and links?
 
So we're going to picometers in the next decade? Bring it on. I thought nanobots were a fantasy, but if they go atomic size, all will be possible.
Theoretically possible, but the diameter of a Si atom is already 220pm... 10nm is already less than 50 of them, side by side.
Just one question: how do you even print those extremely small transistors and links?
Lengthy explanation here: https://en.wikipedia.org/wiki/Semiconductor_device_fabrication
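A quick back-of-the-envelope check of that atom count, as a Python sketch using the ~220 pm Si atom diameter quoted above:

```python
# Rough check: how many silicon atoms fit across a 10 nm feature,
# using the approximate 220 pm atomic diameter mentioned above.
si_diameter_pm = 220           # approximate diameter of one Si atom
feature_nm = 10
feature_pm = feature_nm * 1000  # 1 nm = 1000 pm

atoms_across = feature_pm / si_diameter_pm
print(f"{atoms_across:.1f} atoms side by side")  # ~45.5, i.e. fewer than 50
```

Which matches the "less than 50 of them, side by side" figure: the numbers really are running out.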
 
I don't think any company will go lower than 1nm with silicon. Silicon is reaching its limits, but there will be other ideas and technologies to defy Moore's law.
 
I don't think any company will go lower than 1nm with silicon. Silicon is reaching its limits, but there will be other ideas and technologies to defy Moore's law.
There already are other ideas (and have been, for decades). They're just not viable yet.
In fact, I don't think there's a single technical field out there where ideas or even proof of concepts aren't already ahead of what we can mass-manufacture today.
 
There already are other ideas (and have been, for decades).
In fact, I don't think there's a single technical field out there where ideas or even proof of concepts aren't already ahead of what we can mass-manufacture today.
The ideas are there but the technology isn't. At least for now.
Keep in mind that when you transition to a new technology, you need to get the same performance as the current product or higher. Maybe a bit lower, but promising a lot better in the near future. I don't think we are there yet. Either way, time will make it happen sooner or later :)
 
The ideas are there but the technology isn't. At least for now.
Keep in mind that when you transition to a new technology, you need to get the same performance as the current product or higher. Maybe a bit lower, but promising a lot better in the near future. I don't think we are there yet. Either way, time will make it happen sooner or later :)
Not in every aspect though. Just look at the CRT->LCD transition.
 
Not in every aspect though. Just look at the CRT->LCD transition.
I know what you mean, but you quoted only the part that suits your argument.
"when you transition to a new technology, you need to get the same performance as the current product or higher. Maybe a bit lower, but promising a lot better in the near future"
LCD had way more to offer in the long term than CRTs, which were basically maxed out.

It is hard for any new tech to surpass the previous one at the start, but the capabilities of the new one are outstanding compared to the old. CRT vs LCD is a good example.
 
I know what you mean, but you quoted only the part that suits your argument.
"when you transition to a new technology, you need to get the same performance as the current product or higher. Maybe a bit lower, but promising a lot better in the near future"
LCD had way more to offer in the long term than CRTs, which were basically maxed out.

It is hard for any new tech to surpass the previous one at the start, but the capabilities of the new one are outstanding compared to the old. CRT vs LCD is a good example.
Yeah, but it's not guaranteed LCDs will be with us long enough to give us all they can ;)
What I mean is that progress these days always comes with a few drawbacks. I don't expect the transition from Si to whatever comes next to be any different. Sticking to display tech: going from CRT to LCD we gained a lot but lost in response time; going from LCD to plasma we gained a lot again but lost big time in power draw... Sometimes we win, sometimes we lose.
 
Yeah, but it's not guaranteed LCDs will be with us long enough to give us all they can ;)
What I mean is that progress these days always comes with a few drawbacks. I don't expect the transition from Si to whatever comes next to be any different. Sticking to display tech: going from CRT to LCD we gained a lot but lost in response time; going from LCD to plasma we gained a lot again but lost big time in power draw... Sometimes we win, sometimes we lose.
I know what you mean, but again, the industry went LCD instead of plasma. If plasma TVs had won and been the most desirable devices, we would have far more efficient plasma TVs by now. Nobody wanted to pursue plasma advancement once LCDs had already won. It just happened that there were two options to replace CRT, and LCD is the one that won and is pretty common now. With plasmas, there was also a problem with high contrast in fast-paced games and the shadowing that occurred.
Anyway. When the time comes, the industry will find a way to move forward, and there will be a few technologies (considered promising) to choose from. Which one will win is hard to anticipate now. Time will reveal everything.
 
[offtopic]
Plasma TVs had their shortcomings: very power hungry, flicker (both brightness and burn-in management) and burn-in. All of these prevented plasma from being a good monitor technology. OLED is partially following the same track, although for somewhat different reasons. Burn-in as such does not seem to be a huge problem; image retention is. Price and viability of large-scale manufacture is the second one: even with all the success of OLED TVs, LG is making only two sizes of panel, 55" and 65" (and a few 77").

LCDs were woeful for a long while compared to CRTs. Yes, cheap bargain-bin CRTs were as bad, but even at the top end it took years for LCDs to catch up. CRTs had their flicker, but decent models allowed high refresh rates which made it OK; a CRT at 100-120Hz was excellent. This took years for LCDs to achieve, especially on something better than a TN panel. CRTs did not have a native resolution, which is something even I find hard to wrap my head around any more after all this time with LCDs. Contrast and reaction speeds of LCDs took years to catch up. If you remember that time, LCDs had huge downsides they had to overcome. Granted, they had size and power efficiency boosting their adoption from the get-go. Screen size also started to play a role at around the same time LCDs got good: CRTs topped out at 20-22" (with the visible area often smaller), while good LCDs were a similar size and gradually cheaper.
[/offtopic]

The ideas are there but the technology isn't. At least for now.
It is not just the ideas; the technology is there for a step or two forward. It is simply not economically viable for mass production.
5nm tech demos were done in the early 2000s, and produced chips were demoed in 2015.
I do not remember if 3nm chips have been demoed, but 3nm chips were claimed to have been manufactured last year or the year before.

Keep in mind that when you transition to a new technology, you need to get same performance of the current product or higher. Maybe a bit lower but promising a lot better in near future. I don't think we are there yet. Either way time will make it happen sooner or later :)
With a newer, smaller semiconductor production process, performance does not have to mean frequency, and maybe not even density increases at current levels. Power consumption and efficiency are the big ones here. Granted, there are problems with achieving good results there as well, but R&D is ongoing.
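To put rough numbers on the power-and-efficiency point, here's a minimal sketch of the standard first-order model for CMOS switching power, P ≈ C·V²·f. The values are illustrative placeholders, not tied to any real process:

```python
# First-order model of dynamic (switching) power in CMOS: P ~ C * V^2 * f.
# Because voltage enters squared, modest supply-voltage drops pay off big.
def dynamic_power(c_eff, v, f):
    """Approximate switching power: effective capacitance * voltage^2 * frequency."""
    return c_eff * v**2 * f

base = dynamic_power(1.0, 1.0, 1.0)     # normalized baseline
lower_v = dynamic_power(1.0, 0.9, 1.0)  # 10% lower supply voltage, same clock
print(f"power saved: {(1 - lower_v / base):.0%}")  # ~19% from voltage alone
```

Which is why new nodes are often sold on perf-per-watt rather than raw clocks.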
 
It is not just the ideas; the technology is there for a step or two forward. It is simply not economically viable for mass production.
5nm tech demos were done in the early 2000s, and produced chips were demoed in 2015.
I do not remember if 3nm chips have been demoed, but 3nm chips were claimed to have been manufactured last year or the year before.
We are not talking about silicon but about what can replace it. I don't think any of the technologies we have can replace silicon successfully for CPU production. Not at this point. There's still some potential in silicon, and manufacturers will not abandon it just yet. We are talking about the future replacement of silicon with other technologies and what comes with that.
With newer, smaller semiconductor production process, performance does not have to be frequency and maybe not even density increases at current levels. Power consumption and efficiency are the big ones here. Granted, there are problems with achieving good results with improvements there as well but R&D is ongoing for that.
Nobody said it is about frequency or density (I assume you are referring to CPU frequency and transistor count?) but about performance. Again, we are not talking about silicon but about the other technologies that will eventually replace it.
LCDs were woeful for a long while compared to CRTs. Yes, cheap bargain-bin CRTs were as bad, but even at the top end it took years for LCDs to catch up. CRTs had their flicker, but decent models allowed high refresh rates which made it OK; a CRT at 100-120Hz was excellent. This took years for LCDs to achieve, especially on something better than a TN panel. CRTs did not have a native resolution, which is something even I find hard to wrap my head around any more after all this time with LCDs. Contrast and reaction speeds of LCDs took years to catch up. If you remember that time, LCDs had huge downsides they had to overcome. Granted, they had size and power efficiency boosting their adoption from the get-go. Screen size also started to play a role at around the same time LCDs got good: CRTs topped out at 20-22" (with the visible area often smaller), while good LCDs were a similar size and gradually cheaper.
LCDs were cheaper and used half the power CRTs used. A 24-inch CRT used 120 watts; you had to go up to a 42-inch LCD TV to draw that much, and that was at the beginning of LCD when they used CCFL (cold cathode fluorescent) backlights (monitors, at least, used CCFL). It is lower now. Ask yourself whether CRTs that offered 100Hz or 120Hz back in the day were actually being used that way. Also, where would CRTs be now? Exactly where we left them so many years ago. Look how LCD has progressed, and there's still more to achieve here. BTW, I remember my first 40-inch Sony Bravia with 100Hz Motionflow. I bought it 10 years ago and it still works :) A TV, not a monitor, which is different. It didn't take long for LCD to get to CRT-like performance. It was different, but it wasn't that different. Besides, my TV's image quality was way better compared to a CRT back then.
 
Fascinating read on that link; no wonder EUV is such a big deal.
There already are other ideas (and have been, for decades). They're just not viable yet.
In fact, I don't think there's a single technical field out there where ideas or even proof of concepts aren't already ahead of what we can mass-manufacture today.
I think it just boils down to which one becomes the next standard that gets adopted; much like console generations, there are always ones leading and ones trailing. Hell, it took quite a long time just for flash memory to really gain traction, and now that it has there is little stopping it for the foreseeable future.
 
Whatever happened to the next big thing, GaAs? It was being touted 20 years ago as silicon's replacement. I guess they kept finding ways to keep silicon going that they never anticipated back then. I know some people are touting photonic devices, but seriously, you can never achieve the size when the wavelengths used are so much larger, and the losses are incredible as you go to smaller wavelengths. You'll never build a photonic CPU, IMO.
 
Since the process node name nowadays is just marketing, foundries can keep decreasing the number without increasing the density, or at least only a little.
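The marketing point shows up in the density numbers. The figures below are rough third-party estimates of logic transistor density (million transistors per mm²), not official vendor data, so treat them as illustrative only:

```python
# Rough public density estimates (million transistors / mm^2) for a few
# nodes. Third-party figures, not official vendor numbers -- illustrative only.
density_mtr_mm2 = {
    "TSMC 7nm (N7)": 91,
    "Intel 10nm": 101,
    "TSMC 5nm (N5)": 173,
}

# Despite the smaller number in the name, "TSMC 7nm" comes out slightly
# *less* dense than "Intel 10nm": the node name is branding, not a
# measured feature size.
ratio = density_mtr_mm2["Intel 10nm"] / density_mtr_mm2["TSMC 7nm (N7)"]
print(f"Intel 10nm vs TSMC 7nm density: {ratio:.2f}x")  # ~1.11x
```

So two nodes three "nanometers" apart in name can be near-identical in density, and the names will only drift further from physical dimensions.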
 
Whatever happened to the next big thing, GaAs? It was being touted 20 years ago as silicon's replacement. I guess they kept finding ways to keep silicon going that they never anticipated back then. I know some people are touting photonic devices, but seriously, you can never achieve the size when the wavelengths used are so much larger, and the losses are incredible as you go to smaller wavelengths. You'll never build a photonic CPU, IMO.
GaAs is a compound of two elements, both with atoms larger than Si. Which one do you think scales more easily to smaller nodes?
And, afaik, GaAs is not more efficient to the point of providing the same performance as Si while using a bigger node.
And to sort of answer your question, I don't know what happened to it.
 
The future is either quantum or biological; it just depends which one can run our legacy code faster. But in the interim we could print on substrates smaller than silicon, it's just more expensive. Carbon nanotubes could be a short-term solution until we can harness the quark.
 
Carbon nanotubes seem like the most obviously probable step beyond silicon, and they'll probably just be used strategically alongside it, very sparsely at first, ramping up the ratio of carbon nanotubes to silicon in the fabrication process as cost decreases. Outside of that, we've got multi-chip modules and 3D stacking, and as others mention, these nodes have a lot of room for refinement and optimization, if Intel's 14nm++++++++++++++++++++ has taught us anything.
 
I know what you mean, but you quoted only the part that suits your argument.
"when you transition to a new technology, you need to get the same performance as the current product or higher. Maybe a bit lower, but promising a lot better in the near future"
LCD had way more to offer in the long term than CRTs, which were basically maxed out.

It is hard for any new tech to surpass the previous one at the start, but the capabilities of the new one are outstanding compared to the old. CRT vs LCD is a good example.

Eh, LCD and TFT had serious issues at the beginning of their era: tearing, output lag (a delay between what your GPU put out and the moment it appeared on screen), dead pixels, screens that could be ruined if a "black" or "white" spot was left for too long (burn-in), and especially response times, which have come a long way. LCDs today are still advertised as < 4ms response time, black to black or so, but a CRT would still be superior to an LCD to this day.

CRT was primarily replaced due to its weight and power requirements. We didn't have to have deep desks for a CRT to fit on anymore; an LCD only takes about a quarter of the space. A TFT was also way more energy efficient.

TFTs are not saints either; screens fade over the course of years. Sometimes I think CRTs, and I mean premium ones, would still stand their ground today compared to LCDs/TFTs. But because of the space you need to install one, mweh.
 
Eh, LCD and TFT had serious issues at the beginning of their era: tearing, output lag (a delay between what your GPU put out and the moment it appeared on screen), dead pixels, screens that could be ruined if a "black" or "white" spot was left for too long (burn-in), and especially response times, which have come a long way. LCDs today are still advertised as < 4ms response time, black to black or so, but a CRT would still be superior to an LCD to this day.

CRT was primarily replaced due to its weight and power requirements. We didn't have to have deep desks for a CRT to fit on anymore; an LCD only takes about a quarter of the space. A TFT was also way more energy efficient.

TFTs are not saints either; screens fade over the course of years. Sometimes I think CRTs, and I mean premium ones, would still stand their ground today compared to LCDs/TFTs. But because of the space you need to install one, mweh.
Maybe focus on the entire product and evaluate again whether CRT monitors (because that is what you are referring to, even though it's not only about gaming, but OK) would be better today compared with the LCDs you can get on the market now.
I've had an LCD (a TV) for 10 years now, and I disagree with your last statement. It depends on the product.
 