
ASML CTO Expects Post High-NA Lithography to be Prohibitively Costly

TheLostSwede

News Editor
In an interview with Bits & Chips, ASML's CTO Martin van den Brink said he believes we might be reaching the end of the road for current semiconductor lithography technology in the not-so-distant future. For the time being, however, ASML is executing on its roadmap: after EUV, the next step is high-NA, or high numerical aperture, and ASML is currently planning to have its first research high-NA scanner ready for a joint R&D venture with Imec in 2023. Assuming everything goes to plan, ASML will then deliver the first R&D machines to its customers in 2024, followed by the first volume production high-NA machines sometime in 2025. Van den Brink points out that current supply chain uncertainties could affect the timing, especially since demand for ASML's EUV machines is high and the two technologies share a lot of components.

As such, current orders are the priority and high-NA development might be put on the back burner if need be, or as Van den Brink puts it, "today's meal takes priority over tomorrow's." High-NA scanners are expected to be even more power hungry than EUV machines, pulling around two megawatts in total, with the extra half megawatt going to the stages. The next step in the evolution of semiconductor lithography is where ASML expects things to get problematic, as what the company is currently calling hyper-NA is expected to be prohibitively costly to manufacture and use. "If the cost of hyper-NA grows as fast as we've seen in high-NA, it will pretty much be economically unfeasible," Van den Brink said. ASML is hoping to overcome the cost issues, but for now the company has a plan for the next decade, and things could very well change during that time and remove some of the obstacles it currently sees.
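For a sense of why numerical aperture is the lever everyone keeps pushing on: the minimum printable feature size follows the Rayleigh criterion, CD = k1 · λ / NA. Below is a minimal sketch of the numbers, assuming the commonly quoted EUV wavelength of 13.5 nm, a k1 of 0.3 and the publicly discussed NA steps; none of these figures come from the interview itself.

```python
# Rough illustration of the Rayleigh resolution criterion: CD = k1 * wavelength / NA.
# The wavelength, k1 and NA values below are commonly quoted approximations,
# not figures taken from the Bits & Chips interview.

WAVELENGTH_EUV_NM = 13.5  # EUV light-source wavelength in nanometres
K1 = 0.3                  # process factor; ~0.25 is the theoretical lower bound

def min_feature_nm(numerical_aperture: float, k1: float = K1) -> float:
    """Smallest printable half-pitch (in nm) for a given numerical aperture."""
    return k1 * WAVELENGTH_EUV_NM / numerical_aperture

for label, na in [("current EUV, 0.33 NA", 0.33),
                  ("high-NA, 0.55 NA", 0.55),
                  ("hyper-NA, ~0.75 NA (speculative)", 0.75)]:
    print(f"{label}: ~{min_feature_nm(na):.1f} nm half-pitch")
```

Each NA step buys a bit more resolution, but the optics, stages and power needed to get there are what drive the cost curve Van den Brink is worried about.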



View at TechPowerUp Main Site | Source
 
Two megawatts per scanner? :eek:
 
Two megawatts per scanner? :eek:

I mean, entire coal plants have been fired up just to power ASIC mining of Bitcoin... no one seems to bat an eye at that, because they are still up and running.
 
Two megawatts per scanner? :eek:
Yeah, it's really quite something.
Finally, Van den Brink doesn’t want to underestimate the complexity of a system that’s larger than a typical transit bus. “It’s a monstrosity. Back in the day, a scanner needed a few hundred kilowatts. For EUV, it’s 1.5 megawatts, primarily because of the light source. We use the same light source for high-NA, but we need an additional 0.5 megawatts for the stages. We use water-cooled copper wires to power them. We push a lot of engineering.”
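To put two megawatts in perspective, here is a quick back-of-envelope on what running one of these around the clock means for the power bill; the utilisation and electricity price are assumed round numbers, not from the interview.

```python
# Back-of-envelope: yearly energy use and electricity cost of a ~2 MW scanner.
# Utilisation and electricity price are assumed round numbers, purely illustrative.

POWER_MW = 2.0          # approximate total draw of a high-NA scanner (per the interview)
HOURS_PER_YEAR = 8760   # fabs aim to keep these tools running around the clock
USD_PER_KWH = 0.10      # assumed industrial electricity price

energy_mwh = POWER_MW * HOURS_PER_YEAR        # ~17,500 MWh per year
cost_usd = energy_mwh * 1_000 * USD_PER_KWH   # ~1.8 million USD per year

print(f"~{energy_mwh:,.0f} MWh/year, roughly ${cost_usd / 1e6:.1f}M/year in electricity")
```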
 
We use water-cooled copper wires to power them.

The future of powering next-gen GPUs...
 
Silicon-based semiconductors have had a nice run. I expect improvements will continue into the next decade, but they will be incremental at best. Stacking chips might extend the run a little further, at which point a total paradigm shift (e.g. photonic chips) will be needed to keep technological improvements coming. IMHO, the mid-2030s will see quite a few C-level executives with worried looks on their faces.
 
Rubbish, I routinely build sub-nm transistors for pennies at home. But I keep losing them, being so tiny and everything :P
 
Rubbish, I routinely build sub-nm transistors for pennies at home. But I keep losing them, being so tiny and everything :p
Have you checked your vacuum cleaner?

Also, if each transistor costs "pennies", plural, that doesn't bode well for the cost of complex ICs with billions of them. Even a single penny would arguably be orders of magnitude too expensive. You need to cut your costs, man!

I mean, entire coal plants have been fired up just to power ASIC mining of Bitcoin... no one seems to bat an eye at that, because they are still up and running.
Somewhat true, but those are powering tens of thousands of GPUs - warehouses full of them, across huge areas. They aren't powering a single ~5*5*20m box.
 
Have you checked your vacuum cleaner?

Also, if each transistor costs "pennies", plural, that doesn't bode well for the cost of complex ICs with billions of them. Even a single penny would arguably be orders of magnitude too expensive. You need to cut your costs, man!


Somewhat true, but those are powering tens of thousands of GPUs - warehouses full of them, across huge areas. They aren't powering a single ~5*5*20m box.

GPUs don't mine Bitcoin, giant ASIC machines do, so you have it wrong entirely. Also, either way it doesn't matter... it's disgusting...
 
Why not build layered CPUs and GPUs, just like 3D NAND flash memories with their 200+ layers?
Most likely huge power consumption and extremely low yields?
 
GPUs don't mine Bitcoin, giant ASIC machines do, so you have it wrong entirely. Also, either way it doesn't matter... it's disgusting...
Well, for Bitcoin you're right, though those coal plants are AFAIK for crypto mining more generally. Miners don't tend to be picky.

Also, ASIC miners aren't "giant", they're the size of a small suitcase at most.
 
Why not build layered CPUs and GPUs, just like 3D NAND flash memories with their 200+ layers?
Most likely huge power consumption and extremely low yields?

Maybe they plan to? They have to milk incremental gains to appease the shareholders. If you came out with your best possible CPU today, why would I invest in your company four years from now, when the market is saturated and no follow-up CPU matters anymore?
 
Why not build layered CPUs and GPUs, just like 3D NAND flash memories with their 200+ layers?
Most likely huge power consumption and extremely low yields?
Thermals. NAND has next to no power consumption and heat output, while CPUs and GPUs generate tons of heat. Trapping that heat under layers of more silicon is generally what you would call a bad idea. Even if each layer consumed just a few watts you'd reach uncontrollable thermals very quickly.
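A crude way to see the problem: every layer's heat has to leave through roughly the same cooled surface, so the heat flux at that surface grows with the layer count. A minimal sketch with purely illustrative numbers; the die size, per-layer power and cooling limit are all assumptions.

```python
# Crude illustration of why stacking logic dies runs into thermals:
# heat from every layer exits through (roughly) one cooled surface.
# All numbers below are assumed and purely illustrative.

DIE_AREA_CM2 = 1.0               # assumed die footprint
WATTS_PER_LAYER = 50.0           # assumed power dissipated by each logic layer
COOLING_LIMIT_W_PER_CM2 = 150.0  # rough ceiling for conventional air/liquid cooling

for layers in (1, 2, 4, 8):
    flux = layers * WATTS_PER_LAYER / DIE_AREA_CM2
    verdict = "manageable" if flux <= COOLING_LIMIT_W_PER_CM2 else "beyond conventional cooling"
    print(f"{layers} layer(s): {flux:.0f} W/cm^2 -> {verdict}")
```

NAND gets away with stacking hundreds of layers because each layer dissipates next to nothing, not tens of watts.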
 
Have you checked your vacuum cleaner?

Also, if each transistor costs "pennies", plural, that doesn't bode well for the cost of complex ICs with billions of them. Even a single penny would arguably be orders of magnitude too expensive. You need to cut your costs, man!


Somewhat true, but those are powering tens of thousands of GPUs - warehouses full of them, across huge areas. They aren't powering a single ~5*5*20m box.
You mean 2p x 10M transistors = 1 expensive CPU? I figure once you start making them en masse, you can halve the production cost :P
(damn, I'm on fire today)
 
Why not build layered CPUs and GPUs, just like 3D NAND flash memories with their 200+ layers?
Most likely huge power consumption and extremely low yields?
Because it's hard to get the same performance? Cooling is also an issue, as NAND flash runs ice cold in comparison to CPUs and GPUs.
This is why things like CoWoS are more likely to happen first.
 
Thermals. NAND has next to no power consumption and heat output, while CPUs and GPUs generate tons of heat. Trapping that heat under layers of more silicon is generally what you would call a bad idea. Even if each layer consumed just a few watts you'd reach uncontrollable thermals very quickly.
Liquid cooling on BOTH sides of a CPU/GPU will solve some of those problems. GPUs are already getting ridiculously expensive anyway...
 
Liquid cooling on BOTH sides of a CPU/GPU will solve some of those problems. GPUs are already getting ridiculously expensive anyway...
Sounds like a great concept. Maybe we could patent a liquid chip substrate? I'm sure having hundreds or thousands of wires running through a cooling liquid is completely fine!

You mean 2p x 10M transistors = 1 expensive CPU? I figure once you start making them en masse, you can halve the production cost :p
(damn, I'm on fire today)
You might want to pitch this production process to Nvidia, they would no doubt be interested.
 
Compact Disk x CTO x k1 x Lambda/North America = pretty much be economically unfeasible

wow

who knew?!?! ;)
 