Friday, May 19th 2023
Artificial Intelligence Helped Tape Out More than 200 Chips
In its recent second-quarter fiscal year 2023 earnings call, Synopsys shared interesting information about chip developers' growing use of artificial intelligence. As the call notes, more than 200 chips have been taped out using the Synopsys DSO.ai place-and-route (PnR) tool, making it a commercially proven AI chip design tool. DSO.ai uses AI to optimize the placement and routing of a chip's transistors so that the layout is compact and efficient while meeting the strict timing constraints of modern chips. According to Aart J. de Geus, CEO of Synopsys, "By the end of 2022, adoption, including 9 of the top 10 semiconductor vendors have moved forward at great speed with 100 AI-driven commercial tape-outs. Today, the tally is well over 200 and continues to increase at a very fast clip as the industry broadly adopts AI for design from Synopsys."
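To give a sense of the class of problem a PnR tool tackles, here is a minimal toy sketch (not Synopsys's actual method or API) of placement optimization: a handful of hypothetical cells are placed on a grid, and simulated annealing searches for a layout that minimizes total wirelength between connected cells. All names and parameters are illustrative assumptions; real tools like DSO.ai operate at vastly larger scale and optimize timing, power, and area together.

```python
# Toy illustration only -- NOT how DSO.ai works internally.
# Simulated-annealing placement: swap cells on a grid to shrink
# total half-perimeter wirelength (HPWL) of the connecting nets.
import math
import random

def wirelength(pos, nets):
    # HPWL: for each net, the bounding-box half-perimeter of its pins.
    total = 0
    for net in nets:
        xs = [pos[c][0] for c in net]
        ys = [pos[c][1] for c in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

def anneal_placement(cells, nets, grid=8, steps=5000, seed=0):
    rng = random.Random(seed)
    # Start from a random legal placement: one cell per grid site.
    sites = rng.sample([(x, y) for x in range(grid) for y in range(grid)],
                       len(cells))
    pos = dict(zip(cells, sites))
    cost = wirelength(pos, nets)
    temp = 2.0
    for _ in range(steps):
        a, b = rng.sample(cells, 2)
        pos[a], pos[b] = pos[b], pos[a]      # propose swapping two cells
        new_cost = wirelength(pos, nets)
        # Accept improvements always; accept worse moves with a
        # probability that shrinks as the "temperature" cools.
        if new_cost < cost or rng.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost
        else:
            pos[a], pos[b] = pos[b], pos[a]  # revert the swap
        temp *= 0.999
    return pos, cost

# Six hypothetical cells wired in a ring.
cells = list("ABCDEF")
nets = [("A", "B"), ("B", "C"), ("C", "D"),
        ("D", "E"), ("E", "F"), ("F", "A")]
placement, final_cost = anneal_placement(cells, nets)
```

Commercial tools replace this brute-force-style search with learned strategies that explore the design space far more efficiently, which is what makes AI-driven PnR attractive for tape-outs on advanced nodes.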
This milestone suggests that customers are seeing real benefits from AI-assisted tools like DSO.ai. However, the company is not stopping there, and a whole suite of tools is getting an AI makeover. "We unveiled the industry's first full-stack AI-driven EDA suite, Synopsys.ai," noted the CEO, adding that "Specifically, in parallel to second-generation advances in DSO.ai we announced VSO.ai, which stands for verification space optimization; and TSO.ai, test space optimization. In addition, we are extending AI across the design stack to include analog design and manufacturing." Synopsys' partners in this include NVIDIA, TSMC, MediaTek, Renesas, and IBM Research, all of which have used AI-assisted tools in their chip design efforts. A much wider range of industry players is expected to adopt these tools as chip design costs continue to soar with node scaling. With a future 3 nm GPU design estimated to cost $1.5 billion, roughly 40% of which is software, Synopsys plans to take a cut of that spending.
Sources:
Synopsys Call, via Tom's Hardware
14 Comments on Artificial Intelligence Helped Tape Out More than 200 Chips
"Robots building robots? Now that's just stupid."
You have to go back to the Seventies to find computer scientists using slide rules, pencils and notepads to design computers, visionaries like Gene Amdahl (RIP).
Newer machine learning algorithms are better optimized for processing the large number of calculations in today's semiconductor designs, but there's nothing new about using computer chips to design computer chips.
You can even invest in companies that create products to design and manufacture semiconductors like Cadence Design, Applied Materials, ASML, etc.
Capitalism has built-in slowing factors to milk maximum profit. Intel and AMD are a duopoly, and will continue to throw punches at each other that only do light damage. They probably could have done heavy damage one way or another by releasing some node way ahead of schedule, but that means fewer sales in the long term, because eventually the node gets so small they no longer have anywhere to go.
Nations have been working on scramjet propulsion for decades. There is also the issue of having the flesh bags survive speeds of Mach 10, never mind having them do maneuvers that generate g-forces that will make blood squirt out of your toes.
A side effect is that you might start to get CPUs with functions that perhaps nobody understands, or that go beyond our understanding.
It's dangerous.
Things are not infinite; diminishing returns happen. You see it everywhere. And there is even a point where the investment exceeds the benefits; look at fossil fuels and climate.
What we're really good at is incrementally expanding our toolset to get a step further, but we're also seeing that every step forward is becoming a harder, bigger one to make. Look at chip nodes! It's more visible there than anywhere else. The problem behind that is that we've built societies and economies/systems on the idea that we can keep going forward, ever bigger. At the same time, it's only natural that we want a complete understanding of all the things, being who we are.
I also think that we don't innovate less - we innovate more, but the innovations themselves are most of the time of a more dubious nature. Hit or miss, if you will, and the factors that influence that aren't always the merits of the innovation itself, but more the environment, mindset, time, etc. around it. Even the tremendous speed at which all these small innovations follow on each other is a factor we can't deny. There simply isn't enough time to consume all the info. A great example of that is the LHC. No innovation, you say? This project puts the finger on it perfectly: we're trying to find the last puzzle pieces, and it's damn hard.