Microsoft Working on Custom AI Processor Codenamed Project Athena
According to The Information, Microsoft has been working on custom processors for AI under a project codenamed Athena. Built on TSMC's 5 nm process, these chips are designed to accelerate AI workloads and scale to hundreds or even thousands of units. With the boom of Large Language Models (LLMs) that require billions of parameters, training them demands a rapid increase in computational power, to the point where companies purchase hundreds of thousands of GPUs from the likes of NVIDIA. Still, building custom silicon is hardly uncharted territory for a company of Microsoft's scale. Hyperscalers like AWS, Google, and Meta have already invested in creating processors for AI training, and Microsoft is now joining them.
While we don't have much information about these processors, we know that Microsoft started the project in 2019, and today the chips are in the hands of select Microsoft and OpenAI employees who work on AI projects and need the computational horsepower. Interestingly, some projections suggest that if Microsoft could match NVIDIA's GPU performance, the cost would be only a third of NVIDIA's offerings. However, such estimates are hard to verify until more information is available. Microsoft plans to make these chips more widely available as early as next year, though there is no specific word on when or how; Azure cloud customers would be the most logical place to start.