Friday, August 30th 2024
Microsoft Unveils New Details on Maia 100, Its First Custom AI Chip
Microsoft provided a detailed look at Maia 100, its first custom AI accelerator, at Hot Chips 2024. The chip is designed as a vertically integrated, end-to-end system aimed at improving performance and lowering cost: it pairs custom server boards and purpose-built racks with a software stack built to raise the efficiency and capability of advanced AI services such as Azure OpenAI. Microsoft first introduced Maia at Ignite 2023, revealing that it had developed its own AI accelerator, and shared additional details at this year's Build developer conference. Maia 100 is one of the largest processors fabricated on TSMC's 5 nm node and is built to run large-scale AI workloads on the Azure platform.
Sources:
ServeTheHome, Microsoft Blog
Maia 100 SoC architecture features:
- A high-speed tensor unit (16xRx16) offers rapid processing for training and inferencing while supporting a wide range of data types, including low precision data types such as the MX data format, first introduced by Microsoft through the MX Consortium in 2023 (a block-scaled format illustrated in the sketch after this list).
- The vector processor is a loosely coupled superscalar engine built with custom instruction set architecture (ISA) to support a wide range of data types, including FP32 and BF16.
- A Direct Memory Access (DMA) engine supports different tensor sharding schemes.
- Hardware semaphores enable asynchronous programming on the Maia system.
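The MX data format referenced above is a block-scaled ("microscaling") representation in which a small group of elements shares a single power-of-two scale factor, letting the tensor unit work on very low-precision elements while retaining dynamic range. The sketch below is a minimal, illustrative take on that idea in Python; the block size of 32, the int8 element type, and the rounding choices are assumptions for demonstration and do not describe Maia 100's hardware behavior or the exact OCP MX element formats.

```python
# Minimal sketch of block-scaled ("microscaling") quantization in the spirit
# of the MX formats mentioned above. Block size, element width, and rounding
# are illustrative assumptions, not Maia 100 hardware behavior.
import numpy as np

BLOCK = 32        # assumed number of elements sharing one scale
ELEM_BITS = 8     # assumed element type: signed 8-bit integer

def mx_quantize(x: np.ndarray):
    """Quantize a 1-D float32 array into (per-block power-of-two scales, int8 elements)."""
    x = x.astype(np.float32)
    pad = (-len(x)) % BLOCK
    blocks = np.pad(x, (0, pad)).reshape(-1, BLOCK)

    # Choose one shared power-of-two scale per block so the largest
    # magnitude in the block fits the element range.
    max_abs = np.max(np.abs(blocks), axis=1, keepdims=True)
    max_abs[max_abs == 0] = 1.0
    exp = np.ceil(np.log2(max_abs / (2 ** (ELEM_BITS - 1) - 1)))
    scales = (2.0 ** exp).astype(np.float32)

    elems = np.clip(np.round(blocks / scales),
                    -(2 ** (ELEM_BITS - 1)),
                    2 ** (ELEM_BITS - 1) - 1).astype(np.int8)
    return scales, elems

def mx_dequantize(scales: np.ndarray, elems: np.ndarray, n: int) -> np.ndarray:
    """Reconstruct an approximate float32 array of length n."""
    return (elems.astype(np.float32) * scales).reshape(-1)[:n]

if __name__ == "__main__":
    data = np.random.randn(1000).astype(np.float32)
    scales, elems = mx_quantize(data)
    approx = mx_dequantize(scales, elems, len(data))
    print("max abs reconstruction error:", np.max(np.abs(data - approx)))
```

The design point this illustrates is why such formats suit accelerators: the bulk of the data is stored and multiplied at low precision, while only one scale per block carries the dynamic range.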
Maia 100 Specs:
- Chip Size: 820 mm²
- Design to TDP: 700 W
- Provision TDP: 500 W
- Packaging: TSMC N5 process with CoWoS-S interposer technology
- HBM BW/Cap: 1.8 TB/s @ 64 GB HBM2E
- Peak Dense Tensor POPS: 6-bit: 3, 9-bit: 1.5, BF16: 0.8
- L1/L2: 500 MB
- Backend Network BW: 600 GB/s (12x 400 GbE)
- Host BW (PCIe): 32 GB/s (PCIe Gen5 x8)
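For a rough sense of how these figures relate, the snippet below divides the quoted peak dense tensor throughput by the HBM bandwidth to get a back-of-the-envelope ops-per-byte ratio. It is simple arithmetic over the numbers in the list above and says nothing about achievable real-world utilization.

```python
# Back-of-the-envelope arithmetic from the spec list above: peak dense tensor
# throughput divided by HBM bandwidth gives a rough ops-per-byte ratio.
PEAK_POPS = {"6-bit": 3.0, "9-bit": 1.5, "BF16": 0.8}   # peta-ops per second
HBM_BW_TBPS = 1.8                                        # terabytes per second

for fmt, pops in PEAK_POPS.items():
    ops_per_byte = (pops * 1e15) / (HBM_BW_TBPS * 1e12)
    print(f"{fmt:>6}: ~{ops_per_byte:,.0f} peak ops per HBM byte")
```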