Wednesday, July 17th 2024
Global AI Server Demand Surge Expected to Drive 2024 Market Value to US$187 Billion; Represents 65% of Server Market
TrendForce's latest industry report on AI servers reveals that high demand for advanced AI servers from major CSPs and brand clients is expected to continue in 2024. Meanwhile, TSMC, SK hynix, Samsung, and Micron's gradual production expansion has significantly eased shortages in 2Q24. Consequently, the lead time for NVIDIA's flagship H100 solution has decreased from the previous 40-50 weeks to less than 16 weeks.
TrendForce estimates that AI server shipments in the second quarter will increase by nearly 20% QoQ, and has revised its annual shipment forecast up to 1.67 million units, marking 41.5% YoY growth.

TrendForce notes that major CSPs continue to focus their budgets on procuring AI servers this year, which is crowding out the growth momentum of general servers. Compared to the high growth rate of AI servers, the annual growth rate of general server shipments is only 1.9%. The share of AI servers in total server shipments is expected to reach 12.2%, an increase of about 3.4 percentage points from 2023.
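As a rough sanity check, the stated growth rate and shipment share imply the following 2023 baseline and total-market size (a minimal back-of-the-envelope sketch; the derived numbers are inferred from the quoted figures, not taken from the report):

```python
# Figures implied by the shipment forecast above. The 2023 baseline and
# total 2024 shipments are derived here for illustration only; they are
# not quoted directly by TrendForce.

ai_2024 = 1.67e6        # forecast 2024 AI server shipments (units)
ai_yoy = 0.415          # 41.5% YoY growth
ai_share_2024 = 0.122   # AI servers' share of total 2024 server shipments

ai_2023 = ai_2024 / (1 + ai_yoy)
total_2024 = ai_2024 / ai_share_2024

print(f"Implied 2023 AI server shipments: {ai_2023 / 1e6:.2f} M units")
print(f"Implied 2024 total server shipments: {total_2024 / 1e6:.1f} M units")
```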
In terms of market value, AI servers are contributing significantly more to revenue growth than general servers. The market value of AI servers is projected to exceed $187 billion in 2024, a growth rate of 69%, accounting for 65% of the total server market value.
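The same arithmetic applied to the value figures (again a sketch; the 2023 value and the total market value are inferred from the quoted growth rate and share):

```python
# Figures implied by the stated 2024 AI server market value,
# its 69% growth rate, and its 65% share of total server value.

ai_value_2024 = 187e9   # USD, projected 2024 AI server market value
value_yoy = 0.69
ai_value_share = 0.65

ai_value_2023 = ai_value_2024 / (1 + value_yoy)
total_value_2024 = ai_value_2024 / ai_value_share

print(f"Implied 2023 AI server market value: ${ai_value_2023 / 1e9:.0f} B")
print(f"Implied 2024 total server market value: ${total_value_2024 / 1e9:.0f} B")
```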
North American CSPs (e.g., AWS, Meta) are continuously expanding their proprietary ASICs, and Chinese companies like Alibaba, Baidu, and Huawei are actively expanding their own ASIC AI solutions. This is expected to increase the share of ASIC servers in the total AI server market to 26% in 2024, while mainstream GPU-equipped AI servers will account for about 71%.
In terms of AI chip suppliers for AI servers, NVIDIA holds the highest market share—approaching 90% for GPU-equipped AI servers—while AMD's market share is only about 8%. However, when including all AI chips used in AI servers (GPU, ASIC, FPGA), NVIDIA's market share this year is around 64%.
TrendForce observes that demand for advanced AI servers is expected to remain strong through 2025, especially with NVIDIA's next-generation Blackwell platform (including GB200 and B100/B200) set to replace Hopper as the market mainstream. This will also drive demand for CoWoS packaging and HBM. NVIDIA's B100 chip size will be double that of the H100, consuming more CoWoS capacity. The production capacity of major supplier TSMC's CoWoS is estimated to reach 550-600K units by the end of 2025, a growth rate approaching 80%.
The mainstream H100 in 2024 is equipped with 80 GB of HBM3. By 2025, main chips such as NVIDIA's Blackwell Ultra and AMD's MI350 are expected to carry up to 288 GB of HBM3e, tripling per-unit usage. Overall HBM supply is expected to double by 2025 on strong ongoing demand in the AI server market.
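Put as simple multiples (a sketch; the end-2024 CoWoS baseline is inferred from the roughly 80% growth figure rather than stated above):

```python
# Per-chip HBM capacity step-up, and the end-2024 CoWoS capacity implied
# by ~80% growth to 550-600K units at the end of 2025.

hbm_multiple = 288 / 80                      # 288 GB HBM3e vs. 80 GB HBM3 on the H100
print(f"Per-chip HBM capacity multiple: {hbm_multiple:.1f}x")

for end_2025_capacity in (550e3, 600e3):
    end_2024_est = end_2025_capacity / 1.8   # back out ~80% growth
    print(f"Implied end-2024 CoWoS capacity: ~{end_2024_est / 1e3:.0f}K units")
```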
Source:
TrendForce
7 Comments on Global AI Server Demand Surge Expected to Drive 2024 Market Value to US$187 Billion; Represents 65% of Server Market
This doesn't take export restrictions into account of course.
I don’t think the two are analogous.
So I think there's a real possibility investors will realize AI isn't as useful as it was sold to be, and that throwing money at solutions doesn't magically make hallucinations and other AI problems go away... And the response could be severe; non-tech investors tend to overreact when it comes to technology.