
CSPs to Expand into Edge AI, Driving Average NB DRAM Capacity Growth by at Least 7% in 2025

TheLostSwede

News Editor
TrendForce has observed that in 2024, major CSPs such as Microsoft, Google, Meta, and AWS will continue to be the primary buyers of high-end AI servers, which are crucial for LLM training and AI modeling. After establishing significant AI training server infrastructure in 2024, these CSPs are expected to actively expand into edge AI in 2025. This expansion will include the development of smaller LLM models and the deployment of edge AI servers to facilitate AI applications across various sectors, such as manufacturing, finance, healthcare, and business.

Moreover, AI PCs or notebooks share a similar architecture to AI servers, offering substantial computational power and the ability to run smaller LLM and generative AI applications. These devices are anticipated to serve as the final bridge between cloud AI infrastructure and edge AI for small-scale training or inference applications.




Recent developments in AI PC chipsets include Intel's Lunar Lake and AMD's Strix Point, both of which were unveiled at COMPUTEX this month and meet AI PC standards. Although PCs equipped with these SoCs are not expected to ship until Q3 and Q4 of this year, respectively, they were the focal point of the event.

Brands such as ASUS and Acer, which launched models with Qualcomm's Snapdragon X Elite in May, also introduced models featuring Lunar Lake and Strix Point during COMPUTEX, while MSI showcased Lunar Lake and Strix Point models as well. These new AI PCs are priced between US$1,399 and $1,899.

AI applications are predicted to mature in 2025 and become capable of handling complex tasks, improving user experiences, and boosting productivity. This will cause consumer demand for smarter and more efficient devices to surge, driving the penetration rate of AI notebooks to 20.4% and significantly boosting DRAM content.

TrendForce estimates that the average NB DRAM capacity will grow from 10.5 GB in 2023 to 11.8 GB in 2024, marking a 12% increase. By 2025, as the penetration rate of AI notebooks rises from 1% in 2024 to 20.4%, with each AI notebook equipped with at least 16 GB of DRAM, the overall average capacity is expected to grow by 0.8 GB, for an increase of at least 7%.
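The figures above can be sanity-checked with a simple weighted average. The baseline capacity assumed below for non-AI notebooks (11.7 GB) is an inference from TrendForce's numbers, not a figure they published:

```python
# Back-of-envelope check of the projected 2025 NB DRAM average.
# Assumption: non-AI notebooks average ~11.7 GB (inferred, not from TrendForce).
ai_share = 0.204        # projected AI-notebook penetration in 2025
ai_capacity = 16.0      # GB, minimum DRAM per AI notebook
non_ai_capacity = 11.7  # GB, assumed baseline for the remaining notebooks

# Weighted average across the 2025 notebook mix
avg_2025 = ai_share * ai_capacity + (1 - ai_share) * non_ai_capacity

avg_2024 = 11.8                         # GB, TrendForce's 2024 estimate
growth_gb = avg_2025 - avg_2024         # absolute capacity growth
growth_pct = growth_gb / avg_2024 * 100 # relative growth

print(f"2025 average: {avg_2025:.1f} GB, +{growth_gb:.1f} GB ({growth_pct:.0f}%)")
# → 2025 average: 12.6 GB, +0.8 GB (7%)
```

Under that assumed baseline, the weighted average lands at roughly 12.6 GB, reproducing the ~0.8 GB (at least 7%) growth TrendForce cites.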

Furthermore, TrendForce notes that the rise of AI notebooks will not only drive up average NB DRAM capacity but also increase demand for power-efficient, high-frequency memory. In this scenario, LPDDR will have a distinct advantage over DDR, accelerating the trend of the former replacing the latter. For models that prioritize scalability, LPCAMM presents a viable alternative beyond soldered LPDDR and traditional DDR SO-DIMM options.

View at TechPowerUp Main Site | Source