Microsoft has reportedly set 16 GB as the minimum system requirement for AI PCs, according to a TrendForce market research report. To say that Microsoft plays a pivotal role in shaping PC hardware specs is an understatement. This year sees the introduction of the first "AI PCs": PCs with on-device AI acceleration for several new features native to Windows 11 23H2, chiefly Microsoft Copilot. From the looks of it, Copilot is receiving the highest corporate attention at Microsoft, as the company looks to integrate the AI chatbot, which automates and generates work, into the mainstream PC. Microsoft is even pushing for a dedicated Copilot key on PC keyboards, along the lines of the key that brings up the Start menu. The company's biggest move with Copilot will be the 2024 introduction of Copilot Pro, an AI assistant integrated with Office apps and Microsoft 365, which the company plans to sell on a subscription-only basis.
Besides cloud-based acceleration, Microsoft's various AI features will rely on some basic hardware specs for local acceleration. One of them, of course, is the NPU, with Intel's AI Boost and AMD's Ryzen AI debuting in their latest mobile processors. The other requirement is memory. AI acceleration is a highly memory-sensitive workload, and LLMs require a sizable amount of fast, frequently accessed memory. Microsoft has therefore arrived at 16 GB as the bare minimum amount of memory not just for native acceleration, but also for cloud-based Copilot AI features to work. This should see the notebooks of 2024 set 16 GB as their baseline memory spec, with commercial notebooks scaling up to 32 GB or even 64 GB, depending on organizational requirements. The development bodes particularly well for the DRAM industry.
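To put the 16 GB floor in rough perspective, here is a minimal back-of-envelope sketch of the weights-only RAM footprint of an on-device LLM at common quantization levels. The 7-billion-parameter size and the precision choices are illustrative assumptions, not figures from the report; real usage is higher once activations and the KV cache are counted.

```python
# Illustrative estimate of weights-only memory for a local LLM.
# Parameter count and precisions are hypothetical examples.

def model_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Weights-only footprint in GiB; activations and KV cache add more."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# A hypothetical 7B-parameter on-device model:
for label, bytes_pp in [("FP16", 2.0), ("INT8", 1.0), ("INT4", 0.5)]:
    print(f"7B @ {label}: {model_memory_gb(7, bytes_pp):.1f} GiB")
```

At FP16 the weights alone land around 13 GiB, which illustrates why 8 GB machines are a non-starter and why 16 GB reads as a bare minimum rather than a comfortable target.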
View at TechPowerUp Main Site | Source