T0@st
News Editor
- Joined: Mar 7, 2023
- Messages: 2,567 (3.52/day)
- Location: South East, UK
| System Name | The TPU Typewriter |
|---|---|
| Processor | AMD Ryzen 5 5600 (non-X) |
| Motherboard | GIGABYTE B550M DS3H Micro ATX |
| Cooling | DeepCool AS500 |
| Memory | Kingston Fury Renegade RGB 32 GB (2 x 16 GB) DDR4-3600 CL16 |
| Video Card(s) | PowerColor Radeon RX 7800 XT 16 GB Hellhound OC |
| Storage | Samsung 980 Pro 1 TB M.2-2280 PCIe 4.0 x4 NVMe SSD |
| Display(s) | Lenovo Legion Y27q-20 27" QHD IPS monitor |
| Case | GameMax Spark M-ATX (re-badged Jonsbo D30) |
| Audio Device(s) | FiiO K7 Desktop DAC/Amp + Philips Fidelio X3 headphones, or ARTTI T10 Planar IEMs |
| Power Supply | ADATA XPG CORE Reactor 650 W 80+ Gold ATX |
| Mouse | Roccat Kone Pro Air |
| Keyboard | Cooler Master MasterKeys Pro L |
| Software | Windows 10 64-bit Home Edition |
Microsoft will roll out "NPU-optimized versions of DeepSeek-R1" directly to Copilot+ PCs; yesterday's announcement revealed that Qualcomm Snapdragon X-equipped systems will be first in line to receive support. Owners of devices with Intel Core Ultra 200V "Lunar Lake" processors will have to wait a little longer, and reports suggest that AMD Ryzen AI 9 HX-based Copilot+ PCs will be third in Microsoft's queue. Interestingly, Team Red has already published DeepSeek R1 model guides for Radeon RX graphics cards and Ryzen AI processors. Microsoft's first release will be based on DeepSeek-R1-Distill-Qwen-1.5B, made available in AI Toolkit; larger 7B and 14B variants have been teased and are expected to "arrive soon."
Microsoft reckons that the optimized models will "let developers build and deploy AI-powered applications that run efficiently on-device, taking full advantage of the powerful Neural Processing Units (NPUs) in Copilot+ PCs." The on-board AI-crunching solution is advertised as a tool for empowerment, allowing "developers to tap into powerful reasoning engines to build proactive and sustained experiences. With our work on Phi Silica, we were able to harness highly efficient inferencing, delivering very competitive time to first token and throughput rates, while minimally impacting battery life and consumption of PC resources." Western companies appear to be racing to adopt DeepSeek's open-source model, due to apparent cost benefits. Certain North American organizations have disclosed their own views and reservations, but others will happily pay less for a potent alternative to locally developed systems. In a separate bulletin (also posted on January 29), Microsoft's AI platform team revealed that a cloud-hosted DeepSeek R1 model is available on Azure AI Foundry and GitHub.
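The bulletin doesn't spell out the client-side call for the cloud-hosted model, but deployments of this kind typically expose an OpenAI-style chat-completions endpoint over HTTPS. As a minimal sketch, assuming such an endpoint, here is how a request body could be assembled with only the Python standard library; the endpoint URL, API key, and sampling parameters below are placeholders, not values from Microsoft's announcement:

```python
# Hedged sketch: assembling a chat-completions request for a cloud-hosted
# DeepSeek R1 deployment. All endpoint/key values are placeholders.
import json
import urllib.request


def build_chat_request(endpoint: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Assemble an OpenAI-style chat-completions HTTP request (not yet sent)."""
    body = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 512,      # assumed sensible defaults, not documented values
        "temperature": 0.6,
    }
    return urllib.request.Request(
        url=endpoint,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Placeholder endpoint; a real deployment supplies its own URL and key.
req = build_chat_request(
    "https://example.invalid/chat/completions",
    "YOUR-API-KEY",
    "Explain what a distilled 1.5B model is.",
)
# Actually sending the request is omitted here;
# urllib.request.urlopen(req) would perform the call.
```

The sketch only builds the request object, so it can be inspected or adapted without network access; swapping in a real deployment's URL and key is all that a live call would need, assuming the endpoint follows the common chat-completions shape.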
View at TechPowerUp Main Site | Source