Tuesday, February 11th 2025
![AMD](https://tpucdn.com/images/news/amd-v1739475473466.png)
AMD Reiterates Belief that 2025 is the Year of the AI PC
AI PC capabilities have evolved rapidly in the two years since AMD introduced the first x86 AI PC CPUs at CES 2023. New neural processing units have debuted, pushing available performance from a peak of 10 AI TOPS at the launch of the AMD Ryzen 7 7840U processor to a peak of 50+ TOPS on the latest AMD Ryzen AI Max PRO 300 Series processors. A wide range of software and hardware companies have announced various AI development plans or brought AI-infused products to market, while major operating system vendors like Microsoft are actively working to integrate AI into the operating system via Copilot+ PC capabilities. AMD is at the forefront of those efforts and is working closely with Microsoft to deliver Copilot+ for Ryzen AI and Ryzen AI PRO PCs.
In the report "The Year of the AI PC is 2025," Forrester lays out its argument for why this year is likely to bring significant changes for AI PCs. Forrester defines the term "AI PC" to mean any system "embedded with an AI chip and algorithms specifically designed to improve the experience of AI workloads across the computer processing unit (CPU), graphics processing unit (GPU), and neural processing unit (NPU)." This includes AMD products, as well as competing products made by both x86 and non-x86 CPU manufacturers. 2025 represents a turning point for these efforts, both in terms of hardware and software, and this Forrester report is an excellent deep dive into why AI PCs represent the future for enterprise computing.Why 2025 is the Year of the AI PC
First, and arguably most prosaically, 2025 is the year of the AI PC because CPU designers like AMD are baking dedicated AI processing into new products just as Windows 10 is approaching the formal end of support. While consumers and businesses will have the option to buy extra years of security updates, October 14, 2025 is the end of free support and the likely transition date to Windows 11 for millions of companies worldwide that don't want to pay an increasingly large yearly premium for the privilege of running an older operating system. Many of the Windows 11 commercial PCs purchased in the next 12 months are going to support AI via dedicated NPU hardware and will therefore qualify as AI PCs. According to IDC, 93.9% of commercial PCs will be AI PCs by 2028.
The advantage of this distribution model is that customers and early adopters are not being asked to sacrifice the CPU and GPU efficiency improvements they would otherwise expect from new systems in exchange for AI compatibility. The AI PC platforms AMD launched throughout 2024 and into 2025, including the AMD Ryzen AI PRO 300 Series and the Ryzen AI Max PRO Series, feature improvements like new CPU cores based on the "Zen 5" microarchitecture, AMD RDNA 3.5 GPU cores, and AMD PRO Technologies to provide security features, manageability tools, and enterprise-grade stability and reliability. These advances arrived alongside the NPU and its dedicated AI processing capability, ensuring AMD customers continue to benefit from new systems in the legacy non-AI workloads they already use.
Second, AI PCs offer a flexible platform that executes emerging AI workloads across the CPU, GPU, and NPU, depending on where each will run best. It's not hard to see the parallels between NPU development and the GPU-powered 3D revolution that kicked off back in the mid-1990s. While specialized 3D graphics initially required the use of discrete graphics accelerator cards, this hardware eventually moved on-die and was integrated directly alongside the CPU. This allowed AMD to dramatically improve the performance and capability of the integrated graphics processor, or iGPU, without driving up cost. Many applications now rely on the iGPU for tasks that previously required a discrete card, and graphics capabilities have become an integral part of a CPU's overall value proposition. Adding an NPU to the CPU and steadily increasing its performance is in line with this innovation. Including an NPU gives developers a dedicated platform for running AI workloads, one that's tightly coupled to the CPU and that can be expected to steadily improve over time.
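To make that CPU/GPU/NPU dispatch concrete, here is a minimal sketch (not from the article) of how an application might prefer the NPU through ONNX Runtime and fall back to the GPU or CPU when a dedicated provider isn't available. The provider names reflect a typical Windows setup (Vitis AI for Ryzen AI NPUs, DirectML for GPUs), and "model.onnx" plus the dummy input are placeholders; the exact stack depends on the Ryzen AI software release in use.

```python
import numpy as np
import onnxruntime as ort

# Prefer the NPU, then the GPU, then the plain CPU, in that order,
# keeping only the execution providers this machine actually exposes.
preferred = ["VitisAIExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in ort.get_available_providers()]

# "model.onnx" stands in for whatever network the application ships.
session = ort.InferenceSession("model.onnx", providers=providers)
print("Running on:", session.get_providers()[0])

# One inference with a dummy input; the real name, shape, and dtype
# obviously depend on the model being loaded.
inp = session.get_inputs()[0]
x = np.zeros([d if isinstance(d, int) else 1 for d in inp.shape], dtype=np.float32)
outputs = session.run(None, {inp.name: x})
```

The same model file can therefore land on whichever engine the machine exposes, which is the flexibility described above.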
Today, AI PCs can already offload background tasks that would otherwise require CPU or GPU processing time. More advanced use cases are rapidly proliferating. AMD introduced the first x86 AI PCs in 2023. Two years later, the company's CES 2025 exhibition showcased multiple practical demos. This rapid adoption highlights the NPU's potential efficiency and strength as a workload execution environment compared to the more traditional CPU or GPU.
While CES is largely a consumer-focused show, the broad discussion of AI across both consumer and commercial workloads shows the appeal of integrating artificial intelligence across the entire software market. According to its report, Forrester expects AI and AI PCs to take significant strides towards wide deployment over the next 12 months. This will happen thanks to the Windows 11 refresh cycle, wider AI incorporation across the software industry, and because the increased availability of AI PCs will give companies a cost-effective platform for less intense local AI workloads.
What all of this adds up to is an exciting time for enterprise computing. AI PCs have the potential to revolutionize the industry, and AMD offers a complete lineup for all end-user needs that gives companies extensive options to evaluate how they want to engage with that opportunity.
Source: AMD Community Blog
In the report "The Year of the AI PC is 2025," Forrester lays out its argument for why this year is likely to bring significant changes for AI PCs. Forrester defines the term "AI PC" to mean any system "embedded with an AI chip and algorithms specifically designed to improve the experience of AI workloads across the computer processing unit (CPU), graphics processing unit (GPU), and neural processing unit (NPU)." This includes AMD products, as well as competing products made by both x86 and non-x86 CPU manufacturers. 2025 represents a turning point for these efforts, both in terms of hardware and software, and this Forrester report is an excellent deep dive into why AI PCs represent the future for enterprise computing.Why 2025 is the Year of the AI PC
First, and arguably most prosaically, 2025 is the year of the AI PC because CPU designers like AMD are baking dedicated AI processing into new products just as Windows 10 is approaching the formal end of support. While consumers and businesses will have the option to buy extra years of security updates, October 14, 2025 is the end of free support and the likely transition date to Windows 11 for millions of companies worldwide that don't want to pay an increasingly large yearly premium for the privilege of running an older operating system. Many of the Windows 11 commercial PCs purchased in the next 12 months are going to support AI via dedicated NPU hardware and will therefore qualify as AI PCs. According to IDC, 93.9% of commercial PCs will be AI PCs by 2028.
The advantage of this distribution model is that customers and early adopters are not being asked to sacrifice the CPU and GPU efficiency improvements they would otherwise expect from new systems in exchange for AI compatibility. The AI PC platforms AMD launched throughout 2024 and into 2025, including the AMD Ryzen AI PRO 300 Series and the Ryzen AI Max PRO Series feature improvements like new CPU cores based on the "Zen 5" microarchitecture, AMD RDNA 3.5 GPU cores, and AMD PRO Technologies to provide security features, manageability tools, and enterprise-grade stability and reliability. These advances arrived alongside the NPU and its dedicated AI processing capability, ensuring AMD customers continue to benefit from new systems in the legacy non-AI workloads they already use.
Second, AI PCs offer a flexible platform that executes emerging AI workloads across the CPU, GPU, and NPU, depending on where it will run best. It's not hard to see the parallels between NPU development and the GPU-powered 3D revolution that kicked off back in the mid-1990s.While specialized 3D graphics initially required the use of discrete graphics accelerator cards, this hardware eventually moved on-die and was integrated directly alongside the CPU. This allowed AMD to dramatically improve the performance and capability of the integrated graphics processor, or iGPU, without driving up cost. Many applications now rely on the iGPU for tasks that previously required a discrete card, and graphics capabilities have become an integral part of a CPU's overall value proposition. Adding an NPU to the CPU and steadily increasing its performance is in line with this innovation. Including an NPU gives developers a dedicated platform for running AI workloads, one that's tightly knitted to the CPU and that can be expected to steadily improve over time.
Today, AI PCs can already offload background tasks that would otherwise require CPU or GPU processing time. More advanced use cases are rapidly proliferating. AMD introduced the first x86 AI PCs in 2023. Two years later, the company's CES 2025 exhibition showcased multiple practical demos. This rapid adoption highlights the NPU's potential efficiency and strength as a workload execution environment compared to the more traditional CPU or GPU.
While CES is largely a consumer-focused show, the broad discussion of AI across both consumer and commercial workloads shows the appeal of integrating artificial intelligence across the entire software market. According to its report, Forrester expects AI and AI PCs to take significant strides towards wide deployment over the next 12 months. This will happen thanks to the Windows 11 refresh cycle, wider AI incorporation across the software industry, and because the increased availability of AI PCs will give companies a cost-effective platform for less intense local AI workloads.
What all of this adds up to is an exciting time for enterprise computing. AI PCs have the potential to revolutionize the industry, and AMD offers a complete lineup for all end-user needs that gives companies extensive options to evaluate how they want to engage with that opportunity.
43 Comments on AMD Reiterates Belief that 2025 is the Year of the AI PC
Their entire naming scheme for everything this year involves cramming assloads of AI into every nook and cranny where it will fit. They bought a whole ass company for their "XDNA" fixed function hardware to drive AI.
They've bet maybe not the whole farm, but everything but the Donkey on AI.
They're not going to turn around and say "Lol, whups! The return on investment just doesn't look like it's there..."
Puerto Rico
AI AI AI AI AI AI
Puerto Rico
(Vaya Con Dios - Puerto Rico)
Some people never do
Funny thing is, no matter how hard people want it, their utopia never happens, and that goes for all sides of every discussion.
That goes for AI... it goes for free speech. It's always going to be imperfect, FUBAR, and yet we'll keep trying to use it.
I guess we just have to follow AMD and 'keep believing' it can be better :)
Offline use and lower latency are more about on-device AI processing rather than cloud processing.
Some of the stuff listed here can be done more efficiently with AI acceleration, especially on a laptop. You really don't want those to be pure GPU/CPU tasks; it's going to drain your battery. Phones have been using AI acceleration for those things long before AI became the buzzword it is now. The NPU has always been about efficiency rather than sheer speed. It also allows you to keep your GPU/CPU compute resources free for other stuff.
@csendesmark You can laugh if you want, but that's absolutely the rationale behind the NPU. Things like live translation, digital animated avatars, eye contact correction or auto framing are not just computed once; they're sustained workloads, which is why phones and laptops are the priority, while the desktop doesn't really need an NPU. Machine learning is also better at handling dynamic input data compared to a regular algorithm. Even cameras like the EOS-1D X Mark III or the Sony A1 II are making use of ML for enhanced subject recognition and autofocus and have on-board ML acceleration.
Which ones tho?
On this:
Image generation is a totally different thing you brought up; for that, an NPU is quite useful, never questioned that!
But this bullet-point list is pure marketing BS... none of it makes sense
- Digital avatars / auto subject framing / eye correction require subject detection, plus a finer detection of the movement of said subject (since digital avatars are not static, but animated according to a person's facial movements). I've read a lot about that subject: computers are really bad at perceiving the world like we do, deep learning improved that a lot, and the computer needs to apply that ML algorithm on the fly (see the sketch after this list).
The Limits of Computer Vision, and of Our Own | Harvard Medicine Magazine
- In the same spirit, posture correction also requires the computer to recognize what's a bad posture and a good posture in the first place, which makes use of computer vision/subject recognition.
- Lighting enhancement is for video calls; it's there to make up for low light. Some implementations specifically brighten the speaker while also trying to compensate for the noise present in low-light situations. So subject detection and image reconstruction are involved.
- Auto translation can include real-time audio translation, which needs real-time speech/language recognition; I don't think they only mean text translation (there's a rough sketch of that pipeline at the end of this post).
- Accessibility features, mostly for people who have issues with their eyesight, so you use subject recognition to describe what's happening on the screen, even when people don't make content with the description tag. Also improving speech-based input, because again, computers are by default bad at recognizing sounds and pictures like we do. To this day I'm still seeing examples of a computer making mistakes in audio transcription.
- Blue light dimming is just a behavior-based automated warm mode to reduce eye strain. Instead of using it at a set time, it does it several times a day based on your behavior. That's the theory; I haven't seen that one implemented yet, but imo that sounds annoying.
- Longer battery life, because there are more and more applications making use of on-device ML algorithms, so the NPU can increase battery life. Improved performance too, because the NPU can keep the CPU/GPU available for other stuff.
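To illustrate the subject-detection piece that auto framing, eye correction, and posture checks all build on, here's a minimal sketch using OpenCV's bundled face detector. It's only an approximation: shipping products use deep-learning models (often running on the NPU), but the detect-then-crop idea is the same, and the webcam index is an assumption.

```python
import cv2

# Classic face detector that ships with OpenCV. Commercial auto-framing uses
# DNN models (often NPU-accelerated); the detect-then-crop principle is the same.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam, assumed to be present
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        # "Auto framing": crop around the largest detected face with some margin,
        # then scale the crop back up to the original frame size.
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        m = w // 2
        crop = frame[max(0, y - m):y + h + m, max(0, x - m):x + w + m]
        frame = cv2.resize(crop, (frame.shape[1], frame.shape[0]))
    cv2.imshow("auto-framed", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```

Running that detection on every frame of a call is exactly the kind of sustained workload an NPU is meant to take off the CPU and GPU.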
I feel there's a misconception that AI is mostly about generating stuff, when it's also extensively used to make computers "see", "hear", and "understand" things that would be a pain to do with classic programming.
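And for the "hear and understand" side, a rough sketch of the speech-to-text-to-translation chain using off-the-shelf Hugging Face pipelines; the model names and the audio file are placeholders for illustration, not what any particular vendor actually ships on the NPU.

```python
from transformers import pipeline

# Speech recognition followed by machine translation: the two stages a
# real-time audio translator has to run continuously.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-tiny")
mt = pipeline("translation_en_to_fr", model="Helsinki-NLP/opus-mt-en-fr")

english_text = asr("meeting_clip.wav")["text"]          # audio -> English text
french_text = mt(english_text)[0]["translation_text"]   # English -> French
print(french_text)
```

A live translator would run both stages on short audio chunks in a loop, which is why it counts as a sustained workload rather than a one-off computation.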