Monday, April 22nd 2024

PC Market Returns to Growth in Q1 2024 with AI PCs to Drive Further 2024 Expansion

Global PC shipments grew around 3% YoY in Q1 2024 after eight consecutive quarters of decline caused by a demand slowdown and inventory correction, according to the latest data from Counterpoint Research. The growth also came off a relatively low base in Q1 2023. The remaining quarters of 2024 are expected to see sequential shipment growth, resulting in 3% YoY growth for the full year, largely driven by AI PC momentum, shipment recovery across different sectors, and a fresh replacement cycle.

Lenovo's PC shipments were up 8% YoY in Q1 2024 off an easy comparison with last year. The brand reclaimed a 24% market share, up from 23% in Q1 2023. HP and Dell, with market shares of 21% and 16% respectively, remained flattish, waiting for North America to drive shipment growth in the coming quarters. Apple's shipment performance was also resilient, with 2% growth mainly supported by the base M3 models.
We view 2024 as the first chapter of the AI PC era (covering both desktops and laptops), as 45% of the new laptops shipped during the year will be AI-capable. Commenting on generative AI laptops, Senior Analyst William Li said, "We believe the shipment and deployment of generative AI laptops will accelerate in 2025-2026, along with emerging generative AI functions and use cases, supported by chip vendors' new processor platforms."

Against the backdrop of normalizing inventory levels and the end of the replacement cycle following skyrocketing demand in both enterprise and consumer segments amid the COVID-19 pandemic, we believe AI laptops could act as a catalyst in driving the overall PC shipment recovery in 2024. Manufacturers are expected to start promoting AI PCs as their main products in the second half of 2024 as semiconductor companies prepare to launch SoCs featuring higher TOPS.
Source: Counterpoint Research
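For context on the TOPS figures these SoC announcements revolve around: peak INT8 throughput is conventionally quoted as two operations (one multiply-accumulate) per MAC unit per clock cycle. The short sketch below works through that arithmetic with purely hypothetical MAC counts and clock speeds, not figures from any vendor's datasheet:

# Back-of-the-envelope TOPS estimate for a hypothetical NPU.
# TOPS is usually quoted as peak INT8 operations per second, counting each
# multiply-accumulate (MAC) as two operations.

def theoretical_tops(mac_units: int, clock_hz: float, ops_per_mac: int = 2) -> float:
    """Peak throughput in trillions of operations per second."""
    return mac_units * clock_hz * ops_per_mac / 1e12

# Illustrative numbers only: 4,096 INT8 MAC units running at 1.25 GHz.
print(f"{theoretical_tops(4096, 1.25e9):.1f} TOPS")  # -> 10.2 TOPS

Real-world inference throughput is lower than this peak figure, since memory bandwidth and operator support matter as much as the raw MAC count.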

17 Comments on PC Market Returns to Growth in Q1 2024 with AI PCs to Drive Further 2024 Expansion

#1
Bwaze
I still have absolutely no idea what those AI functions of "AI Capable" laptops and desktop PCs with integrated GPUs are?
#2
Wirko
Bwaze: I still have absolutely no idea what those AI functions of "AI Capable" laptops and desktop PCs with integrated GPUs are?
They can multiply some 8-bit numbers, then spit out a report like the one you see above.
#3
john_
Only Dell is down.
#4
Bwaze
Wirko: They can multiply some 8-bit numbers, then spit out a report like the one you see above.
:-D

But the majority of AI tools use online servers, not local training, neural networks or whatever we call it...

So we are now promoting how fast computers and laptops are by measuring the "Trillions of Operations Per Second" of their neural accelerators, and then we just ignore all that and use servers, or maybe "accelerate" some trivial task any CPU can handle without problems?
#6
Noyand
Bwaze: :-D

But the majority of AI tools use online servers, not local training, neural networks or whatever we call it...

So we are now promoting how fast computers and laptops are by measuring the "Trillions of Operations Per Second" of their neural accelerators, and then we just ignore all that and use servers, or maybe "accelerate" some trivial task any CPU can handle without problems?
If you like inefficiency, you can keep using the CPU, yes.

"AI PC" are going to do local calculation to stuff that doesn't require that much power to begin with, and with a lower latency than online servers. Stuff like video processing, audio processing for example, where it would be ideal to not have a delay. And they are going to do so while using less power, and keeping your CPU resources intact for other things.
Those PCs will behave like this:
- Need burst performance (like denoising raw pictures)? GPU + NPU, or the CPU in some specific cases.
- Need to do light ML computation over a long period of time? NPU.

The marketing material never said that the NPU was going to do everything every time. The correct hardware will be used for the correct use case.

Cloud processing is more powerful, but not quick.
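As a rough illustration of that "correct hardware for the correct use case" dispatch, here is a minimal sketch (not the poster's or any vendor's official recipe) using ONNX Runtime's execution-provider fallback. The model filename is hypothetical, and which providers are actually available depends on the ONNX Runtime build and the hardware in the machine:

# Minimal sketch: prefer an NPU/GPU execution provider when one is available
# and fall back to the CPU otherwise. The model path is hypothetical.
import onnxruntime as ort

PREFERRED = [
    "QNNExecutionProvider",   # Qualcomm NPU (only in builds/hardware that support it)
    "DmlExecutionProvider",   # DirectML on Windows (GPU, and some NPUs)
    "CUDAExecutionProvider",  # Discrete NVIDIA GPU
    "CPUExecutionProvider",   # Always available as the last resort
]

available = set(ort.get_available_providers())
providers = [p for p in PREFERRED if p in available]

# ONNX Runtime walks the provider list in order and assigns each operator to
# the first provider that supports it, so unsupported ops fall back to the CPU.
session = ort.InferenceSession("denoise_model.onnx", providers=providers)
print("Using providers:", session.get_providers())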
#7
Philaphlous
For the majority of 'muricans, when they hear AI, they probably think ChatGPT...
#8
JohH
Who is buying a PC for a 10-16 TOPS NPU? That's slower inference than most phones.
I am amazed that analysts think that will increase sales.

Weird.
#9
neatfeatguy
watzupken: I actually don't know how they came to this conclusion, "PC Market Returns to Growth in Q1 2024 with AI PCs".

Every computer that ships has some sort of AI marketing/capability nowadays. So is AI the real reason why the PC market is growing?
Yes and no.

Yes for those people that are dumb enough to buy into the "AI" hype.

No for those people that aren't dumb enough to buy into the "AI" hype.

In the end, at least right now, it's all just a gimmick/marketing strategy. "AI" is the hot thing right now.


To say that "AI" computers are driving the higher sales of PCs is, well, I think a lie. The market skyrocketed with all the lockdowns and corporations only saw $$$$. When things cooled and everything declined because the whole massive uptick in sales wasn't sustainable all off a sudden these corporations were in a panic because instead of seeing more record breaking sales they were seeing sales lower than before the lockdowns. It was inevitable. The market had to balance and it looks like things are finally starting to normalize, in terms of sales, not in terms of pricing. Lots of companies are trying to keep that pricing high by market manipulation through inventory and production reductions.
#10
cvaldes
Bwaze: :-D

But the majority of AI tools use online servers, not local training, neural networks or whatever we call it...
That is changing, pretty rapidly.

Part of the reason is that only a handful of PCs sold in 2023 or earlier had the right hardware to do local AI workloads.

The main reason is that AI is still very much in its infancy. There just weren't many AI workloads capable of being offloaded to local silicon in 2022.

Remember that your high-end smartphone has been doing ML tasks for a while. Apple introduced its Neural Engine in 2017 with the iPhone X (seven years ago). Naturally, all of the Apple Silicon M-powered computers have this technology, and Apple has increased Neural Engine core counts periodically. I expect them to keep an eye on what others are doing: Samsung, Qualcomm, Nvidia, AMD, Intel, and others.

And on the iPhone the number of ML workloads has increased over the years as well. AI isn't just one workload forever and ever that can be addressed with X number of AI cores.
#11
Bwaze
But aren't we just renaming all the tasks that could be done, and were done, with CUDA, tensor cores on GPUs, etc. into "AI"? The above-mentioned audio and video processing, which of course Nvidia already offers, are an example.

I mean, I'm all for greater utilization of hardware that has mostly been underutilized on PCs, just because a lot of systems sold lacked any such capabilities and programmers didn't want to implement hardware-specific functions... But I don't think renaming and false advertising is the right way in the longer run.
#12
cvaldes
Bwaze: But aren't we just renaming all the tasks that could be done, and were done, with CUDA, tensor cores on GPUs, etc. into "AI"? The above-mentioned audio and video processing, which of course Nvidia already offers, are an example.

I mean, I'm all for greater utilization of hardware that has mostly been underutilized on PCs, just because a lot of systems sold lacked any such capabilities and programmers didn't want to implement hardware-specific functions... But I don't think renaming and false advertising is the right way in the longer run.
At this point, yes, some people are probably just relabeling old workloads that were handled by existing differentiated silicon (RT cores, Tensor cores, etc.) as "AI". Until the world collectively decides what AI means, the term will continue to be a catchall for pretty much any non-traditional workload.

I personally favor the term "machine learning" (ML) over "artificial intelligence" (AI) since the latter is rarely accurate.

But realistically we can't change how people are going to write about this stuff right now.

My personal guess is that at some point, there will be another ML workload that will be best handled by yet another differentiated type of silicon that doesn't exist today (in widespread consumer use) and we'll have yet another awkward period where commonly mentioned terms will be a bit muddled just as they are today with "AI".

But that's the nature of rapidly moving situations. Remember the early days of the pandemic when it was called "novel coronavirus", before scientists designated it as SARS-CoV-2 and started calling the disease COVID-19? Similar sort of thing.

We had CPUs first, then GPUs and now NPUs. Will there be another type in the future? I wouldn't bet against it.

The terminology and its usage will eventually work itself out over time.

If marketing people want to hype up new products using the "AI" buzzword, that's sort of their job. They need to keep their products competitive in a larger market.

But the fact of the matter is that AI isn't going away. Whether or not we use the term five years from now to describe the workloads we are assigning to the term today is questionable but people will be using the term in the future.
#13
Noyand
JohH: Who is buying a PC for a 10-16 TOPS NPU? That's slower inference than most phones.
I am amazed that analysts think that will increase sales.

Weird.
Analysts are probably thinking that Copilot becoming a ChatGPT at home that can also put data into Excel for you will be appealing to people who want to work faster. It seems that it will be able to automate a lot of tasks.
Bwaze: But aren't we just renaming all the tasks that could be done, and were done, with CUDA, tensor cores on GPUs, etc. into "AI"? The above-mentioned audio and video processing, which of course Nvidia already offers, are an example.

I mean, I'm all for greater utilization of hardware that has mostly been underutilized on PCs, just because a lot of systems sold lacked any such capabilities and programmers didn't want to implement hardware-specific functions... But I don't think renaming and false advertising is the right way in the longer run.
Back then, those tasks already fell into the "ML/AI" category, but the term A.I. was still vague for the average consumer. Thing is, there are a LOT of different opinions about what should be called "A.I.". For some, the term should be reserved for an actual general-purpose artificial intelligence; for others, any function that used to be associated with human processing (like real-time recognition of a subject) can be called A.I.
#14
cvaldes
As Noyand says, there's a lot of disagreement right now about what "A.I." stands for. I personally don't like using AI to describe LLMs, but most tech journalists will do so. And so will most marketing people. Hell, there are a lot of gamers now using "AI" to describe pre-programmed NPC behavior. There's no AI involved, it's all scripted, but that's one way gamers are using the term.

Obviously different people are using the term differently. That's a big cause of the confusion right now.
#15
Bwaze
But people are now also using the term AI for tasks completely unrelated to machine learning, just because they utilize "neural network" acceleration - but it's usually just the same as acceleration using GPU hardware akin to CUDA, tensor cores, etc. So nowadays everything that can be sped up by your graphics card cores (integrated or discrete) is suddenly AI, and therefore completely new!
#16
cvaldes
Bwaze: But people are now also using the term AI for tasks completely unrelated to machine learning, just because they utilize "neural network" acceleration - but it's usually just the same as acceleration using GPU hardware akin to CUDA, tensor cores, etc. So nowadays everything that can be sped up by your graphics card cores (integrated or discrete) is suddenly AI, and therefore completely new!
I don't like it either, but that's how the term is being bandied about right now. However, this is not the first time this has happened; misused terminology often appears when a new technology or situation emerges into public view. And not just PC terminology.

At some point this will work itself out after the general public gets a better grasp on the scope and breadth of artificial intelligence.

Hell, we still have people here at TPU who don't quite grasp that ray tracing is a whole family of algorithms tackling various visual procedures and that everything in computer graphics is FAKE.

At some point more people will be more discerning about when they use the term "A.I." to describe something. But I can't predict when that will be.

It will likely happen when people have two devices side by side and see stuff the newer device can do versus the older device. It could be two systems at home. Or it might be a work device and a personal device. Or it might be your device and one belonging to a friend, family member, colleague, whatever.

We saw this when Apple introduced the Retina Display. A *LOT* of people didn't get it, even if you explained that people using logographic writing systems (China, Japan, Korea, etc.) benefitted from HiDPI displays. Only once they saw these high-density displays side by side with their own eyes did they start to get it. But some still don't. Some people have poor observation skills.

After I spent a couple of years enjoying Retina Displays on iDevices, I realized that my eyes would really enjoy HiDPI on my computer display. So even though my 27" 4K monitor runs at 2x scaling, it looks better and text is easier to read. It's much less about having more detail in photos, videos, or games.
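For anyone curious about the arithmetic behind that setup, a quick sketch follows; the 27" 4K panel is the one mentioned above, and everything else is just geometry:

import math

# Pixel density of a 27-inch 3840x2160 ("4K") panel.
width_px, height_px, diagonal_in = 3840, 2160, 27
ppi = math.hypot(width_px, height_px) / diagonal_in
print(f"{ppi:.0f} PPI")  # -> 163 PPI, well above a 24" 1080p panel's ~92 PPI

# At 2x (HiDPI) scaling the workspace is 1920x1080 "points", but each UI
# element is drawn with four physical pixels per point, hence the sharper text.
print(f"Effective workspace: {width_px // 2}x{height_px // 2}")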
#17
Mysteoa
Bwaze: I still have absolutely no idea what those AI functions of "AI Capable" laptops and desktop PCs with integrated GPUs are?
Well, at this point, mostly nothing, just promises. The best example I can give of how useful it can be is upscaling videos and movies.