
PC Market Returns to Growth in Q1 2024 with AI PCs to Drive Further 2024 Expansion

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
2,651 (0.99/day)
Global PC shipments grew around 3% YoY in Q1 2024 after eight consecutive quarters of declines due to demand slowdown and inventory correction, according to the latest data from Counterpoint Research. The shipment growth in Q1 2024 came on a relatively low base in Q1 2023. The coming quarters of 2024 will see sequential shipment growth, resulting in 3% YoY growth for the full year, largely driven by AI PC momentum, shipment recovery across different sectors, and a fresh replacement cycle.

Lenovo's PC shipments were up 8% in Q1 2024 off an easy comparison from last year. The brand managed to reclaim its 24% share in the market, compared to 23% in Q1 2023. HP and Dell, with market shares of 21% and 16% respectively, remained flattish, waiting for North America to drive shipment growth in the coming quarters. Apple's shipment performance was also resilient, with the 2% growth mainly supported by M3 base models.




We view 2024 as the first chapter of the AI PC era (including desktops and laptops), as 45% of the new laptops shipped during the year will be AI-capable. Commenting on generative AI laptops, Senior Analyst William Li said, "We believe the shipment and deployment of generative AI laptops will accelerate in 2025-2026, along with emerging generative AI functions and use cases, supported by chip vendors' new processor platforms."

Against the backdrop of normalizing inventory levels and the end of the replacement cycle following skyrocketing demand in both enterprise and consumer segments amid the COVID-19 pandemic, we believe AI laptops could act as a catalyst in driving the overall PC shipment recovery in 2024. Manufacturers are expected to start promoting AI PCs as their main products in the second half of 2024 as semiconductor companies prepare to launch SoCs featuring higher TOPS.

View at TechPowerUp Main Site | Source
 
Joined
Jan 3, 2021
Messages
3,605 (2.49/day)
Location
Slovenia
Processor i5-6600K
Motherboard Asus Z170A
Cooling some cheap Cooler Master Hyper 103 or similar
Memory 16GB DDR4-2400
Video Card(s) IGP
Storage Samsung 850 EVO 250GB
Display(s) 2x Oldell 24" 1920x1200
Case Bitfenix Nova white windowless non-mesh
Audio Device(s) E-mu 1212m PCI
Power Supply Seasonic G-360
Mouse Logitech Marble trackball, never had a mouse
Keyboard Key Tronic KT2000, no Win key because 1994
Software Oldwin
I still have absolutely no idea what the AI functions of "AI-capable" laptops and desktop PCs with integrated GPUs actually are.
They can multiply some 8-bit numbers, then spit out a report like the one you see above.
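For anyone curious what that actually amounts to: the bulk of the work an NPU does really is low-precision matrix multiplication. A rough numpy sketch of an int8 multiply with int32 accumulation, with made-up shapes and scale, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Quantized activations and weights: plain 8-bit integers.
activations = rng.integers(-128, 128, size=(4, 8), dtype=np.int8)
weights = rng.integers(-128, 128, size=(8, 3), dtype=np.int8)

# The core NPU operation: multiply int8 values and accumulate in int32
# so the running sums don't overflow.
acc = activations.astype(np.int32) @ weights.astype(np.int32)

# Rescale back to real-valued outputs (the 0.02 scale is invented here).
outputs = acc * 0.02
print(outputs)
```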
 
Joined
Sep 6, 2013
Messages
3,391 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Noctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Only inDell is down.
 
Joined
May 11, 2018
Messages
1,292 (0.53/day)
They can multiply some 8-bit numbers, then spit out a report like the one you see above.
:-D

But the majority of AI tools use online servers, not local training, neural networks or whatever we call it...

So we are now promoting how fast computers and laptops are by measuring "Trillions of Operations Per Second" in their neural accelerators, and then we just ignore all that and use servers, or maybe "accelerate" some trivial task any CPU can handle without problems?
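For what it's worth, the TOPS figure on the box is typically just peak arithmetic: multiply-accumulate units times two operations per MAC times clock speed. A back-of-the-envelope sketch with invented numbers (not any real NPU's specs):

```python
# Peak TOPS estimate from made-up figures; real NPUs publish their own numbers.
mac_units = 4096      # parallel multiply-accumulate units (assumed)
ops_per_mac = 2       # one multiply plus one add per MAC
clock_hz = 1.5e9      # 1.5 GHz clock (assumed)

tops = mac_units * ops_per_mac * clock_hz / 1e12
print(f"{tops:.1f} TOPS")  # ~12.3 TOPS with these assumptions
```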
 
Joined
Nov 8, 2017
Messages
229 (0.09/day)
:-D

But the majority of AI tools use online servers, not local training, neural networks or whatever we call it...

So we are now promoting how fast computers and laptops are by measuring "Trillions of Operations Per Second" in their neural accelerators, and then we just ignore all that and use servers, or maybe "accelerate" some trivial task any CPU can handle without problems?
If you like inefficiency, you can keep using the CPU, yes.

"AI PC" are going to do local calculation to stuff that doesn't require that much power to begin with, and with a lower latency than online servers. Stuff like video processing, audio processing for example, where it would be ideal to not have a delay. And they are going to do so while using less power, and keeping your CPU resources intact for other things.
Those PCs will behave like this:
need burst performance? (like denoising of raw pictures) GPU+NPU, or CPU in some specific case
need to do light ML calculation over a long period of time ? NPU


The marketing material never said that the NPU was going to do everything every time. The correct hardware will be used for the correct use case.

Cloud processing is more powerful, but not quick.
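A toy sketch of what that kind of dispatch could look like in application code. The function and policy below are hypothetical, just to illustrate routing burst work to GPU+NPU and long-running light ML to the NPU:

```python
from enum import Enum, auto

class Device(Enum):
    CPU = auto()
    GPU_PLUS_NPU = auto()
    NPU = auto()

def pick_device(burst: bool, sustained_light_ml: bool, npu_available: bool) -> Device:
    """Toy policy: burst work (e.g. RAW photo denoising) goes to GPU+NPU,
    long-running light ML (e.g. background blur on a call) stays on the NPU,
    and everything else falls back to the CPU."""
    if not npu_available:
        return Device.CPU
    if burst:
        return Device.GPU_PLUS_NPU
    if sustained_light_ml:
        return Device.NPU
    return Device.CPU

print(pick_device(burst=True, sustained_light_ml=False, npu_available=True))    # GPU_PLUS_NPU
print(pick_device(burst=False, sustained_light_ml=True, npu_available=True))    # NPU
print(pick_device(burst=False, sustained_light_ml=False, npu_available=False))  # CPU
```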
 
Joined
Aug 13, 2020
Messages
191 (0.12/day)
For the majority of muricans, when they hear "AI", they probably think ChatGPT...
 
Joined
Feb 10, 2023
Messages
282 (0.41/day)
Location
Lake Superior
Who is buying a PC for a 10-16 TOPS NPU? That's slower inference than most phones.
I am amazed that analysts think that will increase sales.

Weird.
 
Joined
May 18, 2009
Messages
2,986 (0.52/day)
Location
MN
System Name Personal / HTPC
Processor Ryzen 5900x / Ryzen 5600X3D
Motherboard Asrock x570 Phantom Gaming 4 /ASRock B550 Phantom Gaming
Cooling Corsair H100i / bequiet! Pure Rock Slim 2
Memory 32GB DDR4 3200 / 16GB DDR4 3200
Video Card(s) EVGA XC3 Ultra RTX 3080Ti / EVGA RTX 3060 XC
Storage 500GB Pro 970, 250 GB SSD, 1TB & 500GB Western Digital / lots
Display(s) Dell - S3220DGF & S3222DGM 32"
Case CoolerMaster HAF XB Evo / CM HAF XB Evo
Audio Device(s) Logitech G35 headset
Power Supply 850W SeaSonic X Series / 750W SeaSonic X Series
Mouse Logitech G502
Keyboard Black Microsoft Natural Elite Keyboard
Software Windows 10 Pro 64 / Windows 10 Pro 64
I actually don't know how they came to this conclusion, "PC Market Returns to Growth in Q1 2024 with AI PCs".

Every computer that goes out has some sort of AI marketing/capability nowadays. So is AI the real reason the PC market is growing?
Yes and no.

Yes for those people that are dumb enough to buy into the "AI" hype.

No for those people that aren't dumb enough to buy into the "AI" hype.

In the end, at least right now, it's all just a gimmick/marketing strategy. "AI" is the hot thing right now.


To say that "AI" computers are driving the higher sales of PCs is, well, I think a lie. The market skyrocketed with all the lockdowns and corporations only saw $$$$. When things cooled and everything declined because the whole massive uptick in sales wasn't sustainable all off a sudden these corporations were in a panic because instead of seeing more record breaking sales they were seeing sales lower than before the lockdowns. It was inevitable. The market had to balance and it looks like things are finally starting to normalize, in terms of sales, not in terms of pricing. Lots of companies are trying to keep that pricing high by market manipulation through inventory and production reductions.
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
:-D

But the majority of AI tools use online servers, not local training, neural networks or whatever we call it...
That is changing, pretty rapidly.

Part of the reason is that only a handful of the PCs sold in 2023 or earlier had the right hardware to do local AI workloads.

The main reason is that AI is still very much in its infancy. There just weren't many AI workloads capable of being offloaded to local silicon in 2022.

Remember that your high-end smartphone has been doing ML tasks for a while. Apple introduced their Neural Engine in 2017 with the iPhone X (seven years ago). Naturally all of the Apple Silicon M-powered computers have this technology, and Apple has increased Neural Engine core counts periodically. I expect them to keep an eye on what others are doing: Samsung, Qualcomm, Nvidia, AMD, Intel, and others.

And on the iPhone the number of ML workloads has increased over the years as well. AI isn't just one workload forever and ever that can be addressed with X number of AI cores.
 
Joined
May 11, 2018
Messages
1,292 (0.53/day)
But aren't we just renaming all the tasks that could be done, and were done, with CUDA, tensor cores on GPUs, etc. into "AI"? The above-mentioned audio and video processing, which Nvidia of course already offers, are an example.

I mean, I'm all for greater utilization of hardware that has mostly been underutilized on PCs, just because a lot of the systems sold lacked any such capabilities and programmers didn't want to implement hardware-specific functions... But I don't think renaming and false advertising is the right way in the longer run.
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
But aren't we just renaming all the tasks that could be done, and were done, with CUDA, tensor cores on GPUs, etc. into "AI"? The above-mentioned audio and video processing, which Nvidia of course already offers, are an example.

I mean, I'm all for greater utilization of hardware that has mostly been underutilized on PCs, just because a lot of the systems sold lacked any such capabilities and programmers didn't want to implement hardware-specific functions... But I don't think renaming and false advertising is the right way in the longer run.
At this point, yes, some people are probably just relabeling old workloads that were handled by existing differentiated silicon (RT cores, Tensor cores, etc.) into AI. Until the world collectively decides what AI means, the term will continue to be a catchall for pretty much any non-traditional workload.

I personally favor the term "machine learning" (ML) over "artificial intelligence" (AI) since the latter is rarely accurate.

But realistically we can't change how people are going to write about this stuff right now.

My personal guess is that at some point, there will be another ML workload that will be best handled by yet another differentiated type of silicon that doesn't exist today (in widespread consumer use) and we'll have yet another awkward period where commonly mentioned terms will be a bit muddled just as they are today with "AI".

But that's the nature of rapidly moving situations. Remember the early days of the pandemic, when it was called the "novel coronavirus", before scientists designated the virus SARS-CoV-2 and started calling the disease COVID-19? Similar sort of thing.

We had CPUs first, then GPUs and now NPUs. Will there be another type in the future? I wouldn't bet against it.

The terminology and its usage will eventually work itself out over time.

If marketing people want to hype up new products using the "AI" buzzword, that's sort of their job. They need to keep their products competitive in a larger market.

But the fact of the matter is that AI isn't going away. Whether or not we use the term five years from now to describe the workloads we are assigning to the term today is questionable but people will be using the term in the future.
 
Joined
Nov 8, 2017
Messages
229 (0.09/day)
Who is buying a PC for a 10-16 TOPS NPU? That's slower inference than most phones.
I am amazed that analysts think that will increase sales.

Weird.
Analysts are probably thinking that Copilot becoming a ChatGPT at home that can also put data into Excel for you will be appealing to people who want to work faster. It seems it will be able to automate a lot of tasks.
But aren't we just renaming all the tasks that could be done, and were done, with CUDA, tensor cores on GPUs, etc. into "AI"? The above-mentioned audio and video processing, which Nvidia of course already offers, are an example.

I mean, I'm all for greater utilization of hardware that has mostly been underutilized on PCs, just because a lot of the systems sold lacked any such capabilities and programmers didn't want to implement hardware-specific functions... But I don't think renaming and false advertising is the right way in the longer run.
Back then, those tasks already fell into the "ML/AI" category, but the term A.I. was still vague for the average consumer. Thing is, there's a LOT of different opinions about what should be called "A.I.". For some, the term should be reserved for an actual general-purpose artificial intelligence; for others, any function that used to be associated with human processing (like real-time recognition of a subject) can be called A.I.
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
As Noyand says, there's a lot of disagreement right now about what "A.I." stands for. I personally don't like using AI to describe LLMs, but most tech journalists will do so. And so will most marketing people. Hell, there are a lot of gamers now using "AI" to describe pre-programmed NPC behavior. There's no AI involved, it's all scripted, but that's one way gamers are using the term.

Obviously different people are using the term differently. That's a big cause of the confusion right now.
 
Joined
May 11, 2018
Messages
1,292 (0.53/day)
But people are now using the term AI even for tasks completely unrelated to machine learning, just because they utilize "neural network" acceleration, which is usually just the same as acceleration on GPU hardware akin to CUDA, tensor cores, etc. So nowadays everything that can be sped up by your graphics cores (integrated or discrete) is suddenly AI, and therefore completely new!
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
But people are now using the term AI even for tasks completely unrelated to machine learning, just because they utilize "neural network" acceleration, which is usually just the same as acceleration on GPU hardware akin to CUDA, tensor cores, etc. So nowadays everything that can be sped up by your graphics cores (integrated or discrete) is suddenly AI, and therefore completely new!
I don't like it either, but that's how the term is being bandied about right now. However, this is not the first time this has happened; misused terminology is common when a new technology or situation reaches the general public. And not just PC terminology.

At some point this will work itself out after the general public gets a better grasp on the scope and breadth of artificial intelligence.

Hell, we still have people here at TPU who don't quite grasp that ray tracing is a whole family of algorithms tackling various visual procedures and that everything in computer graphics is FAKE.

At some point more people will be more discerning about when they use the term "A.I." to describe something. But I can't predict when that will be.

It will likely happen when people have two devices side by side and see stuff the newer device can do versus the older device. It could be two systems at home. Or it might be a work device and a personal device. Or it might be your device and one belonging to a friend, family member, colleague, whatever.

We saw this when Apple introduced the Retina Display. A *LOT* of people didn't get it, even if you explained that people using logographic writing systems (China, Japan, Korea, etc.) benefited from HiDPI displays. Only once they saw these high-density displays side by side with their own eyes did they start to get it. But some still don't. Some people have poor observation skills.

After spending a couple of years enjoying Retina Displays on iDevices, I realized that my eyes would really enjoy HiDPI on my computer display too. So even though my 27" 4K monitor runs at 2x scaling, everything looks better and text is easier to read. It's much less about having more detail in photos, videos, or games.
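If you want the numbers behind that, pixel density is just the diagonal resolution over the diagonal size; a quick sketch comparing a 27-inch 4K panel with a 27-inch 1440p one:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f"27-inch 4K:    {ppi(3840, 2160, 27):.0f} PPI")  # ~163 PPI
print(f"27-inch 1440p: {ppi(2560, 1440, 27):.0f} PPI")  # ~109 PPI
```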
 
Joined
Aug 23, 2013
Messages
471 (0.11/day)
I still have absolutely no idea what the AI functions of "AI-capable" laptops and desktop PCs with integrated GPUs actually are.

Well, at this point, mostly nothing, just promises. The best example I can give of how useful it can be is upscaling videos and movies.
 