
HBM Supply Leader SK Hynix's Market Share to Exceed 50% in 2023 Due to Demand for AI Servers

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
17,841 (2.42/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
Strong growth in AI server shipments has driven demand for high-bandwidth memory (HBM). TrendForce reports that the top three HBM suppliers in 2022 were SK hynix, Samsung, and Micron, with 50%, 40%, and 10% market share, respectively. Furthermore, the specifications of high-end AI GPUs designed for deep learning have accelerated HBM product iteration. To prepare for the launch of the NVIDIA H100 and AMD MI300 in 2H23, all three major suppliers are planning for mass production of HBM3 products. At present, SK hynix is the only supplier mass producing HBM3, and as a result it is projected to increase its market share to 53% as more customers adopt HBM3. Samsung and Micron are expected to start mass production towards the end of this year or in early 2024, with HBM market shares of 38% and 9%, respectively.

AI server shipment volume expected to increase by 15.4% in 2023
NVIDIA's DL/ML AI servers are equipped with an average of four or eight high-end graphics cards and two mainstream x86 server CPUs. These servers are primarily used by top US cloud service providers such as Google, AWS, Meta, and Microsoft. TrendForce analysis indicates that the shipment volume of servers with high-end GPGPUs increased by around 9% in 2022, with approximately 80% of these shipments concentrated among eight major cloud service providers in China and the US. Looking ahead to 2023, Microsoft, Meta, Baidu, and ByteDance will launch generative AI products and services, further boosting AI server shipments. It is estimated that the shipment volume of AI servers will increase by 15.4% this year, and a 12.2% CAGR for AI server shipments is projected from 2023 to 2027.
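As a quick illustration of how that 12.2% CAGR compounds, here is a minimal sketch in Python. The absolute 2023 unit count is not given above, so the baseline is simply normalized to 1x the 2023 volume.

```python
# Project AI server shipments from 2023 to 2027 at a 12.2% CAGR.
# The 2023 baseline is normalized to 1.0 because the report gives growth
# rates, not absolute unit counts.
CAGR = 0.122
BASE_2023 = 1.0  # 1x the 2023 shipment volume (assumption for illustration)

for year in range(2023, 2028):
    volume = BASE_2023 * (1 + CAGR) ** (year - 2023)
    print(f"{year}: {volume:.2f}x of 2023 shipments")

# 2027 works out to (1.122)**4, roughly 1.58x the 2023 volume.
```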




AI servers stimulate a simultaneous increase in demand for server DRAM, SSD, and HBM
TrendForce points out that the rise of AI servers is likely to increase demand for memory. While general servers carry 500-600 GB of server DRAM, AI servers require significantly more, averaging 1.2-1.7 TB with 64-128 GB per module. For enterprise SSDs, the high-speed requirements of AI servers mean priority goes to DRAM or HBM, and there has yet to be a noticeable push to expand SSD capacity; in terms of interface, however, PCIe 5.0 is favored for high-speed computing needs. Additionally, AI servers tend to use GPGPUs, and with NVIDIA A100 80 GB cards in configurations of four or eight, HBM usage per server works out to around 320-640 GB. As AI models grow increasingly complex, demand for server DRAM, SSDs, and HBM will grow simultaneously.
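To make the per-server memory arithmetic concrete, here is a small sketch. The GPU counts and the 80 GB HBM capacity come from the figures above; the DIMM combinations are hypothetical examples that land inside the quoted 1.2-1.7 TB DRAM range.

```python
# HBM per AI server: 4 or 8 NVIDIA A100 80 GB cards.
HBM_PER_A100_GB = 80
for gpus in (4, 8):
    print(f"{gpus} x A100 80 GB -> {gpus * HBM_PER_A100_GB} GB HBM")
# Prints 320 GB and 640 GB, matching the 320-640 GB range above.

# Server DRAM: 64-128 GB modules adding up to roughly 1.2-1.7 TB per server.
# The module counts below are made-up examples, not a stated configuration.
for module_gb, modules in ((64, 20), (128, 13)):
    total_tb = module_gb * modules / 1024
    print(f"{modules} x {module_gb} GB DIMMs -> {total_tb:.2f} TB DRAM")
```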



View at TechPowerUp Main Site | Source
 
Joined
Jan 18, 2020
Messages
851 (0.47/day)
How is "AI" (Machine learning) being monetized ?

Chat bots are free ? Or they'll replace customer service and other roles that are script based is large companies anyway?

Dotcom 3 ?
 
Joined
Oct 15, 2004
Messages
189 (0.03/day)
Location
Peterborough, UK
System Name IONE
Processor AMD Ryzen 9 5900X
Motherboard ASUS STRIX B550-A Gaming
Cooling Noctua NH-U12S SE-AM4
Memory 128GB (4x32GB) Corsair DDR4 Vengeance LPX Black, PC4-25600 (3200), CMK128GX4M4E3200C16
Video Card(s) PNY GeForce RTX 3080 12GB
Storage Samsung 980 1TB NVMe (system), Lexar NM790 4TB NVMe (temp), 16x Seagate IronWolf 10TB RAID6
Display(s) Dell UP3017
Case Lian-Li PC-777B
Audio Device(s) Focal Alpha 65 Evo
Power Supply Corsair AX1200
Mouse Logitech M510
Keyboard Keychron Q10, brass plate, Kailh Box Summer switches and PBT Cherry keycaps
Software Xubuntu 24.04
Benchmark Scores N/A
How is "AI" (Machine learning) being monetized ?

Chat bots are free ? Or they'll replace customer service and other roles that are script based is large companies anyway?

Dotcom 3 ?
Machine learning is popular to profile users for better targeted advertising for example.

Then on top of that we have the pure internal use cases, where companies can use machine learning to optimise and replace parts of their workforce. It is only a matter of time before call centres are gone, for example.
 
Joined
Apr 12, 2013
Messages
7,570 (1.77/day)
It's used everywhere including the always greedy AF "investment" banks!
 
Joined
Jan 18, 2020
Messages
851 (0.47/day)
Used for what?

Say I invest $10bn in fancy "AI" hardware, what's it going to do to make me more money than I could make with pre-existing technology at 1/1000th of the cost?
 
Joined
Mar 18, 2023
Messages
948 (1.44/day)
System Name Never trust a socket with less than 2000 pins
Used for what?

Say I invest $10bn in fancy "AI" hardware, what's it going to do to make me more money than I could make with pre-existing technology at 1/1000th of the cost?

If you are Google or some such, can you risk not doing lots of AI research and letting the competitors get ahead?

Doesn't have to make money yet.
 
Joined
Nov 6, 2016
Messages
1,784 (0.60/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
Hey, why not, right? We're apparently going through another gilded age of capitalism with severe concentration, de facto monopolies, and cartelism. There's only a handful of NAND/memory manufacturers and they already manipulate the market, so why not include HBM in that, right?
 
Joined
Jun 1, 2021
Messages
310 (0.24/day)
How is "AI" (Machine learning) being monetized ?

Chat bots are free ? Or they'll replace customer service and other roles that are script based is large companies anyway?

Dotcom 3 ?

Well, it depends a lot on what the specific 'AI' even is. It's really a bunch of techniques that can be used for a lot of purposes; the thing about it is that it's an 'easy' solution, especially when you don't have much mathematical knowledge of the specific problem.

One example of what it can be used for is replacing PID controllers in industrial settings: instead of having humans manually adjust the PID parameters, the 'AI' is conditioned (trained) to match the output that the factory operators want (a rough sketch of the idea follows below).

So basically the 'AI' is just a mathematical model that has its parameters adjusted to match a desired data distribution. But that alone has a shit ton of applications, just stuff that you won't see at all because it's not a front-facing application.
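For anyone curious, here is a minimal sketch of that idea, not tied to any real plant: a toy first-order process controlled by a PID loop, with the gains found by a crude random search instead of hand-tuning. The plant model, gain ranges, and cost function are all made up for illustration; real installations would use a proper system model and optimizer.

```python
import random

def tracking_cost(kp, ki, kd, setpoint=1.0, steps=200, dt=0.05):
    """Run a PID loop against a toy first-order plant; return the squared tracking error."""
    y = integral = prev_err = total = 0.0
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt
        derivative = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * derivative  # PID control signal
        y += dt * (-y + u)                              # toy plant: dy/dt = -y + u
        prev_err = err
        total += err * err
        if abs(err) > 1e6:                              # bail out on unstable gains
            return float("inf")
    return total

# Crude random search standing in for the "training": keep adjusting the
# parameters until the output matches what the operator wants.
best_cost, best_gains = float("inf"), None
for _ in range(2000):
    gains = (random.uniform(0, 5), random.uniform(0, 5), random.uniform(0, 1))
    cost = tracking_cost(*gains)
    if cost < best_cost:
        best_cost, best_gains = cost, gains

print("best (kp, ki, kd):", best_gains, "cost:", best_cost)
```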
 
Joined
Mar 2, 2021
Messages
51 (0.04/day)
How is "AI" (Machine learning) being monetized ?

Chat bots are free ? Or they'll replace customer service and other roles that are script based is large companies anyway?

Dotcom 3 ?

It's token-based for GPT, at least. The chatbots are basically just the trial version / proof of concept. The business model is to sell access to businesses that think they can make a cool product using GPT in "something".


List prices:
Model        | Prompt            | Completion
8K context   | $0.03 / 1K tokens | $0.06 / 1K tokens
32K context  | $0.06 / 1K tokens | $0.12 / 1K tokens
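To put those rates in perspective, here is a rough back-of-the-envelope cost for a single call at the 8K-context tier; the request sizes and call volume are made-up numbers, not anything from OpenAI.

```python
# List prices for the 8K-context tier quoted above.
PROMPT_PRICE_PER_1K = 0.03      # USD per 1,000 prompt tokens
COMPLETION_PRICE_PER_1K = 0.06  # USD per 1,000 completion tokens

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Cost in USD of one API call."""
    return (prompt_tokens / 1000) * PROMPT_PRICE_PER_1K \
         + (completion_tokens / 1000) * COMPLETION_PRICE_PER_1K

# Hypothetical chat turn: 1,500 prompt tokens in, 500 completion tokens back.
per_call = request_cost(1500, 500)
print(f"${per_call:.4f} per call")                        # $0.0750
print(f"${per_call * 1_000_000:,.0f} per million calls")  # $75,000
```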
 