Saturday, June 17th 2023
Insider Info Alleges SK hynix Preparing HBM3E Samples for NVIDIA
Industry insiders in South Korea have informed news publications that NVIDIA has requested that SK hynix submit samples of next-generation high bandwidth memory (HBM) for evaluation purposes—according to Business Korea's article, workers were preparing an initial batch of HBM3E prototypes for shipment this week. SK hynix has an existing relationship with NVIDIA—it fended off tough competition last year and has since produced (current gen) HBM3 DRAM for the H100 "Hopper" Tensor Core GPU.
The memory manufacturer is hoping to maintain its position as HBM market leader with fifth-generation products in the pipeline. Vice president Park Myung-soo revealed back in April: "we are preparing 8 Gbps HBM3E product samples for the second half of this year and are preparing for mass production in the first half of next year." A new partnership with NVIDIA could help SK hynix widen the gulf between it and its nearest competitor, Samsung, in the field of HBM production.
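For context, HBM bandwidth scales directly with per-pin speed, since every generation to date has kept the 1024-bit interface per stack. A minimal back-of-the-envelope sketch in Python (assuming the quoted 8 Gbps figure is a per-pin data rate, and using 6.4 Gbps HBM3 and 3.2 Gbps HBM2E as reference points) shows why the jump matters:

# Rough per-stack bandwidth arithmetic, assuming the quoted speeds are
# per-pin data rates across HBM's standard 1024-bit stack interface.
def stack_bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int = 1024) -> float:
    """Peak bandwidth of a single HBM stack in GB/s."""
    return pin_rate_gbps * bus_width_bits / 8  # bits per second -> bytes per second

for name, rate in [("HBM2E", 3.2), ("HBM3", 6.4), ("HBM3E sample", 8.0)]:
    print(f"{name}: {stack_bandwidth_gbs(rate):.0f} GB/s per stack")
# HBM2E: 410 GB/s, HBM3: 819 GB/s, HBM3E sample: 1024 GB/s

In other words, an 8 Gbps per-pin rate would put a single HBM3E stack at roughly 1 TB/s of peak bandwidth, versus about 819 GB/s for baseline HBM3.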
Sources:
Business Korea, Digitimes Asia
15 Comments on Insider Info Alleges SK hynix Preparing HBM3E Samples for NVIDIA
If I'm not mistaken, there was a Samsung slide showing plans to reach that speed only in 2025-2026...
8 GT/s vs. 3.2 GT/s transfer rate.
www.tomshardware.com/news/sk-hynix-preps-hbm3e-memory
SK Hynix has the fastest HBM on the market, and Samsung is already at 5 GB/s, so they should be higher.
Wccftech's comparison should be between the first (worst) version of HBM3.
"Samsung & SK Hynix GPU DRAM Prices Shoot Up: HBM3 5x More Expensive As Demand Grows For NVIDIA GPUs In ChatGPT"
wccftech.com/samsung-sk-hynix-gpu-dram-prices-shoot-up-hbm3-5x-more-expensive-as-demand-grows-for-nvidia-gpus-in-chatgpt/
'nuff said :D
These are simply for the £200k+ 8-GPU servers used for AI workloads.