
SK Hynix Begins Mass-production of 12-layer HBM3E Memory

btarunr

Editor & Senior Moderator
SK hynix Inc. announced today that it has begun mass production of the world's first 12-layer HBM3E product with 36 GB of capacity, the largest of any HBM to date. The company plans to supply mass-produced units to customers within the year, demonstrating its technology leadership once again just six months after it became the first in the industry to deliver the 8-layer HBM3E product to customers in March this year.

SK hynix is the only company in the world that has developed and supplied the entire HBM lineup from the first generation (HBM1) to the fifth generation (HBM3E), since releasing the world's first HBM in 2013. The company plans to continue its leadership in the AI memory market, addressing the growing needs of AI companies by being the first in the industry to mass-produce the 12-layer HBM3E.



According to the company, the 12-layer HBM3E product meets the world's highest standards in all areas essential for AI memory, including speed, capacity, and stability. SK hynix has increased the speed of memory operations to 9.6 Gbps, the highest memory speed available today. If Llama 3 70B, a Large Language Model (LLM), is driven by a single GPU equipped with four HBM3E stacks, the GPU can read all 70 billion parameters 35 times per second.
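The "35 reads per second" figure follows from straightforward bandwidth arithmetic. A minimal sketch of that check, assuming the standard 1024-bit HBM interface per stack and 16-bit (2-byte) weights — neither of which is stated in the article:

```python
# Back-of-envelope check of the claim. Assumed values (not from the article):
# 1024-bit interface per HBM stack, FP16 weights at 2 bytes/parameter.
PIN_SPEED_GBPS = 9.6    # per-pin data rate quoted for this HBM3E
BUS_WIDTH_BITS = 1024   # assumed: standard HBM interface width per stack
STACKS = 4              # GPU equipped with four HBM3E stacks
PARAMS = 70e9           # Llama 3 70B parameter count
BYTES_PER_PARAM = 2     # assumed: 16-bit weights

per_stack_tbps = PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8 / 1000  # ≈ 1.23 TB/s
total_tbps = per_stack_tbps * STACKS                         # ≈ 4.92 TB/s
model_tb = PARAMS * BYTES_PER_PARAM / 1e12                   # 0.14 TB
reads_per_second = total_tbps / model_tb
print(f"{reads_per_second:.0f} full-parameter reads per second")  # prints "35 ..."
```

Under those assumptions the aggregate bandwidth of four stacks (~4.9 TB/s) divided by the ~140 GB model size lands almost exactly on the quoted figure.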

SK hynix has increased the capacity by 50% by stacking 12 layers of 3 GB DRAM chips at the same overall thickness as the previous eight-layer product. To achieve this, the company made each DRAM chip 40% thinner than before and stacked them vertically using TSV (Through-Silicon Via) technology.
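The capacity figures in the paragraph above are internally consistent, as a quick sketch shows (die count and per-die density are taken from the article):

```python
DIE_GB = 3                    # 3 GB per DRAM die, per the article
old_capacity = 8 * DIE_GB     # previous 8-layer HBM3E stack: 24 GB
new_capacity = 12 * DIE_GB    # new 12-layer HBM3E stack: 36 GB
increase = (new_capacity - old_capacity) / old_capacity
print(new_capacity, f"{increase:.0%}")  # prints "36 50%"
```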

The company also solved the structural issues that arise from stacking thinner chips higher by applying its core technology, the Advanced MR-MUF process. This provides 10% higher heat-dissipation performance than the previous generation and secures the stability and reliability of the product through enhanced warpage control.

"SK hynix has once again broken through technological limits demonstrating our industry leadership in AI memory," said Justin Kim, President (Head of AI Infra) at SK hynix. "We will continue our position as the No.1 global AI memory provider as we steadily prepare next-generation memory products to overcome the challenges of the AI era."
