Monday, November 4th 2024
SK hynix Introduces World's First 16-High HBM3E at SK AI Summit 2024
SK hynix CEO Kwak Noh-Jung, during his keynote speech titled "A New Journey in Next-Generation AI Memory: Beyond Hardware to Daily Life" at the SK AI Summit in Seoul, announced the development of the industry's first 48 GB 16-high HBM3E, the world's highest layer count, following the company's 12-high product. Kwak also shared the company's vision to become a "Full Stack AI Memory Provider", a provider with a full lineup of AI memory products in both the DRAM and NAND spaces, through close collaboration with interested parties.

Summary of CEO Kwak Noh-Jung's comments:
- Memory, which existed on the "personal" level in the past with data stored in personal PCs or smartphones, has advanced to the "connected" level now through cloud services and social media channels
- In future, with further development in AI, memory will exist in an expanded scope of "creativity and experience"
- The "Creative Memory" that SK hynix envisions cannot come into reality without next-generation semiconductor memory that supports strong computing power
- SK hynix has been preparing various "World First" products, being the first in the industry to develop and start volume shipping them
- Company also planning products that are "Beyond Best" with industry-top competitiveness and "Optimal Innovation" for systems in the AI era
- Market for 16-high HBM expected to open up from HBM4 generation, but SK hynix has been developing 48 GB 16-high HBM3E in a bid to secure technological stability and plans to provide samples to customers early next year (see the capacity sketch after this list)
- SK hynix to apply Advanced MR-MUF process, which enabled mass production of 12-high products, to produce 16-high HBM3E, while also developing hybrid bonding technology as a backup
- 16-high products come with performance improvement of 18% in training, 32% in inference vs 12-high products. With market for AI accelerators for inference expected to expand, 16-high products forecast to help company solidify its leadership in AI memory in future
- SK hynix also developing LPCAMM2 module for PC and data center, 1cnm-based LPDDR5 and LPDDR6, taking full advantage of its competitiveness in low-power and high-performance products
- Company also readying PCIe 6th generation SSD, high-capacity QLC-based eSSD and UFS 5.0
- SK hynix plans to adopt logic process on base die from HBM4 generation through collaboration with a top global logic foundry to provide customers with best products
- Customized HBM will be a product with optimized performance that reflects various customer demands for capacity, bandwidth and function, and is expected to pave the way for a new paradigm in AI memory
- As powering AI system requires a sharp increase in capacity of memory installed in servers, SK hynix preparing CXL Fabrics that enables high capacity through connection of various memories, while developing eSSD with ultra-high capacity to allow more data in a smaller space at low power
- SK hynix also developing technology that adds computational functions to memory to overcome the so-called memory wall. Technologies such as Processing near Memory (PNM), Processing in Memory (PIM) and Computational Storage, essential to process enormous amounts of data in future, will be a challenge that transforms the structure of next-generation AI systems and the future of the AI industry
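For context, the 48 GB figure follows from straightforward stack arithmetic: if the 16-high stack uses the same 24 Gb (3 GB) DRAM dies as the existing 36 GB 12-high HBM3E, sixteen layers yield 48 GB per stack. The short Python sketch below is an illustrative back-of-envelope calculation only; the per-die density is inferred from the 12-high product, and the eight-stacks-per-accelerator figure is an assumption rather than an SK hynix specification.

```python
# Back-of-envelope HBM3E capacity comparison (illustrative only).
# Assumes 24 Gb (3 GB) DRAM dies, as in the 36 GB 12-high stacks,
# and a hypothetical accelerator carrying 8 HBM stacks.

GB_PER_DIE = 24 / 8          # a 24 Gb die holds 3 GB
STACKS_PER_ACCELERATOR = 8   # assumption for illustration

for layers in (12, 16):
    per_stack = layers * GB_PER_DIE
    per_package = per_stack * STACKS_PER_ACCELERATOR
    print(f"{layers}-high: {per_stack:.0f} GB per stack, "
          f"{per_package:.0f} GB across {STACKS_PER_ACCELERATOR} stacks")

# Expected output:
# 12-high: 36 GB per stack, 288 GB across 8 stacks
# 16-high: 48 GB per stack, 384 GB across 8 stacks
```

The extra 12 GB per stack is the capacity headroom behind the inference argument above: more of a model's parameters can stay resident in HBM per accelerator without adding stacks.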