Friday, September 13th 2024
SK hynix Presents Upgraded AiMX Solution at AI Hardware and Edge AI Summit 2024
SK hynix unveiled an enhanced Accelerator-in-Memory based Accelerator (AiMX) card at the AI Hardware & Edge AI Summit 2024 held September 9-12 in San Jose, California. Organized annually by Kisaco Research, the summit brings together representatives from the AI and machine learning ecosystem to share industry breakthroughs and developments. This year's event focused on exploring cost and energy efficiency across the entire technology stack. Marking its fourth appearance at the summit, SK hynix highlighted how its AiM products can boost AI performance across data centers and edge devices.
Booth Highlights: Meet the Upgraded AiMX
In the AI era, high-performance memory products are vital for the smooth operation of LLMs. However, as these LLMs are trained on increasingly larger datasets and continue to expand, there is a growing need for more efficient solutions. SK hynix addresses this demand with its PIM product AiMX, an AI accelerator card that combines multiple GDDR6-AiMs to provide high bandwidth and outstanding energy efficiency. At the AI Hardware & Edge AI Summit 2024, SK hynix presented its updated 32 GB AiMX prototype, which offers double the capacity of the original card featured at last year's event. To highlight the new AiMX's advanced processing capabilities in a multi-batch environment, SK hynix held a demonstration of the prototype card with the Llama 3 70B model, an open-source LLM. In particular, the demonstration underlined AiMX's ability to serve as a highly effective attention accelerator in data centers.

AiMX addresses the cost, performance, and power consumption challenges associated with LLMs not only in data centers, but also in edge devices and on-device AI applications. For example, when applied to mobile on-device AI applications, AiMX improves LLM speed three-fold compared to mobile DRAM while maintaining the same power consumption.
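To see why the attention stage is a natural target for a processing-in-memory card, consider how much key-value cache data must be streamed from memory for every generated token. The short Python sketch below is a hypothetical back-of-envelope estimate (not SK hynix's figures), using parameters that approximate Llama 3 70B:

# Rough estimate of why LLM attention decode is memory-bound.
# Illustrative parameters approximating Llama 3 70B; not vendor data.

N_LAYERS = 80        # transformer layers
N_KV_HEADS = 8       # grouped-query attention KV heads
HEAD_DIM = 128       # dimension per attention head
BYTES_PER_VALUE = 2  # FP16/BF16 KV cache entries

def kv_cache_bytes(context_len: int, batch: int) -> int:
    """Bytes of K and V cache streamed for a single decode step."""
    per_token = 2 * N_LAYERS * N_KV_HEADS * HEAD_DIM * BYTES_PER_VALUE
    return per_token * context_len * batch

if __name__ == "__main__":
    for batch in (1, 8, 32):
        gb = kv_cache_bytes(context_len=8192, batch=batch) / 1e9
        # Each new token re-reads this data with little arithmetic per byte,
        # so decode throughput tracks memory bandwidth, not compute.
        print(f"batch={batch:3d}: ~{gb:.1f} GB of KV cache read per token")

Because every decode step re-reads gigabytes of cache while performing comparatively little computation per byte, multi-batch attention is limited by memory bandwidth rather than compute, which is the bottleneck a GDDR6-AiM-based accelerator is designed to relieve.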
Featured Presentation: Accelerating LLM Services from Data Centers to Edge Devices
On the final day of the summit, SK hynix gave a presentation detailing how AiMX is an optimal solution for accelerating LLM services in data centers and edge devices. Euicheol Lim, research fellow and head of the Solution Advanced Technology team, shared the company's plans to develop AiM products for on-device AI based on mobile DRAM and revealed the future vision for AiM. In closing, Lim emphasized the importance of close collaboration with companies involved in developing and managing data centers and edge systems to further advance AiMX products.

Looking Ahead: SK hynix's Vision for AiMX in the AI Era
The AI Hardware & Edge AI Summit 2024 provided a platform for SK hynix to demonstrate AiMX's applications in LLMs across data centers and edge devices. As a low-power, high-speed memory solution able to handle large amounts of data, AiMX is set to play a key role in the advancement of LLMs and AI applications.
Source:
SK hynix