Wednesday, October 25th 2023
Micron Announces Sampling of 9.6 Gbps LPDDR5X Memory
Micron Technology, Inc., announced today that it is now shipping production samples of its low-power double data rate 5X (LPDDR5X) memory - the industry's only 1β (1-beta) mobile-optimized memory - for use with Qualcomm Technologies, Inc.'s latest flagship mobile platform, Snapdragon 8 Gen 3. Running at the world's fastest speed grade of 9.6 gigabits per second (Gbps), Micron LPDDR5X provides the mobile ecosystem with the fast performance needed to unlock generative artificial intelligence (AI) at the edge. Enabled by its innovative, industry-leading 1β process node technology, Micron LPDDR5X also delivers advanced power-saving capabilities for mobile users.
"Generative AI is poised to unleash unprecedented productivity, ease of use, and personalization for smartphone users by delivering the power of large language models to flagship mobile phones," said Mark Montierth, corporate vice president and general manager of Micron's Mobile Business Unit. "Micron's 1β LPDDR5X combined with Qualcomm Technologies' AI-optimized Snapdragon 8 Gen 3 Mobile Platform empowers smartphone manufacturers with the next-generation performance and power efficiency essential to enabling revolutionary AI technology at the edge."As the industry's fastest mobile memory offered in speed grades up to 9.6 Gbps, Micron's LPDDR5X provides over 12% higher peak bandwidth compared to the previous generation - critical for enabling AI at the edge. The Snapdragon 8 Gen 3 allows powerful generative AI models to run locally on flagship smartphones, unlocking a new generation of AI-based applications and capabilities. Enabling on-device AI additionally improves network efficiency and reduces the energy requirements and expense of more costly cloud-based solutions, which require back-and-forth data transfer to and from remote servers.
"To date, powerful generative AI has mostly been executed in the cloud, but our new Snapdragon 8 Gen 3 brings revolutionary generative AI use cases to users' fingertips by enabling large language models and large vision models to run on the device," said Ziad Asghar, senior vice president of product management at Qualcomm Technologies, Inc. "Our collaboration with Micron to pair the industry's fastest mobile memory, its 1β LPDDR5X, with our latest Snapdragon mobile platform opens up a new world of on-device, ultra-personalized AI experiences for smartphone users."
Built on Micron's industry-leading 1β process node and delivering the industry's most advanced power-saving capabilities such as enhanced dynamic voltage and frequency scaling core techniques, LPDDR5X offers a nearly 30% power improvement and the flexibility to deliver workload-customized power and performance. These power savings are especially crucial for energy-intensive, AI-fueled applications, enabling users to reap the benefits of generative AI with prolonged battery life.
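As a rough illustration of why dynamic voltage and frequency scaling saves power, switching power in CMOS logic scales approximately with C·V²·f, so lowering voltage and clock together under light workloads cuts power superlinearly. A minimal sketch with purely illustrative operating points (not Micron figures):

# Hedged sketch of the dynamic-power relation that DVFS exploits: P ≈ C * V^2 * f.
# Capacitance, voltage, and frequency values below are illustrative only.
def dynamic_power(c_eff, voltage, freq_hz):
    """Switching power of a CMOS block: effective capacitance * V^2 * f."""
    return c_eff * voltage ** 2 * freq_hz

nominal = dynamic_power(c_eff=1e-9, voltage=1.05, freq_hz=4.8e9)
scaled = dynamic_power(c_eff=1e-9, voltage=0.90, freq_hz=3.2e9)
print(f"Relative power at the scaled point: {scaled / nominal:.0%}")  # ~49%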
Offered in capacities up to 16 gigabytes and providing the industry's highest performance and lowest power consumption, Micron's LPDDR5X delivers unprecedented support for on-device AI, accelerating generative AI's capabilities at the edge.
Source: Micron
"Generative AI is poised to unleash unprecedented productivity, ease of use, and personalization for smartphone users by delivering the power of large language models to flagship mobile phones," said Mark Montierth, corporate vice president and general manager of Micron's Mobile Business Unit. "Micron's 1β LPDDR5X combined with Qualcomm Technologies' AI-optimized Snapdragon 8 Gen 3 Mobile Platform empowers smartphone manufacturers with the next-generation performance and power efficiency essential to enabling revolutionary AI technology at the edge."As the industry's fastest mobile memory offered in speed grades up to 9.6 Gbps, Micron's LPDDR5X provides over 12% higher peak bandwidth compared to the previous generation - critical for enabling AI at the edge. The Snapdragon 8 Gen 3 allows powerful generative AI models to run locally on flagship smartphones, unlocking a new generation of AI-based applications and capabilities. Enabling on-device AI additionally improves network efficiency and reduces the energy requirements and expense of more costly cloud-based solutions, which require back-and-forth data transfer to and from remote servers.
"To date, powerful generative AI has mostly been executed in the cloud, but our new Snapdragon 8 Gen 3 brings revolutionary generative AI use cases to users' fingertips by enabling large language models and large vision models to run on the device," said Ziad Asghar, senior vice president of product management at Qualcomm Technologies, Inc. "Our collaboration with Micron to pair the industry's fastest mobile memory, its 1β LPDDR5X, with our latest Snapdragon mobile platform opens up a new world of on-device, ultra-personalized AI experiences for smartphone users."
Built on Micron's industry-leading 1β process node and delivering the industry's most advanced power-saving capabilities such as enhanced dynamic voltage and frequency scaling core techniques, LPDDR5X offers a nearly 30% power improvement and the flexibility to deliver workload-customized power and performance. These power savings are especially crucial for energy-intensive, AI-fueled applications, enabling users to reap the benefits of generative AI with prolonged battery life.
Offered in capacities up to 16 gigabytes and providing the industry's highest performance and lowest power consumption, Micron's LPDDR5X delivers unprecedented support for on-device AI, accelerating generative AI's capabilities at the edge.