The secular growth of AI is built on a foundation of high-performance, high-bandwidth memory solutions, which are critical to unlocking the capabilities of GPUs and processors. Micron Technology, Inc., today announced it is the world's first and only memory company shipping both HBM3E and SOCAMM (small outline compression attached memory module) products for AI servers in the data center. This extends Micron's industry leadership in designing and delivering low-power DDR (LPDDR) for data center applications.
Micron's SOCAMM, a modular LPDDR5X memory solution, was developed in collaboration with NVIDIA to support the NVIDIA GB300 Grace Blackwell Ultra Superchip. The Micron HBM3E 12H 36 GB is also designed into the NVIDIA HGX B300 NVL16 and GB300 NVL72 platforms, while the HBM3E 8H 24 GB is available for the NVIDIA HGX B200 and GB200 NVL72 platforms. The deployment of Micron HBM3E products in NVIDIA Hopper and NVIDIA Blackwell systems underscores Micron's critical role in accelerating AI workloads.


Think AI, think memory, think Micron
At GTC 2025, Micron will showcase its complete AI memory and storage portfolio to fuel AI from the data center to the edge, highlighting the deep alignment between Micron and its ecosystem partners. Micron's broad portfolio includes HBM3E 8H 24 GB and HBM3E 12H 36 GB, LPDDR5X SOCAMMs, GDDR7 and high-capacity DDR5 RDIMMs and MRDIMMs. Additionally, Micron offers an industry-leading portfolio of data center SSDs and automotive and industrial products such as UFS4.1, NVMe SSDs and LPDDR5X, all of which are suited for edge compute applications.
"AI is driving a paradigm shift in computing, and memory is at the heart of this evolution. Micron's contributions to the NVIDIA Grace Blackwell platform yield significant performance and power-saving benefits for AI training and inference applications," said Raj Narasimhan, senior vice president and general manager of Micron's Compute and Networking Business Unit. "HBM and LP memory solutions help unlock improved computational capabilities for GPUs."
SOCAMM: a new standard for AI memory performance and efficiency
Micron's SOCAMM solution is now in volume production. The modular SOCAMM solution enables accelerated data processing, superior performance, unmatched power efficiency and enhanced serviceability to provide high-capacity memory for increasing AI workload requirements.
Micron SOCAMM is the world's fastest, smallest, lowest-power and highest-capacity modular memory solution, designed to meet the demands of AI servers and data-intensive applications. This new SOCAMM solution enables data centers to get the same compute capacity with better bandwidth, improved power consumption and scaling capabilities to provide infrastructure flexibility.
- Fastest: SOCAMMs provide over 2.5 times higher bandwidth at the same capacity when compared to RDIMMs, allowing faster access to larger training datasets and more complex models, as well as increasing throughput for inference workloads.
- Smallest: At 14 x 90 mm, the innovative SOCAMM form factor occupies one-third the area of the industry-standard RDIMM form factor, enabling compact, efficient server design.
- Lowest power: Leveraging LPDDR5X memory, SOCAMM products consume one-third the power compared to standard DDR5 RDIMMs, inflecting the power performance curve in AI architectures.
- Highest capacity: SOCAMM solutions use four placements of 16-die stacks of LPDDR5X memory to enable a 128 GB memory module, the highest-capacity LPDDR5X memory solution available, which is essential for faster AI model training and for supporting more concurrent users in inference workloads.
- Optimized scalability and serviceability: SOCAMM's modular design and innovative stacking technology improve serviceability and aid the design of liquid-cooled servers. The enhanced error correction in Micron's LPDDR5X, combined with data center-focused test flows, provides an optimized memory solution designed for the data center.
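The capacity bullet above can be sanity-checked with quick arithmetic. The per-die density is not stated in the release, so the 16 Gb (2 GB) figure below is an assumption chosen so that four placements of 16-die stacks reach the quoted 128 GB:

```python
# Back-of-the-envelope check of the 128 GB SOCAMM capacity figure.
# ASSUMPTION (not in the release): each LPDDR5X die is 16 Gb.
PLACEMENTS = 4        # LPDDR5X package sites per SOCAMM (from the release)
DIES_PER_STACK = 16   # die stack height per placement (from the release)
GBIT_PER_DIE = 16     # assumed per-die density in gigabits

total_gbit = PLACEMENTS * DIES_PER_STACK * GBIT_PER_DIE
total_gb = total_gbit // 8
print(total_gb)  # 128 -> matches the quoted 128 GB module capacity
```

If Micron instead uses denser dies with fewer effective stacks, the product of the three factors would simply be rebalanced; the module total is the constraint.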
Micron extends its competitive lead in the AI industry with the HBM3E 12H 36 GB, which offers 50% more capacity than the HBM3E 8H 24 GB within the same cube form factor. Additionally, the HBM3E 12H 36 GB provides up to 20% lower power consumption than the competition's HBM3E 8H 24 GB offering, while delivering that 50% higher memory capacity.
By continuing to deliver exceptional power and performance metrics, Micron aims to maintain its technology momentum as a leading AI memory solutions provider through the launch of HBM4. Micron's HBM4 solution is expected to boost performance by over 50% compared to HBM3E.
Complete memory and storage solutions designed for AI from the data center to the edge
Micron also has a proven portfolio of storage products designed to meet the growing demands of AI workloads. Rapidly advancing storage performance and power efficiency requires tight collaboration with ecosystem partners to ensure interoperability and a seamless customer experience. Micron delivers SSDs optimized for AI workloads such as inference, training, data preparation, analytics and data lakes. Micron will be showcasing the following storage solutions at GTC:
- High-performance Micron 9550 NVMe and Micron 7450 NVMe SSDs included on the GB200 NVL72 recommended vendor list.
- Micron's PCIe Gen 6 SSD, demonstrating over 27 GB/s of bandwidth in successful interoperability testing with leading PCIe switch and retimer vendors, driving the industry to this new generation of flash storage.
- Storing more data in less space is essential to get the most out of AI data centers. The Micron 61.44 TB 6550 ION NVMe SSD is the drive of choice for bleeding-edge exascale AI cluster storage, delivering over 44 petabytes of storage per rack, with 14 GB/s of throughput and 2 million IOPS per drive within a 20-watt power envelope.
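The per-rack figures in the last bullet follow from simple division. The sketch below assumes decimal units (1 PB = 1000 TB) and a fully populated rack; the exact drive count per rack is not stated in the release:

```python
# Rough math behind the ">44 PB per rack" claim for the 6550 ION.
# ASSUMPTIONS: decimal units (1 PB = 1000 TB), fully populated rack.
DRIVE_TB = 61.44       # capacity per drive (from the release)
RACK_PB = 44.0         # storage per rack (from the release)
DRIVE_WATTS = 20       # power per drive (from the release)

drives = RACK_PB * 1000 / DRIVE_TB
print(round(drives))                            # ~716 drives to reach 44 PB
print(round(drives * DRIVE_WATTS / 1000, 1))    # ~14.3 kW of SSD power per rack
```

At roughly 716 drives, the aggregate sequential bandwidth (716 x 14 GB/s) would far exceed typical rack network fabrics, which is why per-drive efficiency, not raw throughput, is the headline metric here.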
As AI and generative AI expand and are integrated on-device at the edge, Micron is working closely with key ecosystem partners to deliver innovative AI solutions for automotive, industrial and consumer applications. Beyond high performance, these applications demand enhanced quality, reliability and longevity for their usage models.
- One example of this type of ecosystem collaboration is the integration of Micron LPDDR5X on the NVIDIA DRIVE AGX Orin platform. This combined solution provides increased processing performance and bandwidth while also reducing power consumption.
- By utilizing Micron's 1β (1-beta) DRAM node, LPDDR5X memory meets automotive and industrial requirements and offers higher speeds up to 9.6 Gbps and increased capacities from 32 Gb to 128 Gb to support higher bandwidth.
- Additionally, Micron LPDDR5X automotive products support operating environments from -40 degrees Celsius up to 125 degrees Celsius to provide a wide temperature range that meets automotive quality and standards.
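The 9.6 Gbps figure above is a per-pin data rate, so the usable bandwidth depends on the interface width of the design. A minimal sketch, assuming a hypothetical x64 LPDDR5X package interface (widths vary by platform and are not stated in the release):

```python
# Peak bandwidth implied by the LPDDR5X speed grade quoted above.
# ASSUMPTION: a x64 package interface; actual widths vary by design.
PIN_RATE_GBPS = 9.6    # per-pin data rate (from the release)
BUS_WIDTH_BITS = 64    # assumed interface width

bandwidth_gbs = PIN_RATE_GBPS * BUS_WIDTH_BITS / 8
print(bandwidth_gbs)   # 76.8 GB/s peak for the assumed x64 interface
```

Wider system-level interfaces (multiple packages in parallel) scale this figure linearly, which is how automotive SoCs reach their headline memory bandwidths.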
View at TechPowerUp Main Site