News Posts matching #MI325X

ASUS Presents All-New Storage-Server Solutions to Unleash AI Potential at SC24

ASUS today announced its groundbreaking next-generation infrastructure solutions at SC24, featuring a comprehensive lineup powered by AMD and Intel, as well as liquid-cooling solutions designed to accelerate the future of AI. By continuously pushing the limits of innovation, ASUS simplifies the complexities of AI and high-performance computing (HPC) through adaptive server solutions paired with expert cooling and software-development services, tailored for the exascale era and beyond. As a total-solution provider with a distinguished history in pioneering AI supercomputing, ASUS is committed to delivering exceptional value to its customers.

Comprehensive Lineup for AI and HPC Success
To fuel enterprise digital transformation through HPC and AI-driven architecture, ASUS provides a full lineup of server systems powered by AMD and Intel. Startups, research institutions, large enterprises, and government organizations can all find adaptive solutions to unlock value from big data and accelerate business agility.

GIGABYTE Showcases a Leading AI and Enterprise Portfolio at Supercomputing 2024

Giga Computing, a subsidiary of GIGABYTE and an industry leader in generative AI servers and advanced cooling technologies, is showcasing at SC24 how the GIGABYTE enterprise portfolio provides solutions for all applications, from cloud computing to AI to enterprise IT, including energy-efficient liquid-cooling technologies. The portfolio is made more complete by long-term collaborations with leading technology companies and emerging industry leaders, which will be on display at GIGABYTE booth #3123 at SC24 (Nov. 19-21) in Atlanta. The booth is sectioned to spotlight strategic technology collaborations as well as direct liquid-cooling partners.

The GIGABYTE booth will showcase an array of NVIDIA platforms built to keep up with the diverse workloads and varying demands of AI and HPC applications. For a rack-scale AI solution based on the NVIDIA GB200 NVL72 design, GIGABYTE shows how seventy-two GPUs can fit in a single rack: eighteen GIGABYTE servers, each housing two NVIDIA Grace CPUs and four NVIDIA Blackwell GPUs. Another platform at the GIGABYTE booth is the NVIDIA HGX H200, where GIGABYTE exhibits both its liquid-cooled G4L3-SD1 server and an air-cooled version, the G593-SD1.

Lenovo Accelerates Business Transformation with New ThinkSystem Servers Engineered for Optimal AI and Powered by AMD

Today, Lenovo announced its industry-leading ThinkSystem infrastructure solutions powered by AMD EPYC 9005 Series processors, as well as AMD Instinct MI325X accelerators. Backed by 225 of AMD's world-record performance benchmarks, the Lenovo ThinkSystem servers deliver an unparalleled combination of AMD technology-based performance and efficiency to tackle today's most demanding edge-to-cloud workloads, including AI training, inferencing and modeling.

"Lenovo is helping organizations of all sizes and across various industries achieve AI-powered business transformations," said Vlad Rozanovich, Senior Vice President, Lenovo Infrastructure Solutions Group. "Not only do we deliver unmatched performance, we offer the right mix of solutions to change the economics of AI and give customers faster time-to-value and improved total value of ownership."

HPE Launches ProLiant Compute XD685 Servers Powered by 5th Gen AMD EPYC Processors and AMD Instinct MI325X Accelerators

Hewlett Packard Enterprise today announced the HPE ProLiant Compute XD685 for complex AI model training tasks, powered by 5th Gen AMD EPYC processors and AMD Instinct MI325X accelerators. The new HPE system is optimized to quickly deploy high-performing, secure and energy-efficient AI clusters for use in large language model training, natural language processing and multi-modal training.

The race is on to unlock the promise of AI and its potential to dramatically advance outcomes in workforce productivity, healthcare, climate sciences and much more. To capture this potential, AI service providers, governments and large model builders require flexible, high-performance solutions that can be brought to market quickly.

MiTAC Announces New Servers Featuring AMD EPYC 9005 Series CPUs and AMD Instinct MI325X GPUs

MiTAC Computing Technology Corporation, an industry-leading server platform design manufacturer and a subsidiary of MiTAC Holdings Corporation (TSE:3706), today announced the launch of its new high-performance servers, featuring the latest AMD EPYC 9005 Series CPUs and AMD Instinct MI325X accelerators.

"AMD is the trusted data center solutions provider of choice for leading enterprises worldwide, whether they are enabling corporate AI initiatives, building large-scale cloud deployments, or hosting critical business applications on-premises," said Ravi Kuppuswamy, senior vice president, Server Business Unit, AMD. "Our latest 5th Gen AMD EPYC CPUs provide the performance, flexibility and reliability - with compatibility across the x86 data center ecosystem - to deliver tailored solutions that meet the diverse demands of the modern data center."

ASRock Rack Unveils New Server Platforms Supporting AMD EPYC 9005 Series Processors and AMD Instinct MI325X Accelerators at AMD Advancing AI 2024

ASRock Rack Inc., a leading innovative server company, announced upgrades to its extensive lineup to support AMD EPYC 9005 Series processors. Among these updates is the introduction of the new 6U8M-TURIN2 GPU server. This advanced platform features AMD Instinct MI325X accelerators, specifically optimized for intensive enterprise AI applications, and will be showcased at AMD Advancing AI 2024.

ASRock Rack Introduces GPU Servers Powered by AMD EPYC 9005 Series Processors
AMD today revealed the 5th Generation AMD EPYC processors, offering a wide range of core counts (up to 192 cores), frequencies (up to 5 GHz), and expansive cache capacities. Select high-frequency processors, such as the AMD EPYC 9575F, are optimized for use as host CPUs in GPU-enabled systems. Additionally, the just-launched AMD Instinct MI325X accelerators feature substantial HBM3E memory and 6 TB/s of memory bandwidth, enabling quick access and efficient handling of large datasets and complex computations.
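As a rough illustration (not part of the announcement) of why that 6 TB/s figure matters, the back-of-the-envelope sketch below estimates how memory bandwidth bounds single-accelerator LLM inference; the model size and weight precision are assumptions chosen purely for the example:

# Rough, illustrative estimate: memory bandwidth as the ceiling for batch-1
# LLM decoding on one accelerator. Model size and precision are assumptions.
PARAMS = 70e9            # assumed model size: 70 billion parameters (hypothetical)
BYTES_PER_PARAM = 2      # assumed FP16/BF16 weights
BANDWIDTH = 6e12         # 6 TB/s HBM3E bandwidth, per the announcement

weights_bytes = PARAMS * BYTES_PER_PARAM     # ~140 GB of weights
pass_time_s = weights_bytes / BANDWIDTH      # time to stream all weights once
print(f"Weights: {weights_bytes / 1e9:.0f} GB")
print(f"One full weight pass: {pass_time_s * 1e3:.1f} ms "
      f"(~{1 / pass_time_s:.0f} tokens/s upper bound for batch-1 decoding)")

Under these simplified assumptions the limit comes from streaming weights rather than raw compute, which is why HBM3E capacity and bandwidth are the headline features here.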

Supermicro Introduces New Servers and GPU Accelerated Systems with AMD EPYC 9005 Series CPUs and AMD Instinct MI325X GPUs

Supermicro, Inc., a Total IT Solution Provider for AI, Cloud, Storage, and 5G/Edge, announces the launch of a new series of servers, GPU-accelerated systems, and storage servers featuring the AMD EPYC 9005 Series processors and AMD Instinct MI325X GPUs. The new H14 product line represents one of the most extensive server families in the industry, including Supermicro's Hyper systems, the Twin multi-node servers, and AI inferencing GPU systems, all available with air or liquid cooling options. The new "Zen 5" processor core architecture implements full data path AVX-512 vector instructions for CPU-based AI inference and provides 17% better instructions per cycle (IPC) than the previous 4th generation EPYC processor, enabling more performance per core.

Supermicro's new H14 family uses the latest 5th Gen AMD EPYC processors, which enable up to 192 cores per CPU at up to 500 W TDP (thermal design power). Supermicro has designed new H14 systems, including the Hyper and FlexTwin systems, that can accommodate the higher thermal requirements. The H14 family also includes three systems for AI training and inference workloads supporting up to 10 GPUs, all using the AMD EPYC 9005 Series CPU as the host processor, two of which support the AMD Instinct MI325X GPU.

AMD Launches Instinct MI325X Accelerator for AI Workloads: 256 GB HBM3E Memory and 2.6 PetaFLOPS FP8 Compute

During its "Advancing AI" conference today, AMD updated its AI accelerator portfolio with the Instinct MI325X accelerator, designed to succeed the MI300X. Built on the CDNA 3 architecture, the Instinct MI325X brings a suite of improvements over the older SKU. The MI325X now features 256 GB of HBM3E memory running at 6 TB/s of bandwidth; the memory capacity alone is roughly a 1.3x increase over the MI300X's 192 GB of regular HBM3. Providing more memory capacity is crucial as upcoming AI workloads train models with parameter counts measured in trillions, as opposed to the billions in today's models. On the compute side, the Instinct MI325X provides 1.3 PetaFLOPS of FP16 and 2.6 PetaFLOPS of FP8 throughput for training and inference, which the company positions as a 1.3x improvement over the Instinct MI300.
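For readers who want to check the arithmetic, here is a minimal sketch using only the figures quoted above (192 GB for the MI300X; 256 GB, 1.3 PetaFLOPS FP16, and 2.6 PetaFLOPS FP8 for the MI325X):

# Sanity-check the generational uplift from the quoted figures.
mi300x_hbm_gb = 192                   # MI300X: 192 GB HBM3
mi325x_hbm_gb = 256                   # MI325X: 256 GB HBM3E
fp16_pflops, fp8_pflops = 1.3, 2.6    # MI325X peak throughput, per the article

print(f"Memory capacity uplift: {mi325x_hbm_gb / mi300x_hbm_gb:.2f}x")  # ~1.33x
print(f"FP8 vs FP16 throughput: {fp8_pflops / fp16_pflops:.1f}x")       # 2.0x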

A chip alone is worthless without a good platform, and AMD has made the Instinct MI325X OAM modules a drop-in replacement for the existing MI300X platform, as the two are pin-compatible. A system packing eight MI325X accelerators offers 2 TB of HBM3E memory with 48 TB/s of aggregate memory bandwidth, and achieves 10.4 PetaFLOPS of FP16 and 20.8 PetaFLOPS of FP8 compute performance. AMD uses NVIDIA's H200 HGX platform as its reference point, claiming that the Instinct MI325X system outperforms it by 1.3x across the board in memory bandwidth and FP16/FP8 compute performance, and by 1.8x in memory capacity.
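The eight-GPU platform figures follow directly from the per-accelerator numbers; the short sketch below reproduces them, assuming simple linear scaling across the eight OAM modules (which is what the quoted totals imply):

# Aggregate the per-accelerator specs to the eight-way MI325X platform.
GPUS_PER_SYSTEM = 8
hbm_capacity_tb = GPUS_PER_SYSTEM * 256 / 1000    # 256 GB each -> ~2 TB total
hbm_bandwidth_tbs = GPUS_PER_SYSTEM * 6           # 6 TB/s each -> 48 TB/s
fp16_pflops = GPUS_PER_SYSTEM * 1.3               # -> 10.4 PetaFLOPS FP16
fp8_pflops = GPUS_PER_SYSTEM * 2.6                # -> 20.8 PetaFLOPS FP8

print(f"{hbm_capacity_tb:.1f} TB HBM3E, {hbm_bandwidth_tbs} TB/s aggregate bandwidth")
print(f"{fp16_pflops:.1f} PFLOPS FP16, {fp8_pflops:.1f} PFLOPS FP8")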