AMD today announced that Oracle Cloud Infrastructure (OCI) has chosen AMD Instinct MI300X accelerators with ROCm open software to power its newest OCI Compute Supercluster instance, called BM.GPU.MI300X.8. For AI models that can comprise hundreds of billions of parameters, the OCI Supercluster with AMD MI300X supports up to 16,384 GPUs in a single cluster by harnessing the same ultrafast network fabric technology used by other accelerators on OCI. Designed to run demanding AI workloads, including large language model (LLM) inference and training, that require high throughput along with leading memory capacity and bandwidth, these OCI bare metal instances have already been adopted by companies including Fireworks AI.
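For readers who want to try the new shape, the snippet below is a minimal sketch of requesting a BM.GPU.MI300X.8 bare metal instance through the standard OCI Python SDK; the compartment, subnet, image OCIDs, and availability domain are placeholders, and which regions and images offer the shape will depend on your tenancy.

```python
import oci

# Load credentials from the default OCI config file (~/.oci/config).
config = oci.config.from_file()
compute = oci.core.ComputeClient(config)

# Placeholder OCIDs and availability domain -- substitute values from your own tenancy.
details = oci.core.models.LaunchInstanceDetails(
    availability_domain="Uocm:PHX-AD-1",            # example AD name
    compartment_id="ocid1.compartment.oc1..xxxx",   # placeholder
    shape="BM.GPU.MI300X.8",                        # 8x AMD Instinct MI300X bare metal shape
    display_name="mi300x-llm-node",
    create_vnic_details=oci.core.models.CreateVnicDetails(
        subnet_id="ocid1.subnet.oc1..xxxx",         # placeholder
    ),
    source_details=oci.core.models.InstanceSourceViaImageDetails(
        image_id="ocid1.image.oc1..xxxx",           # placeholder GPU-enabled image
    ),
)

instance = compute.launch_instance(details).data
print(instance.id, instance.lifecycle_state)
```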
"AMD Instinct MI300X and ROCm open software continue to gain momentum as trusted solutions for powering the most critical OCI AI workloads," said Andrew Dieckmann, corporate vice president and general manager, Data Center GPU Business, AMD. "As these solutions expand further into growing AI-intensive markets, the combination will benefit OCI customers with high performance, efficiency, and greater system design flexibility."
"The inference capabilities of AMD Instinct MI300X accelerators add to OCI's extensive selection of high-performance bare metal instances to remove the overhead of virtualized compute commonly used for AI infrastructure," said Donald Lu, senior vice president, software development, Oracle Cloud Infrastructure. "We are excited to offer more choice for customers seeking to accelerate AI workloads at a competitive price point."
Bringing Trusted Performance and Open Choice for AI Training and Inference
Extensive testing of the AMD Instinct MI300X, validated by OCI, underscored its AI inferencing and training capabilities for serving latency-optimal use cases even with larger batch sizes, as well as its ability to fit the largest LLMs in a single node. These Instinct MI300X performance results have garnered the attention of AI model developers.
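A rough back-of-the-envelope check illustrates the single-node claim: assuming the MI300X's published 192 GB of HBM3 per GPU (a spec not stated in the announcement), an eight-GPU node offers about 1.5 TB of GPU memory, enough to hold the weights of a hypothetical 400-billion-parameter model in 16-bit precision with headroom left for KV cache and activations.

```python
# Back-of-the-envelope memory check for a single BM.GPU.MI300X.8 node.
# Assumes 192 GB HBM3 per MI300X (published spec) and 2-byte (FP16/BF16) weights.
GPUS_PER_NODE = 8
HBM_PER_GPU_GB = 192
BYTES_PER_PARAM = 2                    # FP16 / BF16

node_memory_gb = GPUS_PER_NODE * HBM_PER_GPU_GB            # 1536 GB total HBM
params_billions = 400                                       # hypothetical large LLM
weights_gb = params_billions * BYTES_PER_PARAM              # ~800 GB of weights

print(f"Node HBM: {node_memory_gb} GB, weights: {weights_gb} GB, "
      f"headroom: {node_memory_gb - weights_gb} GB for KV cache and activations")
```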
Fireworks AI offers a fast platform designed to build and deploy generative AI. With more than 100 models, Fireworks AI is leveraging the performance of AMD Instinct MI300X on OCI.
"Fireworks AI helps enterprises build and deploy compound AI systems across a wide range of industries and use cases," said Lin Qiao, CEO of Fireworks AI. "The amount of memory capacity available on the AMD Instinct MI300X and ROCm open software allows us to scale services to our customers as models continue to grow."
View at TechPowerUp Main Site | Source
"AMD Instinct MI300X and ROCm open software continue to gain momentum as trusted solutions for powering the most critical OCI AI workloads," said Andrew Dieckmann, corporate vice president and general manager, Data Center GPU Business, AMD. "As these solutions expand further into growing AI-intensive markets, the combination will benefit OCI customers with high performance, efficiency, and greater system design flexibility."
"The inference capabilities of AMD Instinct MI300X accelerators add to OCI's extensive selection of high-performance bare metal instances to remove the overhead of virtualized compute commonly used for AI infrastructure," said Donald Lu, senior vice president, software development, Oracle Cloud Infrastructure. "We are excited to offer more choice for customers seeking to accelerate AI workloads at a competitive price point."
Bringing Trusted Performance and Open Choice for AI Training and Inference
The AMD Instinct MI300X underwent extensive testing which was validated by OCI that underscored its AI inferencing and training capabilities for serving latency-optimal use cases, even with larger batch sizes, and the ability to fit the largest LLM models in a single node. These Instinct MI300X performance results have garnered the attention of AI model developers.
Fireworks AI offers a fast platform designed to build and deploy generative AI. With over 100+ models, Fireworks AI is leveraging the benefits of performance found in OCI using AMD Instinct MI300X.
"Fireworks AI helps enterprises build and deploy compound AI systems across a wide range of industries and use cases," said Lin Qiao, CEO of Fireworks AI. "The amount of memory capacity available on the AMD Instinct MI300X and ROCm open software allows us to scale services to our customers as models continue to grow."
View at TechPowerUp Main Site | Source