Korean media reports that Samsung Electronics has signed a 4.134 trillion won ($3 billion) agreement with AMD to supply 12-high HBM3E stacks. AMD uses HBM stacks in its AI and HPC accelerators based on its CDNA architecture. The deal is significant because it gives analysts some idea of the volume of AI GPUs AMD is preparing to push into the market, provided they can estimate what share of an AI GPU's bill of materials the memory stacks account for. AMD has probably negotiated a good price for Samsung's HBM3E 12H stacks, given that rival NVIDIA almost exclusively uses HBM3E made by SK Hynix.
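Purely to illustrate that back-of-the-envelope reasoning, here is a minimal Python sketch. The per-stack price is a hypothetical placeholder, not a figure from the report; only the deal value and the 8-stack configuration (cited in the next paragraph) come from the article.

```python
# Back-of-envelope estimate of how many accelerators a $3 billion HBM supply
# deal could cover. The per-stack price is an assumed placeholder for
# illustration only; it is not a reported figure.
DEAL_VALUE_USD = 3_000_000_000          # reported value of the Samsung-AMD agreement
ASSUMED_PRICE_PER_STACK_USD = 500       # hypothetical HBM3E 12H price per stack
STACKS_PER_ACCELERATOR = 8              # 8-stack CDNA3 configuration cited in the article

stacks_supplied = DEAL_VALUE_USD / ASSUMED_PRICE_PER_STACK_USD
accelerators_supported = stacks_supplied / STACKS_PER_ACCELERATOR
print(f"~{stacks_supplied:,.0f} stacks, enough for ~{accelerators_supported:,.0f} accelerators")
```

Changing the assumed per-stack price shifts the result proportionally, which is why the memory share of the bill of materials is the key unknown for analysts.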
The AI GPU market is expected to heat up with the ramp of NVIDIA's "Hopper" H200 series, the advent of "Blackwell," AMD's MI350X CDNA3, and Intel's Gaudi 3 generative AI accelerator. Samsung debuted its HBM3E 12H memory in February 2024. Each stack features 12 layers, a 50% increase over first-generation HBM3E, and offers a capacity of 36 GB per stack. An AMD CDNA3 chip with 8 such stacks would have 288 GB of memory on package. AMD is expected to launch the MI350X in the second half of 2024. The star attraction of this chip is its refreshed GPU tiles, built on TSMC's 4 nm EUV foundry node. This seems like the ideal product for AMD to debut HBM3E 12H on.
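A quick worked calculation, using only the numbers quoted above, shows how the per-stack and per-package capacities line up:

```python
# Capacity arithmetic from the article's own figures: 12 layers per stack and
# 36 GB per stack imply 3 GB (24 Gbit) per DRAM layer, and 8 stacks per
# package give 288 GB of memory on package.
LAYERS_PER_STACK = 12
GB_PER_STACK = 36
STACKS_PER_PACKAGE = 8

gb_per_layer = GB_PER_STACK / LAYERS_PER_STACK      # 3.0 GB per layer
gb_per_package = GB_PER_STACK * STACKS_PER_PACKAGE  # 288 GB on package
print(gb_per_layer, gb_per_package)                 # 3.0 288
```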
View at TechPowerUp Main Site | Source