Samsung has reportedly doubled its manufacturing output of HBM2 (high-bandwidth memory 2) stacks. Even so, the company may still fall short of demand for HBM2, according to HPC expert Glenn K. Lockwood, tweeting from ISC 2018, the annual HPC industry event held from June 24th to 28th in Frankfurt. There, Samsung was presenting its 2nd-generation "Aquabolt" HBM2 memory, which it rates at up to 8 times faster than GDDR5, with up to 307 GB/s of bandwidth from a single stack.
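As a rough sanity check of that 307 GB/s per-stack figure, here is a minimal sketch that computes peak bandwidth from per-pin data rate and interface width. The Aquabolt pin rate (2.4 Gbps) and the 1024-bit per-stack HBM2 interface are assumptions not stated in the post, and the GDDR5 baseline (an 8 Gbps chip on a 32-bit bus) is only one possible comparison point, so the resulting multiplier depends on which GDDR5 configuration you compare against.

```python
# Back-of-the-envelope check of the "307 GB/s per stack" figure quoted above.
# Assumed (not stated in the post): Aquabolt at 2.4 Gbps per pin on a
# 1024-bit per-stack interface; GDDR5 baseline of 8 Gbps on a 32-bit chip.

HBM2_PIN_RATE_GBPS = 2.4      # per-pin transfer rate, Aquabolt (assumed)
HBM2_BUS_WIDTH_BITS = 1024    # interface width of one HBM2 stack (assumed)

GDDR5_PIN_RATE_GBPS = 8.0     # per-pin transfer rate of a fast GDDR5 chip (assumed baseline)
GDDR5_BUS_WIDTH_BITS = 32     # interface width of one GDDR5 chip (assumed baseline)

def bandwidth_gbs(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: (pin rate in Gbit/s * bus width in bits) / 8 bits per byte."""
    return pin_rate_gbps * bus_width_bits / 8

hbm2_stack = bandwidth_gbs(HBM2_PIN_RATE_GBPS, HBM2_BUS_WIDTH_BITS)
gddr5_chip = bandwidth_gbs(GDDR5_PIN_RATE_GBPS, GDDR5_BUS_WIDTH_BITS)

print(f"HBM2 (Aquabolt) per stack: {hbm2_stack:.1f} GB/s")   # ~307.2 GB/s
print(f"GDDR5 per chip:            {gddr5_chip:.1f} GB/s")   # 32.0 GB/s
print(f"Ratio:                     {hbm2_stack / gddr5_chip:.1f}x")
```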
While HBM2 is uncommon on consumer graphics cards (barring AMD's flagship Radeon RX Vega series and NVIDIA's TITAN V), the memory type is in high demand for HPC accelerators, which are mostly GPU-based, such as the AMD Radeon Instinct and NVIDIA Tesla series. The HPC industry itself is riding a gold rush of AI research built on deep learning and neural networks. FPGAs, chips that can be purpose-built for specific applications, are the other class of devices soaking up HBM2 inventories. This high demand, coupled with already-high DRAM prices, could keep HBM2 too expensive for mainstream client applications.
View at TechPowerUp Main Site