
NVIDIA Preparing "SOCAMM" Memory Standard for AI PCs Similar to Project DIGITS
NVIDIA and its memory partners, SK Hynix, Samsung, and Micron, are preparing a new memory form factor called System on Chip Advanced Memory Module, or SOCAMM for short. Adapting the now well-known CAMM memory module standard, the technology aims to bring additional memory density to NVIDIA systems. Inspired by NVIDIA's Project DIGITS, it has been developed by NVIDIA independently, outside of any official memory consortium such as JEDEC. Utilizing a detachable module design, SOCAMM delivers superior specifications compared to existing solutions, featuring 694 I/O ports (versus LPCAMM's 644 and traditional DRAM's 260), LPDDR5X integrated directly on the memory substrate, and a more compact form factor.
For reference, the current-generation Project DIGITS uses the NVIDIA GB10 Grace Blackwell Superchip, capable of delivering one PetaFLOP of FP4 compute, paired with 128 GB of LPDDR5X memory. This configuration limits the size of AI models that can run locally on the device: a single Project DIGITS AI PC handles models of up to around 200 billion parameters, while two stacked Project DIGITS PCs are needed for models like the 405-billion-parameter Llama 3.1 405B. Most interestingly, memory capacity is the primary limiting factor, which is why NVIDIA is devoting its efforts to the more memory-dense SOCAMM standard. Being a replacement compatible with LPDDR5, it can reuse the same controller silicon IP, with only the SoC substrate modified to fit the new memory. With NVIDIA rumored to enter the consumer PC market this year, we could also see an early implementation in consumer PC products, but the next-generation Project DIGITS is the primary target.
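To see why 128 GB caps a single unit at roughly 200 billion parameters, a rough back-of-the-envelope sketch helps. The calculation below assumes weights stored at 4 bits (FP4) per parameter plus a notional 1.2x overhead factor for KV cache and runtime buffers; the overhead figure is our assumption for illustration, not a number from NVIDIA.

```python
# Rough estimate of memory needed to hold a model's weights at FP4,
# illustrating why 128 GB of LPDDR5X limits a single Project DIGITS
# to ~200B-parameter models while Llama 3.1 405B needs two units.
# The 1.2x overhead factor (KV cache, activations, buffers) is an assumption.

def fp4_footprint_gb(params_billion: float, overhead: float = 1.2) -> float:
    """Approximate memory (GB) for weights at 4 bits (0.5 bytes) per parameter."""
    bytes_per_param = 0.5                     # FP4 = 4 bits per weight
    weights_gb = params_billion * 1e9 * bytes_per_param / 1e9
    return weights_gb * overhead

for name, size_b in [("200B-class model", 200), ("Llama 3.1 405B", 405)]:
    need = fp4_footprint_gb(size_b)
    units = -(-need // 128)                   # ceiling division over 128 GB units
    print(f"{name}: ~{need:.0f} GB -> {int(units)} x 128 GB Project DIGITS")
```

Under these assumptions, a 200B-parameter model lands around 120 GB and fits in one 128 GB system, while 405B parameters come to roughly 243 GB and require two stacked units, matching the configurations described above.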