Samsung has introduced SOCAMM2, an LPDDR5-based memory module tailored for AI data center platforms. The new module promises several performance gains, most notably for AI workloads.
Initially developed by Dell, the Compression Attached Memory Module (CAMM) has since evolved into an industry standard, with SOCAMM2 as its latest iteration. Built on LPDDR5, the low-power memory technology typically found in smartphones and tablets, SOCAMM2 delivers performance comparable to DDR memory at significantly lower power consumption.
Compared to the standard DDR5 RDIMMs used in servers, SOCAMM2 offers double the bandwidth while consuming 55% less power. Its compact design also allows denser memory configurations by stacking multiple DRAM dies in a single package, yielding a smaller footprint than traditional DDR5 modules. SOCAMM2 can be deployed alongside standard DDR memory or on its own as system memory, making it versatile across a range of applications.
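Taken together, the two figures quoted above imply a sizable efficiency gain. A back-of-the-envelope sketch (using normalized units purely for illustration, not vendor-measured data) shows how doubling bandwidth while cutting power by 55% compounds into bandwidth-per-watt:

```python
# Normalized baseline for a standard DDR5 RDIMM (arbitrary units).
ddr5_bandwidth = 1.0
ddr5_power = 1.0

# SOCAMM2 per the figures quoted above: 2x bandwidth, 55% less power.
socamm2_bandwidth = ddr5_bandwidth * 2.0
socamm2_power = ddr5_power * (1 - 0.55)

gain = (socamm2_bandwidth / socamm2_power) / (ddr5_bandwidth / ddr5_power)
print(f"Bandwidth-per-watt improvement: {gain:.1f}x")  # roughly 4.4x
```

In other words, if both headline numbers hold, SOCAMM2 would deliver roughly 4.4 times the bandwidth per watt of a comparable DDR5 RDIMM.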
Jim Handy, president of Objective Analysis, notes that SOCAMM addresses genuine industry needs rather than simply repackaging existing technology. Processor manufacturers, particularly Nvidia, support SOCAMM due to its ability to provide a faster interface while accommodating a significant amount of memory in a smaller, power-efficient footprint.
While SOCAMM2’s manufacturing process, which relies on stacked memory, would normally imply higher costs, industry observations suggest that memory vendors are offering similar stack configurations at prices comparable to traditional DRAM. SK Hynix, another key player in the memory sector, has confirmed plans to support SOCAMM2, adding to its anticipated rollout alongside Nvidia’s Vera Rubin platform, scheduled for release in the second quarter of 2026.