The abandonment of SOCAMM1, if the reports are accurate, resets what was expected to be a fast-tracked rollout of modular LPDDR-based memory in Nvidia’s data center stack. SOCAMM has been positioned as a new class of high-bandwidth, low-power memory for AI servers, delivering benefits comparable to HBM's but at a lower cost.