Kioxia is working with Nvidia to build extremely fast AI SSDs that connect directly to GPUs, augmenting high-bandwidth memory (HBM), with availability targeted for 2027.
The abandonment of SOCAMM1, if accurate, would reset what was expected to be a fast-tracked rollout of modular LPDDR-based memory in Nvidia’s data center stack. SOCAMM has been positioned as a new class of high-bandwidth, low-power memory for AI servers, delivering benefits similar to HBM’s but at a lower cost.