Kioxia is working with Nvidia to build extremely fast AI SSDs that connect directly to GPUs and augment high-bandwidth memory (HBM), with availability targeted for 2027.

As reported by Japan’s Nikkei, Koichi Fukuda, chief engineer of Kioxia’s SSD application technology division, presented at an AI market technology briefing. He said that, at Nvidia’s request, Kioxia was developing a 100 million IOPS SSD; two of them would be directly connected to an Nvidia GPU to deliver a combined 200 million IOPS and partially replace HBM for GenAI workloads. The AI SSD will also support PCIe 7.0.

Fukuda said: “We will proceed with development in a way that meets Nvidia’s proposals and requests.”

Fukuda repeated that the AI SSD would be directly connected to the GPU and would support PCIe Gen 7, which offers four times the bandwidth of PCIe Gen 5. That scaling would suggest a Gen 2 XL-Flash device on PCIe Gen 7 could push out roughly 14 million random read and 2.8 million random write IOPS, which is still a long way short of 100 million IOPS.
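As a rough illustration of that gap, the sketch below scales an assumed PCIe Gen 5 baseline of about 3.5 million random read and 0.7 million random write IOPS for a Gen 2 XL-Flash drive (figures implied by the 4x extrapolation above, not stated at the briefing) by the Gen 5 to Gen 7 bandwidth jump. The result lands at 14 million/2.8 million IOPS, roughly a factor of seven below the 100 million IOPS target per drive.

```python
# Back-of-the-envelope scaling. The baseline IOPS figures are assumptions
# implied by the article's 4x extrapolation, not numbers from the briefing.
PCIE_GEN5_GTS = 32    # PCIe Gen 5 raw rate, GT/s per lane
PCIE_GEN7_GTS = 128   # PCIe Gen 7 raw rate, GT/s per lane (two doublings)

bandwidth_ratio = PCIE_GEN7_GTS / PCIE_GEN5_GTS   # = 4.0

# Assumed Gen 2 XL-Flash performance on PCIe Gen 5
baseline_iops = {"random_read": 3_500_000, "random_write": 700_000}

# Scale naively with interface bandwidth
scaled_iops = {op: int(v * bandwidth_ratio) for op, v in baseline_iops.items()}
print(scaled_iops)  # {'random_read': 14000000, 'random_write': 2800000}

target = 100_000_000
shortfall = target / scaled_iops["random_read"]
print(f"~{shortfall:.0f}x short of the 100 million IOPS target per drive")
```

The point of the sketch is simply that a faster PCIe link alone does not close the gap; reaching 100 million IOPS per drive would require a substantially different controller and media architecture, not just a Gen 7 interface.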