Samsung Electronics is making significant strides in the AI memory market with the development of custom high-bandwidth memory (HBM) solutions. At the Samsung Foundry Forum 2024, the company's memory division announced ongoing partnerships with major clients, including AMD and Apple, to develop tailored HBM products. Samsung expects these custom HBM solutions to become commercially available as HBM4 enters mass production.

The custom HBM is designed to offer a range of options in performance, power, and area (PPA), delivering greater value than current offerings. A key innovation is the 3D stacking of HBM DRAM on customer-specific logic dies, which significantly reduces a chip's power consumption and footprint.

Samsung's custom HBM is integrated directly onto the system-on-chip (SoC), eliminating the need for interposers and substrates and thereby substantially reducing power and area. Unlike existing AI chips, which route data through intermediate components, the custom HBM moves data directly between the SoC and the HBM, minimizing both space and power usage.

Furthermore, by moving memory input/output (I/O) and controller functions from the accelerator to the HBM base die, more logic area on the accelerator is freed up for AI functionality. Samsung anticipates offering the custom HBM in various packaging forms to meet customer needs, including direct integration on the SoC and the addition of extensive logic functions on the base die.

During the forum, Samsung President Choi highlighted the custom HBM in his keynote, emphasizing its advantages as an all-in-one solution encompassing IP, middleware, HBM, logic, testing, and packaging.