Samsung Electronics Co. aims to introduce its sixth-generation, top-performance High Bandwidth Memory 4 (HBM4) DRAM chips in 2025 to win the intensifying battle for dominance in the fast-growing artificial intelligence chip segment, a company executive said on Tuesday.

Hwang Sang-joon, executive vice president of DRAM product and technology at Samsung, said the company is developing HBM4 while preparing to supply samples of its fifth-generation HBM3E to customers.

HBM is a high-capacity, high-performance memory chip, and demand for it is soaring as it is used to power generative AI services such as ChatGPT, high-performance data centers and machine learning platforms.

“Samsung commercialized HBM for high-performance computing (HPC) in 2016 for the first time in the world,” Hwang said in a contribution to Samsung Newsroom, the company’s public relations website. “We pioneered the AI memory chip market while mass-producing second- to fourth-generation HBM products.”