Since December of last year, the global frenzy over ChatGPT, the generative AI chatbot, has rapidly increased demand for High Bandwidth Memory (HBM), a high-performance DRAM capable of processing large-scale data.

According to semiconductor industry sources on July 31, domestic memory chipmakers such as Samsung Electronics and SK hynix are pushing to expand dedicated HBM lines. HBM, a high-performance form of DRAM, delivers more than ten times the data capacity and speed of conventional DRAM. Memory makers complete HBM by vertically stacking DRAM dies and connecting them electrically through advanced packaging processes that drill fine holes, known as through-silicon vias, through the chips.

The two companies plan to invest more than 2 trillion won by the end of next year to more than double the current production capacity of their HBM lines. SK hynix plans to use spare space at its Cheongju plant in addition to its existing HBM production base in Icheon. Samsung Electronics is considering expanding its core HBM line in Cheonan, South Chungcheong Province, where the Advanced Packaging team under its Device Solutions Division is located.