The artificial intelligence (AI) chip market is booming, with NVIDIA's shares climbing and its market capitalization settling at around $3.23 trillion, supported in part by two key technologies: TSMC's advanced CoWoS packaging and the increasingly prevalent High Bandwidth Memory (HBM). NVIDIA's state-of-the-art H200 chip, the first to adopt the HBM3E memory specification, exemplifies this trend.

As AI demand continues to grow, memory giants such as Samsung, SK Hynix, and Micron have made HBM one of their key production priorities. The surge in HBM demand has not only invigorated the memory chip industry but has also potentially led to shortages and price increases in general-purpose DRAM and intensified technological competition.

DRAM products fall into four main categories: DDR, LPDDR, GDDR, and HBM, with HBM demand driven primarily by the AI market. DDR is mainly used in consumer electronics, servers, and PCs; LPDDR in smartphones, other mobile devices, and automotive systems; and GDDR in graphics cards for image processing.

HBM1, introduced by SK Hynix in 2014, offered a 4-die stack with 128GB/s of bandwidth and 4GB of capacity, outperforming contemporary GDDR5. HBM2, launched in 2018, typically features an 8-die stack, providing 256GB/s of bandwidth at a 2.4Gbps per-pin transfer rate and 8GB of capacity. HBM3, introduced two years ago, increased the number of stacked dies and channels, offering a 6.4Gbps per-pin transfer rate, a maximum bandwidth of 819GB/s, and 16GB of capacity. SK Hynix's enhanced version, HBM3E, delivers an impressive 8Gbps per-pin transfer rate and 24GB of capacity, with mass production beginning in 2024.
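For readers who want to see how these per-pin transfer rates relate to the quoted per-stack bandwidths, the following is a minimal sketch in Python. It assumes the standard 1,024-bit HBM interface (not stated in the article) and infers HBM1's 1Gbps per-pin rate from its quoted 128GB/s; HBM2 is omitted because its quoted figures vary by speed grade.

```python
# Minimal sketch: mapping per-pin transfer rate to per-stack bandwidth,
# assuming the standard 1,024-bit interface used from HBM1 through HBM3E.

HBM_INTERFACE_BITS = 1024  # data pins per HBM stack (assumption, not stated in the article)

def stack_bandwidth_gb_s(pin_rate_gbps: float) -> float:
    """Peak bandwidth of one HBM stack in GB/s for a given per-pin rate in Gbps."""
    return pin_rate_gbps * HBM_INTERFACE_BITS / 8  # divide by 8 to convert bits to bytes

# HBM1's 1.0 Gbps per-pin rate is inferred from its 128 GB/s figure; the others come from the text.
for name, pin_rate in [("HBM1", 1.0), ("HBM3", 6.4), ("HBM3E", 8.0)]:
    print(f"{name}: {pin_rate} Gbps/pin -> {stack_bandwidth_gb_s(pin_rate):.0f} GB/s per stack")

# Expected output:
#   HBM1: 1.0 Gbps/pin -> 128 GB/s per stack
#   HBM3: 6.4 Gbps/pin -> 819 GB/s per stack
#   HBM3E: 8.0 Gbps/pin -> 1024 GB/s per stack
```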

HBM supply for this year has been fully booked, and next year's capacity is largely sold out, according to SK Hynix, Samsung, and Micron. The three major manufacturers have entered a capacity race: SK Hynix is significantly expanding production of its fifth-generation 10nm-class (1b) DRAM to meet rising demand for HBM and DDR5 DRAM, Samsung announced a 2.9-fold increase in HBM capacity this year, and Micron is building an advanced HBM testing production line in the United States while considering HBM production in Malaysia for the first time.

SK Hynix, with its leading HBM3 product performance, has secured orders from NVIDIA and become its main supplier. Samsung is focusing on orders from certain cloud customers, while Micron has skipped HBM3 entirely to concentrate on HBM3E products. All three giants are sparing no effort in expanding capacity: SK Hynix plans to invest up to $74.8 billion by 2028, with about 80% allocated to HBM R&D and production, and has also pulled the mass-production timeline for its next-generation HBM4 chip forward to next year.

Analysts predict that HBM supply will fall short of demand by about 5.5% this year and 3.5% next year. However, data from Dolphin Research indicates that HBM is expected to shift from a supply shortage to a supply surplus by the end of this year.