Micron Technology, alongside other leading artificial intelligence (AI) semiconductor companies such as Intel, has ramped up efforts to challenge the industry front-runners. According to semiconductor industry sources on Dec. 24, Micron, the world's third-largest DRAM manufacturer, is undergoing quality assessments by major clients for its newly developed HBM3E.

High Bandwidth Memory (HBM) is a type of DRAM used in AI servers, in which multiple DRAM dies are stacked to boost data processing capacity and speed. The global HBM market, presently dominated by Samsung Electronics and SK hynix, is projected to grow from approximately 2.5 trillion won this year to about 8 trillion won by 2028.

With a forecasted 2023 market share of about 5%, Micron ranks third. The company is betting big on its next-generation product, HBM3E, to close the gap with the leaders. Sanjay Mehrotra, CEO of Micron, stated, “We are in the final stages of validation to supply HBM3E for Nvidia’s next-generation AI accelerators.”

The AI accelerator market, in which HBM is packaged together with a GPU, is also witnessing fierce competition. AI accelerators are semiconductors specialized for large-scale data training and inference, and they are considered essential in the generative AI era.

Intel is exerting significant effort in this area. On Dec. 14, Intel unveiled a prototype of its next-generation AI accelerator, Gaudi3, featuring up to four times the speed of its predecessor and 1.5 times the HBM capacity. Pat Gelsinger, CEO of Intel, has been openly critical of Nvidia, the leader in AI accelerators, stating, “As the importance of data inference [services] grows, Nvidia’s era, focused on data training, will also come to an end.”

The counterattack by third-place players across AI semiconductor sectors is driven by the market’s growth potential. According to AMD, the AI accelerator market, currently valued at US$30 billion, is expected to expand to US$150 billion by 2027. The HBM market is also predicted to grow by 50% annually over the next five years.

Customers’ desire to check the dominance of leading companies is also spurring aggressive market entries by newer players. For instance, major AI accelerator buyers like Microsoft are urging companies such as AMD and Intel to develop Nvidia alternatives. In the HBM market, one client has supported Micron’s new product development with an advance payment of US$600 million.

Market leaders are focusing on widening their lead with new products. In the HBM market, Samsung Electronics and SK hynix are developing the sixth-generation HBM, HBM4, which is expected to incorporate “hybrid bonding” technology, allowing for increased capacity while reducing size. Nvidia is working on the “X100” AI accelerator, set to be released in 2025, which will boost memory usage to 400 gigabytes.