According to Korean media reports, Samsung Electronics has recently completed quality testing of its 12-layer HBM3E products with Broadcom and is now negotiating a mass-production supply agreement. The discussions reportedly involve an estimated supply volume of approximately 1 billion gigabits (Gb), with mass production expected to begin as early as the second half of this year and extend into next year.
While this volume represents a modest portion of the annual HBM market, it holds strategic significance for Samsung, which is actively seeking to secure stable HBM demand. The company previously set a goal to double its total HBM supply this year to 8-9 billion Gb, up from last year's levels.
The HBM chips will be integrated into next-generation AI processors developed by global tech giants. Broadcom, drawing on its in-house custom ASIC design business, currently co-designs Google's seventh-generation TPU ("Ironwood") and Meta's custom AI chip ("MTIA v3").
Additionally, Samsung is working to supply 12-layer HBM3E to Amazon Web Services (AWS), which recently conducted an on-site audit of Samsung's Pyeongtaek campus. AWS plans to begin mass production of "Trainium 3," its next-generation AI accelerator featuring this memory, in 2025.