Samsung Electronics Co.’s 12-layer HBM3E (top) and other DDR modules displayed in Seoul, South Korea, on Thursday, April 4, 2024. Photographer: SeongJoon Cho/Bloomberg via Getty Images
Analysts say the supply of high-performance memory chips is likely to remain tight this year, as explosive growth in demand for artificial intelligence outpaces production.
SK Hynix and Micron, two of the world’s largest memory chip suppliers, have sold out of their high-bandwidth memory (HBM) chips for 2024 and are almost sold out of their 2025 supply, according to the companies.
“We expect memory supply to remain tight throughout 2024,” analysts said.
Demand for AI chipsets has boosted the high-end memory chip market, benefiting Samsung Electronics and SK Hynix, the world’s two largest memory chip makers. While SK Hynix already supplies HBM chips to Nvidia, the chipmaker is reportedly also considering Samsung as a potential supplier.
High-performance memory chips play a crucial role in training large language models (LLMs) such as OpenAI’s ChatGPT, whose launch fueled a surge in AI adoption. LLMs need these chips to remember details of past conversations and user preferences in order to generate human-like responses to queries.
“These chips are more complex to manufacture, and it is difficult to ramp up production. This may cause shortages through the rest of 2024 and much of 2025,” said William Bailey, director of Nasdaq IR Intelligence.
HBM’s production cycle is one and a half to two months longer than that of DDR5, the memory chips commonly found in PCs and servers, market intelligence firm TrendForce said in March.
To meet growing demand, SK Hynix plans to expand production capacity by investing in an advanced packaging facility in Indiana, U.S., as well as in its M15X fab in Cheongju and in Yongin, where South Korea’s semiconductor industry is clustered.
Samsung said during its first-quarter earnings call in April that its 2024 HBM bit supply “more than tripled compared to last year.” Bits refer to the amount of data a memory chip can store.
“We have already completed discussions with customers on that committed supply. In 2025, we will continue to expand supply by at least two times or more year-on-year, and we are already in talks with customers on that supply,” Samsung said.
Micron Technology did not respond to CNBC’s request for comment.
Intense competition
Big tech companies Microsoft, Amazon and Google are spending billions of dollars training their own LLMs to stay competitive, fueling demand for AI chips.
“The big buyers of AI chips, firms such as Meta and Microsoft, have said they plan to continue investing in building AI infrastructure. That means they will buy large volumes of AI chips, including HBM, at least through 2024,” said Chris Miller, author of “Chip War,” a book about the semiconductor industry.
Chipmakers are competing fiercely to produce the most advanced memory chips on the market to capitalize on the artificial intelligence boom.
SK Hynix said at a press conference earlier this month that it will begin mass production of its latest-generation HBM chip, the 12-layer HBM3E, in the third quarter, while Samsung Electronics, which has shipped samples of the chip, plans to start mass production in the second quarter.
“Samsung is currently in the lead in the 12-layer HBM3E sampling process. If it can qualify earlier than its peers, I think it can secure a majority share of supply in late 2024 and 2025,” said SK Kim, executive director and analyst at Daiwa Securities.