By Fanny Potkin and Heekyong Yang
Aug. 6 – A version of Samsung Electronics' (005930.KS) fifth-generation high bandwidth memory (HBM) chips has passed Nvidia's (NVDA.O) tests for use in its artificial intelligence (AI) processors, three sources familiar with the results said.
The qualification removes a major hurdle for the world's largest memory chipmaker, which has been trying to catch up with local rival SK Hynix (000660.KS) in the race to supply advanced memory chips capable of handling generative AI workloads.
Samsung and Nvidia have yet to sign a supply agreement for the approved eight-layer HBM3E chips, but they will do so soon, the sources said, adding that they expect shipments to begin by the fourth quarter of 2024.
The 12-layer version of the South Korean giant's HBM3E chips, however, has not yet passed Nvidia's tests, said the sources, who declined to be identified because the matter remains confidential.
Samsung and Nvidia both declined to comment.
HBM is a standard type of dynamic random-access memory (DRAM) first produced in 2013, in which chips are stacked vertically to save space and reduce power consumption. A key component of graphics processing units (GPUs) for AI, it helps process massive amounts of data produced by complex applications.
Samsung had been trying since last year to pass Nvidia's tests for HBM3E and the preceding fourth-generation HBM3 chips, but struggled due to heat and power consumption issues, Reuters reported in May, citing sources.
The company has since reworked its HBM3E design to address these issues, according to sources briefed on the matter.
The latest testing approval follows Nvidia’s recent certification of Samsung’s HBM3 chips for use in less sophisticated processors developed for the Chinese market, which Reuters reported last month.
Nvidia's approval of Samsung's latest HBM chips comes as the generative AI boom fuels soaring demand for sophisticated GPUs, which Nvidia and other AI chipmakers are racing to meet.
According to research firm TrendForce, HBM3E chips are expected to become the mainstream HBM product on the market this year, with shipments concentrated in the second half. Leading manufacturer SK Hynix estimates that overall demand for HBM chips could grow at an annual rate of 82% through 2027.
Samsung forecast in July that HBM3E chips would account for 60% of its HBM chip sales in the fourth quarter, a target that many analysts said could be met if its latest HBM chips receive final approval from Nvidia in the third quarter.
Samsung does not provide a breakdown of revenue for specific chip products. According to a Reuters survey of 15 analysts, Samsung's DRAM chip revenue was estimated at 22.5 trillion won ($16.4 billion) for the first six months of this year, and some analysts believe about 10% of that could have come from HBM sales.
There are only three major HBM manufacturers: SK Hynix, Micron (MU.O) and Samsung.
SK Hynix has been Nvidia's main supplier of HBM chips and began supplying HBM3E chips in late March to a customer it declined to identify. Those shipments went to Nvidia, sources have previously said.
Micron also said it would supply HBM3E chips to Nvidia.
($1 = 1,375.6400 won)