SEOUL, Oct 31 (Reuters) - Samsung Electronics ( SSNLF )
said on Friday it is in "close discussion" to supply
its next-generation high-bandwidth memory (HBM) chips, or HBM4,
to Nvidia ( NVDA ), as the South Korean chipmaker scrambles to
catch up with rivals in the AI chip race.
Samsung, which plans to market the new chip next year, did
not specify when it aims to ship the latest version of its HBM
chip, a key building block of artificial intelligence chipsets.
Local rival SK Hynix, Nvidia's top HBM chip
supplier, said on Wednesday it aims to start shipping its latest
HBM4 chips in the fourth quarter and expand sales next year.
Nvidia, in a statement announcing cooperation with Samsung
and other Korean companies, said it is in "key supply
collaboration for HBM3E and HBM4", without elaborating.
Samsung has been slower to capitalise on the AI-driven
memory chip boom, leading to weaker earnings and a reshuffle of
its chip division last year. Its earnings recovered this
quarter, driven by demand for conventional memory chips.
This week Samsung said it was selling its current-generation
HBM3E chips to "all related customers", indicating it has joined
rivals in supplying the latest 12-layer HBM3E chips to Nvidia.
The launch of HBM4 chips will be a major test of Samsung's
ability to regain its edge in the market, analysts said.
HBM - a type of dynamic random access memory (DRAM) standard
first produced in 2013 - involves stacking chips vertically to
save space and reduce power consumption, helping to process the
large volume of data generated by complex AI applications.