BEIJING, Nov 19 (Reuters) - Nvidia's (NVDA) move to
use smartphone-style memory chips in its artificial intelligence
servers could cause server-memory prices to double by late 2026,
according to a report published on Wednesday by Counterpoint
Research.
In the past two months, electronics supply chains around the
world have been hit by a shortage of legacy memory chips as
manufacturers turned their focus to high-end memory chips suited
to semiconductors designed for AI applications.
But Counterpoint, a technology-focused market research firm,
said there was a new problem on the horizon. Nvidia recently
decided to reduce AI server power costs by switching from DDR5
chips, which are typically used in servers, to LPDDR, a type of
low-power memory chip normally found in phones and tablets.
Because each AI server needs more memory chips than a
handset, the change is expected to create sudden demand that the
industry is not equipped to handle, according to Counterpoint.
Nvidia is scheduled to release its earnings report later on
Wednesday.
Memory suppliers like Samsung Electronics, SK
Hynix and Micron are already facing shortages
of older dynamic random-access memory products after reducing
production to focus on high-bandwidth memory, which is necessary
to make the advanced accelerators that power the global AI boom.
Counterpoint said tightness at the low end of the market was
at risk of spreading upward as chipmakers weigh whether to
divert more factory capacity to LPDDR to meet Nvidia's needs.
"The bigger risk on the horizon is with advanced memory, as
Nvidia's recent pivot to LPDDR means they're a customer on the
scale of a major smartphone maker - a seismic shift for the
supply chain which can't easily absorb this scale of demand,"
Counterpoint said.
The firm said it expected prices for server-memory chips to
double by the end of 2026. It also forecast that overall memory
chip prices were likely to rise 50% from current levels through
the second quarter of 2026.
Higher server-memory prices would raise costs for cloud
providers and AI developers, potentially adding pressure to
data-centre budgets that are already stretched by record
spending on graphics processing units and power upgrades.