NEW YORK, March 19 (Reuters) - Facebook owner Meta
Platforms (META) is not expecting to receive shipments of
Nvidia's (NVDA) new flagship artificial intelligence chip this
year, a Meta spokesperson told Reuters.
Nvidia, the dominant designer of GPU (graphics processing
unit) chips needed to power most cutting-edge artificial
intelligence work, announced the new B200 "Blackwell" chip at
its annual developer conference on Monday.
The chip maker said the B200 is 30 times speedier at tasks
like serving up answers from chatbots, although it did not give
specific details about how well it performs when chewing through
huge amounts of data to train those chatbots, which is the kind of
work that has powered most of Nvidia's soaring sales.
Nvidia's Chief Financial Officer Colette Kress told
financial analysts on Tuesday that "we think we're going to come
to market later this year," but also said that shipment volume
for the new GPUs would not ramp up until 2025.
Social media giant Meta is one of Nvidia's biggest
customers, after buying hundreds of thousands of its previous
generation of chips to support pushes into amped-up content
recommendation systems and generative AI products.
Meta CEO Mark Zuckerberg disclosed in January that the
company planned to have about 350,000 of those earlier chips,
called H100s, in its stockpile by the end of the year. In
combination with other GPUs, he added, Meta would have the
equivalent of about 600,000 H100s by then.
In a statement on Monday, Zuckerberg said Meta planned to
use Blackwell to train the company's Llama models. The company
is currently training a third generation of the model on two GPU
clusters it announced last week, which it said each contain
around 24,000 H100 GPUs.
Meta planned to continue using those clusters to train Llama
3 and would use Blackwell for future generations of the model,
the Meta spokesperson said.