* Nvidia's (NVDA) Rubin AI chip to include GPU, CPU and networking chips
* Nvidia faces delay with current flagship Blackwell chip due to design flaw
* AI startups say competitive chatbots need fewer Nvidia chips
(Adds details from conference in paragraph 1)
By Stephen Nellis and Max A. Cherney
SAN JOSE, California, March 18 (Reuters) - Nvidia CEO Jensen Huang took the stage at the company's annual software developer conference on Tuesday, calling it "the Super Bowl of AI" and defending the chip designer's lead in artificial intelligence.
Huang is expected to reveal fresh details about the
company's newest artificial intelligence chip at the conference.
Nvidia stock has more than quadrupled in value over the past three years as the company powered the rise of advanced AI systems such as ChatGPT, Claude and many others. The stock dropped 2% on Tuesday.
Much of Nvidia's success stemmed from the decade that the Santa Clara, California-based company spent building software tools to woo AI researchers and developers - but it was Nvidia's data center chips, which sell for tens of thousands of dollars each, that accounted for the bulk of its $130.5 billion in sales last year.
Huang hinted last year that the new flagship offering would be named Rubin and consist of a family of chips - including a graphics processing unit, a central processing unit and networking chips - all designed to work in huge data centers that train AI systems. Analysts expect the chips to go into production this year and roll out in high volumes starting next year.
Nvidia is trying to establish a new pattern of introducing a flagship chip every year, but has so far hit both internal and external obstacles.
The company's current flagship chip, called Blackwell, is coming to market slower than expected after a design flaw caused manufacturing problems. The broader AI industry last year grappled with delays as the prior method of feeding ever-expanding troves of data into ever-larger data centers full of Nvidia chips started to show diminishing returns.
Nvidia shares tumbled this year when Chinese startup DeepSeek alleged it could produce a competitive AI chatbot with far less computing power - and thus fewer Nvidia chips - than earlier generations of models required. Huang has fired back that newer AI models that spend more time thinking through their answers will make Nvidia's chips even more important, because they are the fastest at generating "tokens," the fundamental unit of AI programs.
"When ChatGPT first came out, the token generation rate only
had to be about as fast as you can read," Huang told Reuters
last month. "However, the token generation rate now is how fast
the AI can read itself, because it's thinking to itself. And the
AI can think to itself a lot faster than you and I can read and
because it has to generate so many future possibilities before
it presents the right answer to you."