Nvidia unveils flagship AI chip, the B200, aiming to extend dominance

SAN JOSE, California (Reuters) – Nvidia NVDA.O Chief Executive Jensen Huang yesterday kicked off his company’s annual developer conference with a slew of announcements designed to keep the chipmaker in a dominant position in the artificial-intelligence industry.

On a hockey arena stage in the heart of Silicon Valley, Huang introduced Nvidia’s latest chip, which is 30 times speedier at some tasks than its predecessor.

He also detailed a new set of software tools to help developers sell AI models more easily to companies that use technology from Nvidia, whose customers include most of the world’s biggest technology firms.

Nvidia’s chip and software announcements at GTC 2024 will help determine whether the company can maintain its 80% share of the market for AI chips.

“I hope you realize this is not a concert,” Huang said, wearing his signature leather jacket and joking that the day’s keynote would be full of dense math and science.

It was a nod to how Nvidia, once mostly known among computer gaming enthusiasts, has earned recognition on par with tech giants like Microsoft MSFT.O and has become a Wall Street standout, with sales that more than doubled in its most recent fiscal year to surpass $60 billion.

Nvidia’s new flagship chip, called the B200, takes two squares of silicon the size of the company’s previous offering and binds them together into a single component.

While the B200 “Blackwell” chip is 30 times speedier at tasks like serving up answers from chatbots, Huang did not give specific details about how well it performs when chewing through huge amounts of data to train those chatbots – which is the kind of work that has powered most of Nvidia’s soaring sales. He also gave no price details.

Altogether, Huang’s announcements failed to provide new fuel for a rally in which Nvidia’s shares have surged 240% over the past 12 months, making it the U.S. stock market’s third-most valuable company, behind only Microsoft and Apple AAPL.O.

Nvidia stock dipped 1.4% in extended trade, while Super Micro Computer SMCI.O, which makes AI-optimized servers with Nvidia’s chips, fell 4%. Advanced Micro Devices AMD.O stock dipped nearly 3% during the keynote.

Tom Plumb, CEO and portfolio manager at Plumb Funds, whose largest holdings include Nvidia, said the Blackwell chip was not a surprise.

“But it reinforces that this company is still at the cutting edge and the leader in all graphics processing. That doesn’t mean the market is not going to be big enough for AMD and others to come in. But it shows that their lead is pretty insurmountable,” said Plumb.

Nvidia said major customers, including Amazon.com AMZN.O, Alphabet’s GOOGL.O Google, Microsoft, OpenAI and Oracle ORCL.N, are expected to use the new chip in cloud-computing services they sell, and also for their own AI offerings.

Nvidia also is shifting from selling single chips to selling total systems. Its latest iteration, a full server rack, houses 72 of its AI chips and 36 central processors. It contains 600,000 parts in total and weighs 3,000 pounds (1,361 kg).

Many analysts expect Nvidia’s market share to drop several percentage points in 2024 as new products from competitors come to market and Nvidia’s largest customers make their own chips.

“Rivals like AMD, Intel INTC.O, startups, and even Big Tech’s own chip aspirations threaten to chip away at Nvidia’s market share, particularly among cost-conscious enterprise customers,” said Insider Intelligence analyst Jacob Bourne.

Though Nvidia is widely known for its hardware offerings, the company has built a significant suite of software products as well.

The new software tools, called microservices, improve system efficiency across a wide variety of uses, making it easier for a business to incorporate an AI model into its work, just as a good computer operating system can help apps work well.
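
In practice, such a microservice is consumed over an ordinary HTTP call rather than through low-level GPU programming. The short Python sketch below is purely illustrative, assuming an OpenAI-style chat endpoint running locally; the URL, model name and payload fields are assumptions for the sake of example, not details from Nvidia’s announcement.

    import requests  # third-party HTTP library

    # Illustrative only: the endpoint URL, model name and payload fields are
    # placeholder assumptions, not Nvidia-documented values.
    ENDPOINT = "http://localhost:8000/v1/chat/completions"

    payload = {
        "model": "example-llm",  # placeholder model identifier
        "messages": [
            {"role": "user", "content": "Summarize today's support tickets."}
        ],
        "max_tokens": 256,
    }

    # Send the request to the locally hosted inference microservice and
    # print the model's reply.
    resp = requests.post(ENDPOINT, json=payload, timeout=60)
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])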