Nvidia introduces next chip to power AI revolution

Nvidia has unveiled the B200 Blackwell, which it bills as the most powerful AI chip the world has ever seen. Investors, however, were not overly impressed.

The new graphics processing unit (GPU) for AI applications contains 208 billion transistors split across two dies. For a sense of scale (though the comparison is loose), the human brain has about 86 billion neurons.

The two chips in a single GPU exchange data over a 10-terabyte-per-second interconnect link.

Complete systems, however, may combine hundreds of such GPUs.

A multi-node and liquid-cooled rack-scale system from Nvidia called GB200 NVL72 combines 36 Grace Blackwell “Superchips.” The platform acts as a single GPU.

Data centers will be able to connect up to 576 of the new GPUs in a single system, with 1.8 TB/s of bidirectional throughput per GPU, according to Nvidia's press release.

Nvidia claims its new platform can run trillion-parameter AI models at up to 25 times lower cost and energy consumption than its predecessor, Hopper, which had 80 billion transistors in a single die.

“The GB200 NVL72 provides up to a 30x performance increase compared to the same number of NVIDIA H100 Tensor Core GPUs for LLM inference workloads and reduces cost and energy consumption by up to 25x,” Nvidia claims.
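The figures above are Nvidia's own claims rather than independent benchmarks, but they lend themselves to a quick back-of-envelope comparison. The sketch below simply restates the article's numbers in code; the variable names are illustrative, not Nvidia's.

```python
# Back-of-envelope comparison of Nvidia's published Blackwell vs. Hopper
# figures, as cited in the article. All numbers are Nvidia's claims,
# not independent measurements.

blackwell_transistors = 208e9  # B200: 208 billion, across two dies
hopper_transistors = 80e9      # H100 (Hopper): 80 billion, single die

transistor_ratio = blackwell_transistors / hopper_transistors
print(f"Transistor count ratio: {transistor_ratio:.1f}x")  # 2.6x

# Nvidia's claimed gains for LLM inference on a GB200 NVL72 system
# versus the same number of H100 GPUs:
perf_gain = 30      # "up to" 30x inference performance
energy_factor = 25  # "up to" 25x lower energy consumption

# Energy needed per unit of inference work, relative to Hopper:
relative_energy_per_task = 1 / energy_factor
print(f"Energy per task vs. H100: {relative_energy_per_task:.0%}")  # 4%
```

Note that the transistor count grows only about 2.6x while the claimed inference gains are an order of magnitude larger; the difference comes from architectural changes and system-level integration, not raw transistor budget alone.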

Big Tech will get the new tools first.

“AWS, Google Cloud, Microsoft Azure, and Oracle Cloud Infrastructure will be among the first cloud service providers to offer Blackwell-powered instances, as will Nvidia Cloud Partner program companies Applied Digital, CoreWeave, Crusoe, IBM Cloud, and Lambda,” Nvidia said.

Nvidia is shifting from selling single chips to selling total systems. A rack with 72 AI chips and 36 central processors contains 600,000 parts in total and weighs 3,000 pounds (1,361 kg).

Nvidia aims to extend dominance

Nvidia Chief Executive Jensen Huang on Monday kicked off his company's annual developer conference with a slew of announcements designed to keep the chip maker in a dominant position in the artificial-intelligence industry. He detailed a new set of software tools to help developers sell AI models more easily to companies that use technology from Nvidia, whose customers include most of the world's biggest technology firms.

Nvidia's chip and software announcements at GTC 2024 will help determine whether the company can maintain its 80% share of the market for AI chips.

“I hope you realize this is not a concert,” Huang said, wearing his signature leather jacket and joking that the day's keynote would be full of dense math and science.

Huang gave no price details on the B200 “Blackwell” systems, which will start shipping later this year.

Altogether, Huang's announcements failed to provide new fuel for a rally in which Nvidia's shares have surged 240% over the past 12 months, making it the U.S. stock market's third-most valuable company, behind only Microsoft and Apple. Nvidia stock dipped 1.4% in extended trade, while Super Micro Computer, which makes AI-optimized servers with Nvidia's chips, fell 4%. Advanced Micro Devices stock dipped nearly 3% during the keynote.

Tom Plumb, CEO and portfolio manager at Plumb Funds, whose largest holdings include Nvidia, said the Blackwell chip was not a surprise.

“But it reinforces that this company is still at the cutting edge and the leader in all graphics processing. That doesn't mean the market is not going to be big enough for AMD and others to come in. But it shows that their lead is pretty insurmountable,” said Plumb.

Many analysts expect Nvidia's market share to drop several percentage points in 2024 as new products from competitors come to market and Nvidia's largest customers make their own chips.

“Rivals like AMD, Intel, startups, and even Big Tech's own chip aspirations threaten to chip away at Nvidia's market share, particularly among cost-conscious enterprise customers,” said Insider Intelligence analyst Jason Bourne.

Though Nvidia is widely known for its hardware, the company has also built a substantial portfolio of software products.

The new software tools, called microservices, improve system efficiency across a wide variety of uses, making it easier for a business to incorporate an AI model into its work, just as a good computer operating system can help apps work well.

In addition to AI software, Nvidia dove deeper into software for simulating the physical world with 3-D models. Huang also announced partnerships with design software companies Ansys, Cadence, and Synopsys for work on designing cars, jets, and other products. Shares of the three companies jumped around 3% in extended trade following Huang's comments.

Nvidia also introduced a new line of chips designed for cars with new capabilities to run chatbots inside the vehicle. The company deepened its already-extensive relationships with Chinese automakers, saying that electric vehicle makers BYD and Xpeng will both use its new chips.

Toward the end of his keynote speech, Huang also outlined a new series of chips for creating humanoid robots, inviting several of the robots made using the chips to join him on the stage.