December 24, 2024

Nvidia CEO talks next-generation semiconductors and computing

NVIDIA Chief Executive Jensen Huang told CNBC’s Jim Cramer on “Squawk on the Street” on Tuesday that the company’s next-generation artificial intelligence graphics processor, called Blackwell, will cost between $30,000 and $40,000 per unit.

“We had to invent some new technology to make this possible,” Huang said, holding up a Blackwell chip. He estimates Nvidia spent about $10 billion on research and development.

The price suggests the chip will be aimed at training and deploying artificial intelligence software like ChatGPT, and that it will sit in a similar range to its predecessor, the H100 (from the generation code-named Hopper), which analysts estimate cost between $25,000 and $40,000 per chip. The Hopper generation, which launched in 2022, represented a significant price increase for Nvidia’s artificial intelligence chips over the previous generation.

Later, Huang told CNBC’s Kristina Partsinevelos that the cost covers more than the chip itself; it also includes designing data centers and integrating the hardware into other companies’ data centers.

Nvidia releases a new generation of artificial intelligence chips roughly every two years. The latest, such as Blackwell, are generally faster and more power-efficient, and Nvidia uses the attention around a new generation to win orders for its new GPUs. Blackwell combines two dies into a single chip and is physically larger than its predecessor.

Nvidia’s AI chips have driven a tripling of the company’s quarterly sales since OpenAI’s ChatGPT set off the artificial intelligence boom in late 2022. Over the past year, most of the top AI companies and developers have used Nvidia’s H100 to train their AI models. Meta, for example, said this year that it is buying hundreds of thousands of Nvidia H100 GPUs.

Nvidia does not disclose list prices for its chips. They come in several configurations, and the price an end customer such as Meta or Microsoft pays depends on a variety of factors, including the number of chips purchased and whether the customer buys complete systems directly from Nvidia or through a vendor such as Dell, HPE or Supermicro, which build artificial intelligence servers. Some servers are equipped with as many as eight AI GPUs.

On Monday, Nvidia announced three versions of its Blackwell AI accelerator: the B100, the B200 and the GB200, which pairs two Blackwell GPUs with an Arm-based CPU. They have slightly different memory configurations and are expected to ship later this year.
