Nvidia’s AI chips are already the hottest commodity in tech. Now the company is readying its next edition—and it’s likely to make life harder for rivals that are still trying to catch up. No wonder the stock is soaring.
Nvidia (ticker: NVDA) announced its new H200 Tensor Core GPU. The chip incorporates 141 gigabytes of memory and offers 60% to 90% performance improvements over the current H100 model when used for inference, or generating answers from popular AI models.
The H200-powered systems will be available in the second quarter of 2024 from Nvidia’s hardware partners and major cloud service providers, including Amazon.com’s (AMZN) Amazon Web Services, Alphabet’s (GOOGL) Google Cloud, and Microsoft’s (MSFT) Azure, among others.
“With Nvidia H200, the industry’s leading end-to-end AI supercomputing platform just got faster to solve some of the world’s most important challenges,” said Ian Buck, Nvidia’s vice president of hyperscale and high-performance computing.
Nvidia’s GPUs are well suited to the parallel computations needed to train AI models and serve customers. Its current high-end H100 became available in volume earlier this year and is priced at roughly $25,000 per GPU. The product quickly became the technology industry’s most precious resource as rising excitement over generative artificial intelligence created shortages.
In recent months, some experts have said Advanced Micro Devices’ (AMD) upcoming MI300 AI product also looks promising for inference applications. It isn’t clear how those AMD chips will stack up against Nvidia’s revised H200.
Last month, Nvidia updated its investor presentation, showing that the chip maker is moving from its previous two-year product cycle to a one-year cadence for AI chips. A slide in that document shows Nvidia planning to release additional high-end AI products in 2024 and 2025.
The H200 announcement shows that Nvidia’s new strategy of more frequent, higher-performance product launches is underway. That’s bad news for its rivals—but good news for the company’s shares.
Nvidia stock was up 1.4%, at $490.15, as of 11:46 a.m. Monday.
Write to Tae Kim at [email protected]