Nvidia CEO Jensen Huang was resolute. At his company's annual GTC developers conference this past week in San Jose, Calif., he laid out a compelling vision for the artificial-intelligence industry while presenting an aggressive road map of coming products from his company. The announcements could leave chip rivals racing to catch up for years to come.
As I walked around the GTC exhibit floor, there was a palpable sense of excitement, with hundreds of people lining up for sessions and panels to hear about the latest AI advances in everything from robotics and healthcare to cutting-edge water-cooled server designs.
Yet there is a disconnect. Despite the enthusiasm for the future of AI and how Nvidia semiconductors are central to it all, Nvidia shares have treaded water, trading at just 26 times forward earnings. That's an undemanding valuation for a company projected to boost revenue by 57% this year.
The stock's malaise is driven by three concerns: that AI chip demand could soften after the release of Chinese start-up DeepSeek's efficient models; that chip competition from Broadcom is intensifying; and that President Donald Trump's threats to put tariffs on chip imports create uncertainty.
At GTC, Huang confidently addressed all three issues, arguing that none of them would impede Nvidia's bright prospects.
With respect to DeepSeek, Huang was particularly defiant, pushing back on the notion that DeepSeek would hurt demand for graphics processing units, or GPUs. During his GTC keynote address on Tuesday, he said the reasoning capability in DeepSeek's AI model, which takes more time to reflect before arriving at a higher-quality answer, is driving a substantial increase in demand for compute resources. That type of reasoning is increasingly used in most of the top AI models.
"Almost the entire world got it wrong," Huang said. "The amount of computation we need at this point as a result of agentic AI, as a result of reasoning, is easily 100 times more than we thought we needed this time last year."
It's a stunning point: One hundred times more compute needed than Nvidia expected just 12 months ago should put to rest questions about near-term demand.
The noise around AI chip competition has grown louder. Broadcom CEO Hock Tan frequently tells Wall Street that his company will gain its "fair share" of the AI chip market by 2027 by helping large technology companies design their own AI semiconductors, called application-specific integrated circuits, or ASICs.
At GTC, Huang pushed back. "A lot of ASICs get canceled," he replied when I asked him about Broadcom following his Tuesday GTC keynote. "The ASIC still has to be better than the best. How do they know it's going to be the best, so that it will be deployed in volume?"
The clear subtext? Broadcom's offerings won't be competitive with Nvidia.
Broadcom didn't respond to a request for comment about Huang's remarks.
On the question of tariffs, Huang said at a press event Wednesday that he isn't expecting a significant impact on the company's financials or outlook. He said Nvidia has an agile network of suppliers and can move orders to lower-tariff countries as needed, adding that Nvidia plans to bring more manufacturing to the U.S. over time.
In general, Nvidia made the case that the overall market opportunities for AI and AI data center infrastructure are expanding rapidly. Huang expects the industry will spend roughly $500 billion on data center capital expenditures this year, rising to more than $1 trillion by 2028, with Nvidia's GPU chip business gaining a larger share of the spending in the coming years.
Part of that will come from the growing number of Nvidia GPUs inside data centers. These so-called superclusters have grown from 16,000 GPUs to over 100,000 GPUs during the past year. Huang told me he's confident that clusters with millions of GPUs would be built by 2027.
Then there's robotics. Nvidia executive Rev Lebaredian told me we're just at the beginning of an exponential ramp-up in the development of AI robotics. The combination of rising computing power and smarter AI models is making rapid advances in robotics possible. He believes there will be millions of humanoid robots in use, especially at industrial companies, within five years. I have no particular insight into whether robots are, in fact, imminent. But if it happens, it's one more degree of upside for Nvidia, which makes the hardware brains for robots.
Ultimately, the biggest development from GTC was Nvidia's aggressive product road map. During his keynote, Huang announced that the company's Blackwell Ultra AI server, available later this year, would outperform the current model by 50%. Then he said that the Vera Rubin AI server, scheduled for the second half of 2026, would be 3.3 times faster than Blackwell Ultra. The showstopper was the unveiling of the Rubin Ultra AI server -- set for late 2027 -- with 14 times the performance of Blackwell Ultra. That figure drew gasps from the audience.
Somehow, Nvidia stock barely moved on the news and closed lower on Tuesday amid a general market decline. As a longtime Nvidia watcher, I'm confounded by the lack of enthusiasm from Wall Street. The tech crowd understood the significance; eventually investors will, too.