By Tae Kim
This article is from the free weekly Barron's Tech email newsletter. Sign up here to get it delivered directly to your inbox.
AI Super Bowl. Hi everyone. This week, 20,000 engineers, scientists, industry executives, and yours truly descended upon San Jose, Calif., for Nvidia's annual GTC developers' conference, which has been dubbed the "Super Bowl of AI."
The event kicked off on Tuesday with Nvidia CEO Jensen Huang's keynote address at the SAP Center -- home to the San Jose Sharks hockey team.
I waded through thousands of people in line outside to get to my seat. It felt like a major sporting event or rock concert, with attendees eager to hear Nvidia's vision for the future of AI and its graphics processing units, or GPUs.
Here are the four key things we learned from Huang's keynote:
Unrivaled Chips. Nvidia isn't letting up. Two years ago, the company accelerated the pace of innovation by going from a two-year AI GPU product release cadence to an annual one. Last month, Huang promised a "big, huge step up" in performance for future chips.
The executive delivered. This week, Huang said Nvidia's flagship Blackwell Ultra AI server, available later this year, would outperform the current top-of-the-line GB200 NVL72 by 50%. He then revealed that the Vera Rubin AI server, scheduled for the second half of 2026, would be 3.3 times faster than Blackwell Ultra. Finally, Huang unveiled the Rubin Ultra AI server -- set for late 2027 -- with 14 times the performance of Blackwell Ultra. That figure drew gasps from the GTC audience.
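For context, chaining those multipliers back to today's GB200 NVL72 gives a rough sense of the claimed trajectory. The sketch below is a back-of-the-envelope calculation that assumes the keynote's multipliers compound against the current baseline; it uses only the figures Huang cited and is not an official Nvidia benchmark.

```python
# Back-of-the-envelope sketch: relative performance vs. today's GB200 NVL72,
# assuming the multipliers Huang cited compound against that baseline.
gb200_nvl72 = 1.0                       # baseline: current flagship server
blackwell_ultra = gb200_nvl72 * 1.5     # "50% faster" than GB200 NVL72 (later this year)
vera_rubin = blackwell_ultra * 3.3      # 3.3x Blackwell Ultra (second half of 2026)
rubin_ultra = blackwell_ultra * 14      # 14x Blackwell Ultra (late 2027)

for name, perf in [("GB200 NVL72", gb200_nvl72),
                   ("Blackwell Ultra", blackwell_ultra),
                   ("Vera Rubin", vera_rubin),
                   ("Rubin Ultra", rubin_ultra)]:
    print(f"{name:16s} ~{perf:.1f}x the GB200 NVL72 baseline")
```

Chained that way, Vera Rubin would land at roughly 5 times, and Rubin Ultra at roughly 21 times, the performance of today's GB200 NVL72.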
Power Hungry. The Rubin Ultra will pack 576 GPUs into one server rack -- four times as many as today's Blackwell servers, which house 144 GPUs across 72 chip packages.
That many chips need a lot more juice. Huang said a Rubin Ultra server rack will draw 600 kilowatts of electricity versus 120 kilowatts for the current model.
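Dividing the rack-level figures Huang cited by the GPU counts gives a rough per-GPU estimate. The sketch below is illustrative arithmetic only, assuming the numbers quoted in the keynote; actual figures depend on how Nvidia counts GPUs and on cooling and networking overhead in the rack.

```python
# Rough per-GPU power draw from the rack-level figures cited in the keynote.
# Illustrative arithmetic only, not an Nvidia specification.
racks = {
    "Blackwell (current)": {"gpus": 144, "rack_kw": 120},
    "Rubin Ultra (2027)":  {"gpus": 576, "rack_kw": 600},
}

for name, spec in racks.items():
    per_gpu_kw = spec["rack_kw"] / spec["gpus"]
    print(f"{name}: {per_gpu_kw:.2f} kW per GPU "
          f"({spec['rack_kw']} kW across {spec['gpus']} GPUs)")
```

By that rough math, power per GPU rises about 25% even as the rack's total draw quintuples.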
The higher power draw will require a significant engineering rework of data centers, cooling systems, and server designs. That's good news for data-center liquid-cooling providers like Vertiv Holdings, which Huang called out by name.
Accelerating AI Demand. Earlier this year, the release of AI models from Chinese start-up DeepSeek sparked a wave of market volatility, raising questions about whether the company's efficient models could soften demand for Nvidia's chips.
During his keynote, Huang was defiant, declaring that DeepSeek-style AI reasoning features, where the model takes more time to reflect before arriving at a higher-quality answer, are driving a substantial increase in demand.
"Almost the entire world got it wrong," Huang said, emphasizing that compute demand for AI is accelerating, not falling. "The amount of computation we need at this point as a result of agentic AI [and] as a result of reasoning is easily 100 times more than we thought we needed this time last year."
Mega GPU Clusters Are Coming. Superclusters of Nvidia GPUs inside data centers have grown from 16,000 GPUs to over 100,000 GPUs during the past year. And following the keynote, Huang told me he's confident that Rubin's rollout will mean clusters of one million GPUs will be built by 2027.
Overall, Huang outlined an aggressive lineup of new products for the next several years. AI developers clamoring for more computing power to build new applications should celebrate. Nvidia continues to innovate and appears on track to keep offering the best performance.
For rivals hoping to catch Nvidia, however, the company has once again raised the bar.
"The road map looks really solid, and their capability gap vs. competitors across their entire massive stack continues to widen," Bernstein analyst Stacy Rasgon said on Wednesday in a note covering GTC. "It is still Nvidia's game to lose, and they don't appear to be losing."
This Week in Barron's Tech
-- The Mag 7 Stocks Have Gotten Crushed. Buy These 4 Now.
-- Wall Street Isn't Crazy About Alphabet's Deal for Wiz
-- Alibaba and Other Chinese Stocks Are Crushing U.S. Shares. Here's Why.
-- Micron Stock's 'Upside Is Coming.' It's a Buy Ahead of Earnings.
-- Apple Stock Is a 'Lower-Risk Way' to Play AI, Says Analyst
Write to Tae Kim at tae.kim@barrons.com or follow him on X at @firstadopter.
This content was created by Barron's, which is operated by Dow Jones & Co. Barron's is published independently from Dow Jones Newswires and The Wall Street Journal.