The AI chip titan is projected to earn a record $190 billion in 2026, even though its stock trails its semiconductor rivals. By Al Root
Yes, a $5 trillion company trading at an all-time high can be cheap. It's an opportunity for investors to earn 50% while the company's market capitalization marches toward $8 trillion.
This, of course, is about Nvidia, the artificial-intelligence chip giant that has transformed the stock market and may be on its way to transforming the entire U.S. economy.
As CEO Jensen Huang accompanies President Donald Trump on his state visit to China and Nvidia prepares to report earnings after the close on May 20, the company is on the cusp of earning more than $190 billion in calendar year 2026. That would be the best year for any corporation ever, according to Dow Jones Market Data. Saudi Aramco, which earned about $160 billion in 2022, holds the current record.
Against that setup, how could Nvidia be cheap? For that matter, how could things get even better for the maker of graphics processing units?
For starters, the stock trades for about 24 times earnings expected over the coming 12 months, a 25% discount to its historical average and 5% below the PHLX Semiconductor Index, which includes Advanced Micro Devices, Micron Technology, Taiwan Semiconductor Manufacturing, Intel, and others. Nvidia, as a fast-growing leader, typically trades at a 40% premium to the index.
Since the start of 2025, Nvidia has trailed the index by more than 70 percentage points. Seventy. The market has suddenly forgotten about Nvidia while going gaga over central processing units, or CPUs; memory; and application-specific chips such as Alphabet's TPUs.
But if demand for semiconductors has exploded -- as the earnings and stock performance of countless semiconductor companies imply -- is Nvidia stock really going to be left in the dust? No, and the thesis for buying Nvidia here is quite simple. Investors really just need to believe one thing: This isn't yet as good as it gets for AI computing.
There are several reasons to believe just that.
For starters, look at the earnings reports of countless companies in the semiconductor value chain. "We're still in the early days of the AI infrastructure growth," Jon Kemp, CEO of semiconductor materials supplier Qnity Electronics, tells Barron's, adding that the growth is moving from the data center to edge computing and physical AI applications. "Wherever people are located...the proliferation of [AI] beyond data centers into every other part of the industrial economy is one of the things that gives us confidence in the durability of the growth cycle."
Strange as it may sound, AI applications still haven't achieved widespread use. Just 20% of smartphone users have used an AI agent application on their phones, according to ARK Investment Management chief futurist Brett Winton, who expects that figure to approach 100% within three years. For investors drawing dot-com era comparisons, that makes 2026 look more like 1996 than 1999.
AI is still an accelerating arms race. The four hyperscalers -- Amazon.com, Alphabet, Meta Platforms, and Microsoft -- are now expected to spend almost $700 billion building out their AI businesses in 2026. At the start of the year, that number was closer to $500 billion. Looking into 2027, Melius analyst Ben Reitzes expects capital expenditures, including those from the likes of Oracle, to top $1 trillion.
The spending spree is on. How the tech giants can afford it is a valid question. The answer: easily. They can funnel substantial portions of their operating cash flow into data centers while still generating positive free cash flow this year. The "big four" are expected to generate earnings before interest, taxes, depreciation, and amortization, or Ebitda, of roughly $800 billion. Their balance sheets are also pristine, with essentially no net debt among them.
If they really wanted to get aggressive, they could add $1 trillion to $2 trillion in debt. (To be sure, there has been complex off-balance-sheet financing of some data centers, such as Meta's in Louisiana, but more can be done.)
As for where Nvidia shares can go, investors can just look at Wall Street's estimates. The average analyst price target is $270, up 20% from recent levels. That works out to 24 times estimated fiscal-year 2028 earnings, which are expected to expand 35% compared with fiscal-year 2027. (Nvidia's year ends in January, so fiscal 2028 is essentially calendar year 2027.)
That $270 target looks conservative. If Nvidia were simply to trade in line with the 26 average multiple of other semiconductor stocks, it would be a $290 stock. Its historic premium to the sector would justify a $390 price, a target that isn't conservative. Even so, Nvidia shares can easily hit $300 over the coming 12 months, up 33% from recent levels.
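The arithmetic behind those targets is simple enough to sketch. The figures below come from the article itself; deriving the recent share price by discounting the $270 target by its stated 20% upside is an assumption for illustration, not a quoted number.

```python
# Back-of-envelope sketch of the multiple-based price targets discussed above.
avg_target = 270.0            # average analyst price target (from the article)
upside_to_target = 0.20       # target is ~20% above recent levels
current_price = avg_target / (1 + upside_to_target)  # implied recent price, ~$225

target_multiple = 24          # P/E on estimated fiscal-2028 earnings
implied_eps = avg_target / target_multiple           # implied fiscal-2028 EPS, ~$11.25

sector_multiple = 26          # average multiple of other semiconductor stocks
in_line_price = implied_eps * sector_multiple        # ~$290, as the article notes

print(f"implied recent price: ${current_price:.0f}")
print(f"implied fiscal-2028 EPS: ${implied_eps:.2f}")
print(f"price at sector multiple: ${in_line_price:.0f}")
```

The same mechanics, with the article's historic 40% premium layered on, get into the neighborhood of the $390 figure, though the exact base the premium is applied to isn't specified.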
To be sure, there are risks. For one, bubble fears persist: Google searches for "AI bubble" are up about 400% year over year, having peaked in November. Nvidia's earnings outlook doesn't leave much room for error, either. The company will have to beat analyst sales growth estimates of almost 80% to satisfy investors. An outright miss seems unlikely, though; Nvidia has beaten sales estimates by an average of about 3% over the past three quarters.
Still, investors' initial reaction to those earnings reports was to sell the stock (though it eventually recovered). There also remains a persistent fear that Nvidia's GPUs could lose share to a host of other chips, including application-specific chips from the hyperscalers and CPUs from Intel that are well-suited for agentic computing.
For now, these prospects are distant. Chip demand is only strengthening. AI infrastructure provider Vertiv Holdings reported a 252% increase in fourth-quarter 2025 orders. The company stopped providing detailed order numbers because backlogs were extending further into the future (and perhaps because investors tend to over-focus on one metric). Automation technology provider Zebra Technologies expects memory chips to be in short supply through 2027. And power-generation equipment maker GE Vernova is booking business into the next decade.
What's more, Arm Holdings and Intel recently said they were unable to meet all demand due to supply-chain constraints. Shortage fears have Tesla and SpaceX CEO Elon Musk building a semiconductor fab to ensure enough chips for all the AI applications he is dreaming up.
The total addressable market for AI computing "is bigger than anyone thinks," says Reitzes. AI agents, which are becoming ubiquitous, will add to demand. These agents, built on Gemini or another AI model, work by constantly running inference -- that is, calling on AI computing to generate their responses.
Nvidia chips remain the heart of AI systems. GPUs are particularly well suited for AI computing because they can handle myriad tasks simultaneously, rather than completing them sequentially like a traditional CPU. The accelerator chips from the likes of SpaceX or Alphabet are designed to perform specific tasks with high efficiency, complementing rather than replacing the general-purpose AI computing that Nvidia chips handle.
The entire AI computing pyramid will grow, and the base of that pyramid will remain Nvidia.