Nvidia earned its way into a dominant position with class-leading products and designs that are the best for modern AI processing.
The most critical event of the quarter for both the technology market and the broader stock market arrives on Wednesday, when Nvidia reports second-quarter earnings.
Nvidia (NVDA) projected during its previous earnings call that fiscal second-quarter 2025 would see revenue of $28 billion with margins right around 75%. If it hits that revenue target, Nvidia will more than double revenue year over year (revenue in the second quarter of fiscal 2024 was $13.5 billion) and set yet another quarterly revenue record.
But it would also mean that quarter-to-quarter revenue growth is slowing across the past four reports: Nvidia's projection implies a sequential increase of roughly $2 billion, after gains of $6 billion, $5 billion and $4 billion in the quarters before it.
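To make that deceleration concrete, here is a quick back-of-the-envelope sketch in Python using only the figures cited above; the roughly $26 billion prior-quarter base is an implied number (the $28 billion projection minus the approximately $2 billion sequential increase), not a figure reported in this column.

```python
# Back-of-the-envelope check of the revenue figures cited above (in billions of dollars).
projected_q2_fy25 = 28.0                      # Nvidia's guidance for fiscal Q2 2025
q2_fy24 = 13.5                                # year-ago quarter
implied_q1_fy25 = projected_q2_fy25 - 2.0     # implied base, assuming a ~$2B sequential gain

# Year-over-year growth: ~2.07x, i.e. more than double.
print(f"Year-over-year multiple: {projected_q2_fy25 / q2_fy24:.2f}x")

# Sequential gains over recent reports plus the projected quarter:
# the shrinking step size is what could disappoint investors.
sequential_gains = [6.0, 5.0, 4.0, projected_q2_fy25 - implied_q1_fy25]
print("Quarter-over-quarter gains ($B):", sequential_gains)
```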
Add in the rumors and speculation about delays to its most advanced Blackwell AI chip architecture, and you have a recipe for a quarterly result that could disappoint investors and the tech industry. Short-sightedness aside, there are still plenty of reasons to believe that Nvidia's growth trajectory will continue through 2024 and 2025, across its product lineup, the competitive landscape and its partners' build-out investments.
Growing competition
Nvidia continues to grow and rake in money, thanks mainly to the dominance of its AI chips in both training and inference. That growth continues to grab the attention of competitors eager to gobble up as much of the market as possible for themselves.
AMD (AMD) is the most obvious challenger, with a product line that most closely mirrors Nvidia's GPU designs, as the two compete in both the gaming and professional graphics markets. AMD's Instinct-brand MI300X chips have quickly gone from a projected $1.5 billion in revenue to $4.5 billion as of the company's most recent earnings call. That is small compared with Nvidia's $22 billion in AI-attributed sales last quarter, but AMD continues to trend upward, a sign that the market is growing and looking for alternatives.
AMD also recently announced the acquisition of ZT Systems, a company that designs and builds complex cloud-infrastructure systems. The deal lets AMD get to market faster with its new AI chips, offering complete design packages that include compute IP, networking, cooling and power. AMD will sell off ZT Systems' manufacturing division to avoid competing with Dell Technologies (DELL) and its other partners, but the company is clearly still investing to improve its ramp into AI markets.
Broadcom (AVGO) also is making noise in the AI space, primarily with its AI-ready networking products. News came out last week that the company would be designing custom chips in partnership with OpenAI, with target customers that include Meta Platforms (META) and Alphabet's (GOOG) (GOOGL) Google, two of Nvidia's biggest customers. And of course, the custom-silicon projects at Amazon.com (AMZN), Google and Microsoft (MSFT) continue to innovate and execute, looking for ways to counterbalance Nvidia's strength in this space.
Building out the AI infrastructure
Big money invested in new hardware benefits Nvidia.
While competitors are still fighting to get into the AI chip space that Nvidia has been building and profiting from for years now, Nvidia's biggest customers are investing in historic infrastructure buildouts.
A report earlier this month shows data that is mouthwatering if you are an Nvidia competitor, and eye-watering if you are one of its biggest customers. With the major cloud-services providers all on a trajectory to spend tens or hundreds of billions of dollars over the coming decade to build the data centers of the future, a huge amount of that spending is going directly into Nvidia's pocket.
The report indicates that as much as 40% of the money Microsoft and Meta are spending is going to Nvidia, with Tesla (TSLA), Google and Amazon in the 10%-30% range. Despite reservations about whether this kind of powerful AI compute will translate into revenue in the short term, the fact is that these companies (and more) are continuing to invest big dollars in new hardware, and Nvidia will be the primary beneficiary for some time.
Nvidia's leadership is hard to challenge
Nvidia earned its way into a dominant position with class-leading products and designs that are the best for modern AI processing. That dominance rests on both the AI chips themselves and a deep investment in the software infrastructure that enables developers and cloud providers to easily integrate Nvidia hardware into their workflows. Even if other companies catch up on hardware, with comparable performance or efficiency, it is the software moat Nvidia holds around its business that causes the most angst among competitors.
Even recent rumors of delays to Nvidia's latest Blackwell AI chips don't seem to be opening any big windows for AMD, Intel (INTC) or anyone else. All indications are that Nvidia's current-generation Hopper H100 chips will maintain outright performance leadership through the rest of the year. And when the Blackwell chip does make it into customers' hands, it will position Nvidia for leadership through 2025, at least.
Ryan Shrout is the President of Signal65 and founder at Shrout Research. Follow him on X @ryanshrout. Shrout has provided consulting services for AMD, Qualcomm, Intel, Arm Holdings, Micron Technology, Nvidia and others. Shrout holds shares of Intel.