
Nvidia’s Groq Deal Is an Instagram Moment—With 1 Big Risk

Dow Jones · 12-27 10:00

Nvidia’s purchase of a nonexclusive license from Groq, the artificial-intelligence chip start-up, has raised a lot of eyebrows. For now, many questions remain unanswered, including the exact contours of the deal. But the big one is this: Nvidia’s chips dominate AI computing, so why would it want anyone else’s?

Nvidia’s move is similar to Facebook—now Meta Platforms—buying Instagram in 2012. It was a purely defensive move, as CEO Mark Zuckerberg thought the upstart social network represented a threat to Facebook’s singular place in social media. Zuckerberg was determined that Facebook not become another Myspace flash-in-the-pan. Defensive or not, the move worked out well for Meta, with Instagram having a younger demographic than Facebook and contributing to the 3.5 billion people worldwide who use at least one Meta app every day.

Facebook’s success with Instagram should offer Nvidia shareholders some comfort about the Groq deal. Still, the move is an admission from Nvidia that it sees competitive threats on the horizon and wants to get ahead of them. It may, in the end, keep Nvidia’s sales growth going, but it is likely to come at the cost of reduced profitability.

Nvidia’s chips, known as GPUs, are the main workhorses of the AI computing revolution. The company’s quarterly sales rose from $5.9 billion in the quarter before ChatGPT’s release in November 2022 to almost 10 times that in the most recent quarter. Nvidia’s pre-eminent position can be seen in its stellar gross margin of 73% in the latest quarter.

GPUs are all-purpose AI chips: They can be used both for training AI models and for running those models for chatbots, image generation, and the like. Nvidia’s chips are considered the best, giving the company significant pricing power. But the competition is beginning to catch up.

Alphabet’s Google was the first to make competing AI hardware, the TPU, in 2015. The other major clouds, Microsoft’s Azure and Amazon Web Services, have their own custom AI chips, and there are also many start-ups chasing the brass ring. Groq was founded by a former Google TPU engineer who will now be working for Nvidia as part of the deal.

These companies have largely given up on competing with Nvidia in the training realm. But more and more, AI computing demand comes from running these models, something known as inference. That’s where the competing chips may gain some traction.

The deal “implies NVDA recognition that while GPU dominated AI training, the rapid shift towards inference could require more specialized chips,” Bank of America analyst Vivek Arya wrote in a note on Friday.

Groq is firmly in that group of inference-facing hardware. It calls its chips LPUs, and they are designed for fast and efficient inference. Because its big potential customers—the large cloud companies—have their own inference chips, the future seemed bleak. The Nvidia deal gives Groq new life.

The combination opens up a few possibilities. In the future, we may see Nvidia rack servers that feature both GPUs and LPUs. Groq will now have the benefit of all the AI software that Nvidia has built up over two decades and continues to produce. Groq also brings its own inference software into the mix. The two chips could become more complementary than competitive, the way Facebook and Instagram are now.

But it also signals an inflection point in the AI investment boom. If inference becomes the main AI workload—and it increasingly happens on LPUs, TPUs and the like—Nvidia needs to be a part of it.

For now, investors don’t seem worried about the impact on the world’s largest chip maker. Nvidia shares were up 1% on Friday.

But the company’s golden gross margin will come under threat if Groq LPUs become a bigger part of the sales mix. Groq may help Nvidia fight off the coming challenge in inference, but it will come at a cost. In the latest deal, Nvidia may have just conceded that its current gross margin can’t last forever.




Comments (1)

  • ZhongRenChun
    · 12-27 20:08
    Groq LPUs are 16 times faster than GPUs. Let's hope the LPU can finally catch on and give us much better performance. Groq can generate replies in milliseconds, faster than any human could react or speak. Images can be generated in less than 1 second, almost real-time. I'm surprised the LPU hasn't caught on yet. GPUs were never designed for AI; they are horribly inefficient for it. Groq was a massive breakthrough, with its massive cache giving revolutionary performance. Hope Nvidia will make use of it and not just kill it off like Voodoo.