Key Insights
-
The AI inference market is smaller than the AI training market today, but it is growing faster, making it an exciting growth area for investors.
-
I am constructive on nearly all of the players in the custom AI chip market, including $Broadcom(AVGO)$ , $ARM Holdings Ltd(ARM)$ , $Microsoft(MSFT)$ , $Alphabet(GOOG)$ , $Amazon.com(AMZN)$ and $Taiwan Semiconductor Manufacturing(TSM)$ .
-
While Nvidia is expected to dominate the AI training market with over 90% market share in the coming years, I believe the AI inference market will be largely captured by hyperscalers, primarily because custom AI chips cost less and consume less energy than general-purpose GPUs.
Full Analysis
-
$NVIDIA Corp(NVDA)$ ’s GPUs have been essential in powering AI development and driving the company’s market cap to $3.5 trillion.
-
Nvidia’s data center revenue (which includes its AI chips) ballooned from $3.6 billion in fiscal Q4 2023 to $26 billion in fiscal Q2 2025 following the launch of ChatGPT, while its overall profit margin expanded from 26.5% to 62.5% over the same period.
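As a quick back-of-envelope check on those figures (my own arithmetic, not from Nvidia's filings): the jump works out to roughly a 7x revenue increase over six fiscal quarters, or about 39% compound growth per quarter.

```python
# Back-of-envelope check of the Nvidia growth figures cited above.
rev_q4_fy23 = 3.6   # data center revenue, $B, fiscal Q4 2023
rev_q2_fy25 = 26.0  # data center revenue, $B, fiscal Q2 2025
quarters = 6        # fiscal quarters between the two reports

multiple = rev_q2_fy25 / rev_q4_fy23
quarterly_growth = multiple ** (1 / quarters) - 1

print(f"Revenue multiple: {multiple:.1f}x")                   # ~7.2x
print(f"Implied growth per quarter: {quarterly_growth:.0%}")  # ~39%
```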
-
However, the high cost of AI chips like the H100 has prompted customers such as $Meta Platforms, Inc.(META)$ , $Microsoft(MSFT)$ , $Amazon.com(AMZN)$ , $Alphabet(GOOGL)$ and OpenAI to explore developing their own AI chips with the help of Broadcom and Marvell.
-
Currently, custom AI chip design for data centers is dominated by Broadcom and Marvell.
-
While $Advanced Micro Devices(AMD)$ and $NVIDIA Corp(NVDA)$ are setting up custom chip units to help companies develop AI chips, most hyperscalers still prefer working with $Broadcom(AVGO)$ and $Marvell Technology(MRVL)$ .
-
Most companies currently prefer Nvidia’s AI chips for training, trusting that Nvidia offers the best AI capabilities, and they fear falling behind in AI performance if they switch to alternatives like AMD’s MI300X or Intel’s Gaudi 2. Nvidia’s CUDA software stack, which runs only on its GPUs, remains the leading toolchain for AI training.
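The CUDA lock-in is visible even in everyday ML code. The minimal PyTorch sketch below (my illustration, not from any company's codebase) shows the idiom most training scripts use: the code path assumes an Nvidia GPU and falls back to CPU, so moving to AMD or Intel accelerators means swapping software stacks, not just hardware.

```python
import torch

# Typical device selection in a training script: "cuda" targets Nvidia GPUs.
# AMD (ROCm) and Intel (Gaudi) accelerators require their own software stacks.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(32, 1024, device=device)
y = model(x)  # runs on the Nvidia GPU when one is present
print(y.shape, device)
```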
-
In summary, most companies will continue to prefer $NVIDIA Corp(NVDA)$ ’s GPUs for AI training, but will likely use custom AI chips for inference.
-
Custom AI chips can perform better, cost less, and consume less energy than general-purpose Nvidia GPUs, particularly for inference and energy-sensitive applications.
-
The AI training and inference markets are expected to grow to $471 billion and $169 billion, respectively, by 2032, implying CAGRs of 31% and 48% from 2022 to 2032.
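To put those CAGRs in context, this short sketch (my own arithmetic, derived from the forecast figures above) discounts the 2032 targets back to their implied 2022 bases. It shows the inference market growing much faster from a far smaller starting point, consistent with the thesis of this article.

```python
# Implied 2022 market sizes from the 2032 forecasts and CAGRs cited above.
def implied_base(target_2032_bn: float, cagr: float, years: int = 10) -> float:
    """Discount a 2032 forecast back to its implied starting value."""
    return target_2032_bn / (1 + cagr) ** years

training_base = implied_base(471, 0.31)   # ~$32B in 2022
inference_base = implied_base(169, 0.48)  # ~$3.4B in 2022

print(f"Implied 2022 training market:  ${training_base:.0f}B")
print(f"Implied 2022 inference market: ${inference_base:.1f}B")
```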
Chip Design Companies Poised to Benefit from the Custom AI Chip Boom
-
Broadcom is the leading custom chip designer, with nearly 60% market share.
-
Broadcom forecasts revenue from AI to be $12 billion for FY 2024, driven by Ethernet networking and custom AI chips.
-
It is widely rumored that Broadcom's top two custom AI chip customers are Alphabet and Meta, with the third-largest anticipated to be ByteDance, Apple, or Amazon.
-
Marvell is reported to command a 15% market share in custom ASICs.
-
Analysts expect Marvell's AI revenue to reach $1.5 billion in FY 2025, while Marvell itself projects $2.5 billion by FY 2026.
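Taken together, those two figures imply an aggressive ramp; a one-line check (my arithmetic, not Marvell's guidance):

```python
# Implied growth if Marvell hits its FY2026 target from the FY2025 consensus.
fy25_consensus_bn = 1.5
fy26_target_bn = 2.5
print(f"Implied YoY growth: {fy26_target_bn / fy25_consensus_bn - 1:.0%}")  # ~67%
```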
-
However, Marvell is still unprofitable on a GAAP basis, and its smaller market share makes it a riskier bet than Broadcom.
-
Most custom AI chips are built on ARM architecture.
-
ARM's RISC architecture is generally preferred over x86's CISC due to its superior energy efficiency, scalability, and lower latency, making it more suitable for AI workloads.
-
It is widely reported that ARM plans to launch AI chips in 2025.
-
Currently, ARM licenses processor designs used in smartphone chips and GPUs, earning revenue primarily from royalties when companies build on its designs. In 2025, however, it may expand into designing complete AI chips and subcontracting their manufacturing, which would put it in direct competition with Broadcom and Marvell.
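To make the royalty model concrete, here is a toy sketch. The royalty rate, chip price, and unit volume below are hypothetical placeholders chosen for illustration, not ARM's actual terms.

```python
# Toy model of ARM's royalty revenue stream (all inputs hypothetical).
royalty_rate = 0.02            # assumed 2% of chip selling price -- illustrative only
chip_asp_usd = 50.0            # assumed average selling price per chip -- illustrative only
units_shipped = 1_000_000_000  # assumed annual ARM-based chip shipments

royalty_revenue = royalty_rate * chip_asp_usd * units_shipped
print(f"Hypothetical annual royalty revenue: ${royalty_revenue / 1e9:.1f}B")  # $1.0B
```

The point of the model: ARM's current royalty business scales with unit volume at a thin rate, whereas designing complete AI chips would capture far more value per chip, explaining the strategic appeal of competing with Broadcom and Marvell.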
-
There are reports that ARM’s AI chip business could eventually be spun off and come under SoftBank's umbrella.
-
ARM is positioned to be one of the primary beneficiaries of the rapidly growing custom AI chip market.
Hyperscalers Poised to Benefit from the Custom AI Chip Boom
-
The hyperscalers are developing their own custom chips, primarily to supplement Nvidia’s offerings for their own internal use.
-
However, I believe they will gradually begin to offer these chips to their cloud customers as alternatives to Nvidia’s GPUs.
-
Currently, cloud providers allow customers to subscribe to cloud computing services using both Nvidia’s GPUs and their own custom chips.
1) $Amazon.com(AMZN)$
-
Recently, Amazon signed a five-year deal with Databricks under which Databricks will use Amazon's Trainium AI chips to build customized AI models and systems, such as chatbots, for corporate customers. Databricks claims Amazon's custom AI chips cost 40% less than competing hardware.
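In practice, choosing Amazon's custom silicon over Nvidia GPUs is just an instance-type decision on AWS. The sketch below uses AWS's boto3 SDK; the AMI ID is a placeholder, and the instance families shown are from AWS's public EC2 documentation.

```python
import boto3  # AWS SDK for Python

ec2 = boto3.client("ec2", region_name="us-east-1")

# The same API launches either kind of silicon; only InstanceType changes:
#   trn1.32xlarge -> AWS Trainium (custom chips for training)
#   inf2.xlarge   -> AWS Inferentia2 (custom chips for inference)
#   p4d.24xlarge  -> Nvidia A100 GPUs
response = ec2.run_instances(
    ImageId="ami-XXXXXXXX",  # placeholder: a Deep Learning AMI for your region
    InstanceType="trn1.32xlarge",
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```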
-
Current Amazon custom AI chip customers include:
Trainium: Anthropic
Inferentia: Databricks, Airbnb, ByteDance, Snap, and Deutsche Telekom
Graviton: SAP
-
Amazon remains one of the most aggressive players in encouraging companies to adopt its custom-designed AI chips.
2) $Alphabet(GOOGL)$ $Alphabet(GOOG)$
-
While Apple has never explicitly said that it avoids Nvidia chips, it has disclosed that its AI models are trained on Google’s custom TPU chips.
-
It is widely reported that Apple is Google’s largest custom chip customer.
-
Amazon acquired Annapurna Labs in 2015 to build up its chip design capabilities; by that point, Google was already running TPUs internally, developed with assistance from Broadcom.
-
Given that head start in production deployment, some argue that Google’s TPUs are more mature than Amazon’s and Microsoft’s custom chips.
-
Reports suggest it is more cost-efficient for Google to run its search queries on its own TPUs than on Nvidia’s H100.
3) $Microsoft(MSFT)$
-
Microsoft only announced plans to build its own custom AI chips in November 2023, so those chips are still primarily for internal use and are not yet offered to third parties.
-
However, Microsoft Azure, being the second-largest cloud provider in the world, may eventually open its custom chips for third-party use.
Chip Manufacturer Poised to Benefit from the Custom AI Chip Boom
$Taiwan Semiconductor Manufacturing(TSM)$
-
Most AI chips will be manufactured by TSMC.
-
Samsung and Intel currently lack the process technology to produce leading-edge AI chips at competitive cost and acceptable yield.
Conclusion
-
The AI inference market is smaller than the AI training market today, but it is growing faster, making it an exciting growth area for investors.
-
I am constructive on nearly all of the players in the custom AI chip market, including $Broadcom(AVGO)$ , $ARM Holdings Ltd(ARM)$ , $Microsoft(MSFT)$ , $Alphabet(GOOG)$ , $Amazon.com(AMZN)$ and $Taiwan Semiconductor Manufacturing(TSM)$ .
-
While $NVIDIA Corp(NVDA)$ is expected to dominate the AI training market with over 90% market share in the coming years, I believe the AI inference market will be largely captured by hyperscalers, primarily because custom AI chips cost less and consume less energy than general-purpose GPUs.