The competitive landscape of the AI chip market is undergoing structural transformation. Anthropic is exploring the possibility of developing its own chips, while Amazon is considering selling its internal chip business externally—both developments point to the same trend: technology companies are accelerating their search for alternatives to NVIDIA.
According to media reports citing three informed sources, AI startup Anthropic is evaluating the feasibility of designing its own chips to address the shortage of computing power needed for more advanced AI systems. Simultaneously, Amazon CEO Andy Jassy indicated that the company is weighing the possibility of selling its self-developed chips to external customers. Amazon's internal chip business is projected to generate annual revenues exceeding $20 billion, and if operated as an independent entity, it could reach approximately $50 billion in annualized revenue.
The simultaneous release of these two pieces of news puts pressure on NVIDIA's dominant position in the AI chip market. As cloud service providers and AI companies increase their investments in custom chips, market research firm TrendForce predicts that the shipment share of ASIC-based AI servers will rise from 27.8% in 2026 to nearly 40% by 2030, signaling a long-term erosion of NVIDIA's market share.
It is noteworthy that Anthropic's self-developed chip plan remains in its early stages and may ultimately not materialize, while Amazon's chips are currently offered through AWS on a rental basis. Nevertheless, these developments send a clear signal: major technology firms and AI companies are actively reducing their reliance on NVIDIA.
Anthropic's Chip Development: Early Planning Stage with Cost Barrier Around $500 Million
Anthropic's exploration of self-developed chips comes against a backdrop of rapidly expanding demand. The company disclosed this week that the annualized revenue of its AI model Claude has surpassed $30 billion, compared to just $9 billion by the end of 2025.
Media reports citing industry insiders reveal that the plan is still in its infancy, with the company yet to finalize a specific design approach or assemble a dedicated team. It may ultimately opt for direct procurement rather than in-house development. Industry experts estimate the cost of designing an advanced AI chip at approximately $500 million, covering engineer salaries and quality control investments in the manufacturing process.
Currently, Anthropic sources chips for training and running Claude from multiple providers, including AWS's Trainium, Alphabet's TPUs, and NVIDIA's GPUs. The exploration of self-developed chips reflects the company's intent to secure greater supply autonomy.
Meanwhile, Anthropic recently signed long-term agreements with Alphabet and Broadcom. Broadcom will provide Anthropic with approximately 3.5 gigawatts of AI computing resources using Alphabet's AI processors, with supply scheduled to commence in 2027. Broadcom has also entered a long-term partnership with Alphabet to develop and supply custom chips and related components for Alphabet's next-generation AI racks, with the collaboration extending until 2031. Exploring self-developed chips while securing external computing resources illustrates Anthropic's dual-track strategy to mitigate supply risks.
Amazon's Chip Sales: Transitioning from Cloud Infrastructure to Semiconductor Supplier
Amazon is studying the feasibility of selling its self-developed chips to external customers, moving beyond the current model of internal rental use within AWS. Andy Jassy stated that the company's internal chip business is expected to generate annual revenues exceeding $20 billion. If spun off as an independent entity supplying semiconductors to AWS customers and other third parties, it could achieve annualized revenues of around $50 billion.
The core driver behind this shift is the supply-demand tension in AI training processors. As computational demand surges, companies are actively seeking alternatives to NVIDIA's products, creating a market opportunity for Amazon to expand its chip sales business.
ASIC Expansion Puts Pressure on NVIDIA's Market Share
Anthropic's and Amazon's strategies are not isolated cases. Reports indicate that Meta Platforms, Inc. and OpenAI are also advancing their respective chip development plans, highlighting a clear trend of major technology companies accelerating efforts in custom AI chips.
TrendForce data shows that as cloud service providers like Alphabet and Amazon continue to increase investments in internal chip development, the proportion of ASIC-based AI servers in total AI server shipments will gradually rise from 27.8% in 2026 to nearly 40% by 2030. NVIDIA's share in the AI computing market faces sustained, long-term structural pressure.