Anthropic Co-founder and CEO Dario Amodei attended events at the World Economic Forum (WEF) in Davos, Switzerland, on Tuesday, January 20, 2026.
According to two informed sources, Anthropic projected last month that the gross margin for its artificial intelligence products sold to enterprises and application developers would reach 40% in 2025. Although this margin is 10 percentage points lower than earlier optimistic forecasts, it still represents a significant improvement over the previous year.
The shortfall in gross margin expectations stems from the unexpectedly high costs Anthropic incurs for running its AI models for paying customers using servers from Google and Amazon—a process known as inference computing. Financial forecasts indicate that inference costs exceeded initial projections by 23%. Anthropic calculates its gross margin by deducting inference costs and other product sales-related expenses from its revenue.
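As a rough illustration of the calculation described above, gross margin is revenue minus inference costs and other cost of sales, divided by revenue. The figures in this sketch are hypothetical, chosen only to reproduce a ~40% margin; they are not Anthropic's actual cost breakdown.

```python
def gross_margin(revenue: float, inference_cost: float, other_cost_of_sales: float) -> float:
    """Gross margin = (revenue - cost of sales) / revenue."""
    return (revenue - inference_cost - other_cost_of_sales) / revenue

# Hypothetical figures in $B, illustrative only -- not Anthropic's reported line items.
margin = gross_margin(revenue=4.5, inference_cost=2.2, other_cost_of_sales=0.5)
print(f"{margin:.0%}")  # → 40%
```

Note that, as the article explains, model-training compute is excluded from this figure, which is why gross margin alone understates how far either company is from net profitability.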
Key Takeaways:
- Due to elevated inference computing costs, Anthropic lowered its 2025 gross margin forecast to 40% at the end of last year.
- The company anticipates 2025 revenue of $4.5 billion, a nearly 12-fold increase over 2024.
- Both Anthropic and OpenAI are pursuing cooperative agreements to manage their computing power expenses.
Anthropic's Claude chatbot and code-generation tool, Claude Code, have recently gained substantial traction. From a gross margin perspective, the company's operational efficiency may lag behind that of its main competitor, OpenAI. However, its margin has improved dramatically from the -94% level recorded in 2024.
These figures highlight that both companies rely on leasing dedicated servers from cloud service providers, a model that complicates the path to net profitability. Consequently, both are taking steps to develop or control server hardware in-house while preparing to raise tens of billions of dollars to strengthen their balance sheets. OpenAI recently announced plans to offset the cost of serving its chatbot's free users by introducing advertising.
According to an analysis by The Information, if the inference costs for Claude's free users were included, Anthropic's gross margin would drop to approximately 38%, a few percentage points lower than the margin for its paid user business.
In contrast, OpenAI expects a 2025 gross margin of about 46%, a figure that already encompasses the inference costs for both paying and free ChatGPT users. Currently, ChatGPT boasts roughly 900 million weekly active users, with free users accounting for about 95% of that total.
Anthropic had previously forecast that its gross margin would exceed 70% by 2027; OpenAI has set a target to achieve at least a 70% gross margin by 2029. These targets approach the gross margin levels typical of publicly traded software and cloud services companies. However, it is important to note that both AI firms also incur massive computing power rental costs for model training—expenses which are not included in gross margin calculations. This makes their journey to net profitability more challenging compared to traditional software enterprises.
Beyond inference costs, model training also generates enormous expenses. Last month, Anthropic projected its 2025 AI model training costs to be around $4.1 billion, approximately 5% higher than its forecast from last summer. Meanwhile, OpenAI's spending on computing power for model training last year is estimated to have reached $9.4 billion.
Both companies have already taken action to control costs. Anthropic recently struck a deal with Google to purchase $21 billion worth of Tensor Processing Units (TPUs), aiming to gain greater control over hardware resources and thereby reduce computing expenses. OpenAI, on the other hand, is developing its own server chip for inference computing (primarily to power ChatGPT), planned for deployment in the coming years to replace costly Nvidia chips.
This past Wednesday, OpenAI's Chief Financial Officer, Sarah Friar, stated that the final design for this upcoming inference chip has been taped out, meaning the company has delivered the final design specifications to the chip manufacturer.
Anthropic recently projected its 2025 revenue will reach $4.5 billion, slightly below an earlier, more optimistic forecast of nearly $4.7 billion. Nevertheless, compared to 2024 revenue of $381 million, this represents a nearly 12-fold increase, a remarkable growth rate for a company of its size. (For comparison, OpenAI's 2025 revenue is projected to exceed $13 billion, up from approximately $3.7 billion in 2024.)
The revenue gap between the two companies is gradually narrowing. Currently, OpenAI's monthly revenue is about $1.7 billion, just over double Anthropic's monthly revenue of $750 million.
Explosive Growth in Code Tools

Anthropic's AI products tailored for coding and white-collar work scenarios—Claude Code and Claude Cowork—have emerged as the biggest highlights in the AI sector over the past month. The market fervor they have ignited is comparable to the wave of interest OpenAI generated among enterprise clients and software engineers in early 2023.
According to one informed source, Anthropic has privately disclosed that it now has at least nine clients with annual spending exceeding $100 million. For instance, Microsoft's annual expenditure with Anthropic is expected to reach $500 million, partly to procure services for its GitHub Copilot tool. Furthermore, popular code tool developers like Cursor and Cognition are also major Anthropic customers.
Anthropic anticipates that approximately 86% of its 2025 revenue will come from selling AI models to enterprises via Application Programming Interfaces (APIs), with the remaining revenue derived from subscription services for its Claude chatbot—a product that competes directly with ChatGPT.
OpenAI's enterprise client base is also continually expanding. In fact, if revenue from API sales and enterprise-focused chatbot services is combined, OpenAI's enterprise business likely still surpasses Anthropic's in scale. One source indicated that enterprise clients contribute about 40% of OpenAI's revenue, which would translate to roughly $5.2 billion, higher than Anthropic's $3.9 billion from enterprise clients.
Although both companies are experiencing sales growth at a near-unprecedented pace, some lenders remain cautious about investing in data center projects involving them, as neither company is expected to generate free cash flow until the latter part of this decade.
However, the substantial cash burn has not dampened the confidence of equity investors. According to Anthropic's most optimistic projections, as of mid-December last year, its EBITDA loss was approximately $5.2 billion. OpenAI previously projected an EBIT loss of $21.2 billion, although its anticipated cash burn was estimated to be less than half of that loss amount.
Anthropic is currently advancing a funding round aiming to raise over $10 billion, led by Singapore's GIC and Coatue Management, with a pre-money valuation estimated at $350 billion. Previously, Nvidia and Microsoft committed to investing up to $10 billion and $5 billion in Anthropic, respectively. In a $13 billion funding round in September 2024, Anthropic's pre-money valuation was $170 billion.
Concurrently, OpenAI is also preparing for a massive funding round, seeking to raise up to $100 billion with a pre-money valuation projected around $750 billion.