Cerebras's IPO will be a fresh test of investor excitement for AI infrastructure

Dow Jones 05:20


By Britney Nguyen

The inference-chip maker recently upsized its IPO and the expected pricing range - a sign of strong demand 'for something not named Nvidia'

Shares of inference-chip maker Cerebras will begin trading on Thursday.

Cerebras Systems' initial public offering could become the largest IPO so far this year when it prices later on Wednesday - and some are seeing it as a test of how much further the artificial-intelligence trade can go.

The inference-chip maker recently raised its offering to 30 million shares, from 28 million shares, and hiked the expected per-share pricing range to between $150 and $160, from between $115 and $125 previously - reflecting surging interest from investors ahead of its listing on the Nasdaq under the ticker symbol "CBRS." The stock is expected to begin trading on Thursday.

At the high end of those ranges, Cerebras could be looking at a $4.8 billion capital raise, which would make it the largest IPO in the world so far this year in terms of gross proceeds, according to Dow Jones Market Data. The chip maker is also reportedly looking at a $48.8 billion market valuation, which would be almost 100 times its 2025 revenue of $510 million.
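As a rough sanity check, the figures reported above can be recomputed directly from the deal terms (30 million shares at the $160 top of the range, the reported $48.8 billion valuation, and $510 million in 2025 revenue):

```python
# Back-of-the-envelope check of the reported IPO figures.
shares = 30_000_000          # upsized offering
top_price = 160              # high end of the revised range, in dollars

gross_proceeds = shares * top_price
print(gross_proceeds)        # 4_800_000_000, i.e. the $4.8 billion raise

valuation = 48.8e9           # reported market valuation
revenue_2025 = 510e6         # 2025 revenue

multiple = valuation / revenue_2025
print(round(multiple, 1))    # ~95.7, i.e. "almost 100 times" revenue
```

The roughly 95.7x revenue multiple is what the article rounds to "almost 100 times."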

"It's a test of the AI-infrastructure boom, because Cerebras will be priced based on future expectations," Greg Martin, managing director of private markets at Rainmaker Securities, told MarketWatch.

The potential $48.8 billion valuation would be more than double Cerebras's latest $23 billion valuation following a $1 billion Series H funding round in February. Martin said that trading on the secondary markets was "relatively soft" before the chip company's roadshow, so it looks as though the market is "expecting a major pickup in growth."

Cerebras's chips are built to focus on inference, or the process of running AI models after training. Unlike Nvidia (NVDA) and other AI-chip makers, the company's approach to chip making involves using an entire wafer as one chip, rather than splitting it into multiple individual dies.

This technique, known as wafer-scale integration, allows Cerebras "to bring together quantities of compute and memory never before assembled on a single commercial chip and deliver AI at previously unimaginable speeds," the company said in its S-1 filing with the U.S. Securities and Exchange Commission.

Because its wafer-scale chip is 58 times larger than Nvidia's B200 chip, Cerebras can include on-chip static random-access memory, which it said is faster than the dynamic random-access memory in Nvidia's Blackwell package.

Nvidia's graphics processing units have dominated the market for AI chips for training. However, the shift to inference and agentic AI looks to be changing the hardware competition. In December, Nvidia entered a nonexclusive licensing agreement with inference-chip maker Groq.

Martin said that Nvidia will remain a major player in the AI-chip market, especially with the "moat" provided by its software ecosystem that allows developers to work with its GPUs.

"But there has to be competition - the market's too big," Martin said, pointing to compute capacity constraints being felt by AI developers, including OpenAI rival Anthropic.

The massive need for compute is likely driving Cerebras's high valuation, Martin said, but he expects that to be tested after the company goes public and releases its first earnings report. On the other hand, Martin added, investors will be watching hyperscaler revenues, as well as those from OpenAI and Anthropic, to gauge whether there will be long-term support for the AI-infrastructure buildout.

Meanwhile, Martin noted that Cerebras has major customer concentration, which he sees as a risk.

The company's filing highlighted major partnerships with OpenAI and Amazon Web Services (AMZN). The chip maker's multiyear deal, announced in January, to deploy 750 megawatts with ChatGPT maker OpenAI is valued at more than $20 billion. The two companies also plan to co-design upcoming AI models to work with Cerebras's future hardware. In March, the company entered a multiyear partnership with AWS to deploy its Cerebras CS-3 solution in the cloud giant's data centers.

OpenAI itself has questions around financial stability, Martin noted, which creates more risk factors for Cerebras. But the high valuation "goes to show how strong the demand is in the public markets for something not named Nvidia," he added.

The listing will also be a gauge on the "receptivity of the IPO markets for new AI-infrastructure issuances," Martin said, especially as SpaceX is expected to go public later this year.

"It's going to set the stage for a really interesting IPO year, and it's going to test how excited we are about the future of AI infrastructure at levels that are mind-blowing," he said.

The Renaissance IPO ETF IPO has rallied 9% in 2026, while the S&P 500 index SPX has advanced 8.8%.

-Britney Nguyen

This content was created by MarketWatch, which is operated by Dow Jones & Co. MarketWatch is published independently from Dow Jones Newswires and The Wall Street Journal.


(END) Dow Jones Newswires

May 13, 2026 17:20 ET (21:20 GMT)

Copyright (c) 2026 Dow Jones & Company, Inc.


