By Adam Levine
It's hard to describe this past week's initial public offering of Cerebras Systems as anything but a resounding success, a reflection of the company's artificial-intelligence chip technology, and the fortuitous timing smack in the middle of a fierce semiconductor rally. On May 4, Cerebras announced an expected price range with a midpoint of $120. A week later it upped the midpoint to $155. Two days after that, Cerebras priced its IPO at $185 a share.
The stock began trading on Thursday at $350, and it closed its first day at $311 -- 159% above the $120 midpoint of bankers' initial expected range.
When I asked CEO Andrew Feldman what happened this week to take the price up so sharply, he credited the product. "There was nothing in particular that happened. I think people came to realize that this was a very interesting company that had a unique advantage," he said. "Many hadn't looked at us seriously before, and so the book was 25 times oversubscribed."
At a fully diluted market capitalization of around $100 billion, Cerebras is now as valuable as Synopsys, ServiceNow, or Adobe, and it sits just outside the top 100 companies in the S&P 500 index. That said, with 2025 sales of just $510 million, Cerebras investors are pricing in a lot of growth.
Cerebras has a unique product design that solves a longstanding problem in high-performance computing: the bottleneck connecting computing and memory. Anyone who's seen the inside of a desktop PC knows the layout -- a CPU chip sitting in its own socket on the motherboard, connected to separate slots filled with memory modules. For years, the setup worked well as a standardized form factor that allowed memory to be upgraded or replaced, but the siloed approach adds latency whenever the processor calls on memory -- something that happens a lot in AI computing.
Six years ago, when Apple jettisoned Intel chips in Macs and began using its own, the company saw an opportunity to rethink that bottleneck. At the cost of upgradability and repairability, the latest Macs essentially solder computing and memory chips together, allowing for more speed in data-rich applications. The fast memory access is a key reason techies are buying out desktop Macs to use for AI.
In the AI data center, Nvidia does something similar while adding expensive high-bandwidth memory to its industry-leading AI chips. But none of it can match the connection between computing and memory when they sit side by side on the same chip, and that's what Cerebras has done in its product known as Wafer-Scale Engine 3.
While a single 300-millimeter silicon wafer normally produces a few dozen chips, Cerebras uses the entire wafer for one chip, about the size of a dinner plate. Pairing compute and memory on the same silicon allows for unmatched speed on any memory-intensive task that doesn't leave the confines of a single Wafer-Scale Engine. The Cerebras memory connection is about 1,000 times faster than what's used in Nvidia's coming Vera chips.
There's a downside that comes with the setup, though. Speeds between Cerebras chips aren't particularly fast -- far slower than what Nvidia strings together. For that reason, Cerebras servers haven't hosted the most advanced AI models.
But according to Feldman, Cerebras has overcome this limitation using new software. In the next six to eight weeks the company expects to show its servers hosting OpenAI's largest and most advanced model, an important step if Cerebras really wants to challenge Nvidia and all the other AI-chip contenders like Advanced Micro Devices, Alphabet, Microsoft, Amazon.com, and Meta Platforms. Feldman says it has accomplished this without giving up its primary memory-speed advantage. "Still blisteringly fast," he said. "Still 15 to 18 times faster than the competition."
Just as Cerebras is shaking up chip design, it's also taking a new approach to post-IPO liquidity. Typically, insiders and private investors agree to wait for six months before selling their shares after an IPO.
Cerebras, though, has decided to greenlight share sales based on 11 triggers over the opening six months of trading, with some selling allowed almost immediately. Only about 15% of shares outstanding were sold in the offering. Before the six months are out, another 78% of shares will become eligible for sale. Among the sale triggers: first-quarter earnings likely in June, second-quarter earnings, and then a series of dates set two weeks apart starting in late August.
By laying out the schedule in the IPO prospectus, Feldman says Cerebras is trying to manage expectations around new shares hitting the market.
Lockups have been a tricky topic for new stocks. Figma, the design-software firm that went public in July, saw a huge pop in initial trading, but its shares tanked as the lockup expiration approached. Six months after the offering, Figma shares had fallen 35% from the IPO price.
"There developed a whole group of algorithmic traders that made big profit on the fluctuations leading up to and on the 181st day, and that helped nobody," Feldman says. "We would rather add supply to the market in an organized and titrated way rather than over a cliff."
Demand for Cerebras shares is off the charts, but new supply may be coming very soon. It could make for a bumpy first six months for the stock.
Write to Adam Levine at adam.levine@barrons.com
This content was created by Barron's, which is operated by Dow Jones & Co. Barron's is published independently from Dow Jones Newswires and The Wall Street Journal.
(END) Dow Jones Newswires
May 15, 2026 18:44 ET (22:44 GMT)
Copyright (c) 2026 Dow Jones & Company, Inc.