AI Chip Mania Sows Seeds of Its Own Destruction -- Streetwise -- WSJ

Dow Jones 10:00

By James Mackintosh

Investing in artificial intelligence involves a strong belief that it's different this time, and memory-chip makers are a particularly extreme example.

Micron Technology recorded its biggest-ever loss just three years ago, and is now forecast to become the sixth-most profitable U.S. stock. Analysts expect it to make just under $100 billion over the next 12 months, more than Meta or Berkshire Hathaway. Micron and its rivals are big winners from runaway AI demand and soaring prices for the high-bandwidth memory Micron makes.

Like Samsung Electronics and SK Hynix, Micron is in the sweet spot of the chip cycle, with rising prices boosting profits and share prices. The effect reaches beyond chips, too: Micron made a notable contribution to Wall Street's upgrades of the S&P 500 earnings outlook, while the two Korean stocks have helped make their country's market by far the world's best performer this year.

The question is how long booming demand for memory chips can last.

Memory chips are a perfect example of a highly cyclical industry. Heavy investment is required to build a fabrication plant, or fab. When demand rises, it takes several years for supply to catch up, during which prices and profits jump. Those high profits encourage CEOs to expand supply. And the high fixed costs encourage producers to run fabs at full capacity -- even when supply overshoots demand. The cycle turns when excess supply pushes down prices and profits plunge, as they did in 2022-2023.

Already the high profitability has encouraged heavy capital spending. Micron is spending $150 billion to build or expand fabs in New York, Idaho and Virginia, and new Korean fabs are opening.

The good news is that investors already factor in cyclicality. The bad news is that they've frequently assessed the cycle wrongly at key moments in the past.

The risk of a downturn is embedded in Micron's valuation. Two weeks ago it was the S&P's third-cheapest stock measured by price to forward earnings, and it's still at under 10 times, tame for a highflying stock. That doesn't make it cheap, though. It just means investors recognize that the boom times in memory chips never last.

History shows how this works. In the last cycle Micron stock peaked at the start of 2022, with the forward P/E at just nine times, ahead of a halving in the shares that year. The stock bottomed out and subsequently doubled after the loss was baked into predictions.

Something similar happened in the mid-1980s and 1990s cycles. When the stock peaked in 1984 -- at a level it took another nine years to surpass -- it traded at only 15 times forward earnings. In the 2018 cycle the stock peaked at just 5.5 times. Losses for investors who were fooled into thinking they were buying a bargain were vast.
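The valuation trap the article describes can be sketched with hypothetical numbers: a forward P/E is simply price divided by expected next-12-month earnings per share, and when those expected earnings sit at a cyclical peak, a low multiple can flag the top rather than a bargain. The figures below are illustrative, not Micron's actual numbers.

```python
# Illustrative sketch (hypothetical numbers): why a low forward P/E can
# mislead at the top of a chip cycle. The multiple looks "cheap" only
# because it is computed on peak-cycle earnings that later collapse.

def forward_pe(price, forward_eps):
    """Price divided by expected next-12-month earnings per share."""
    return price / forward_eps

# At the hypothetical peak: a $90 stock against $10 of expected EPS
# trades at 9 times, tame next to the broad market.
peak_pe = forward_pe(90, 10)

# The cycle turns and expected EPS falls to $1. Even after the stock
# halves to $45, the multiple has ballooned to 45 times.
trough_pe = forward_pe(45, 1)

print(peak_pe)    # 9.0
print(trough_pe)  # 45.0
```

The point of the sketch: the stock in this example was never cheap at 9 times, because the "E" in the ratio was unsustainable, which matches the pattern the article cites in 1984, 2018 and 2022.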

For now, any danger comes from demand, not supply. New capacity due this year and next isn't enough to crush profits, so long as demand holds up. So what could go wrong?

The biggest risk is impossible to quantify: AI technology could become far more efficient in its use of memory, meaning data centers need less of it. Memory stocks took a hit in March when Alphabet researchers published a paper showing dramatic improvements in memory efficiency, but have recovered. Large language models are an immature technology, and engineering improvements for specialized data centers should be expected -- but how big they are and when they come is unknowable in advance.

Other risks apply to the whole AI supply chain: Data-center plans may be scaled back, AI uptake may prove slower than hoped, or a political backlash may hinder expansion. All are plausible; none is considered that serious by the AI bulls driving stock prices.

A final risk is that supercharged profits attract new rivals into the market. For now, that seems unlikely in the superfast memory Micron makes, but it's already happening with other highly profitable chips used in AI.

The economics of chip makers outside memory chips are similar, but as their products are more differentiated -- think Nvidia versus AMD -- they are much less cyclical.

Fat margins on Nvidia's chips have persuaded Alphabet to develop what it calls tensor processing units, or TPUs, dedicated to training AI, one of the big uses of Nvidia's expensive graphics processing units (GPUs). Amazon's Graviton chips provide the central processing unit (CPU) for the "inference" involved in using AI models, demand for which has boosted Intel.

Recent entrant Cerebras, which launched the first of its giant chips for both training and inference only in 2019, raised $5.55 billion in its IPO on Thursday, and its shares more than doubled immediately.

So long as AI demand is soaring, all this extra supply can be absorbed without much effect on profit margins. But the longer the boom goes on, the more competitors enter and the more capacity is built.

As with all commodities, success sows the seeds of its own destruction -- even if AI hopes are fulfilled.

Write to James Mackintosh at james.mackintosh@wsj.com


(END) Dow Jones Newswires

May 16, 2026 22:00 ET (02:00 GMT)

Copyright (c) 2026 Dow Jones & Company, Inc.
