The memory chip industry, known for its decades-long "boom-bust-recovery" cyclical pattern, is now entering a new phase driven by the artificial intelligence (AI) spending surge. Industry executives state that AI has structurally disrupted the old cycle, with no signs of price declines in sight. Antonio Neri, CEO of Hewlett Packard Enterprise, commented, "We will continue to raise prices because the entire industry will continue to raise prices. Currently, supply cannot meet demand." An executive from hard drive maker Seagate Technology stated on Tuesday that rising memory chip prices are likely to become the "new normal" for the coming years.
South Korean memory chip giant SK Hynix indicated that the entire memory chip sector is undergoing a structural transformation. A company spokesperson said in a statement, "Our customers, including hyperscale data center operators, are increasingly opting for long-term contracts instead of the more common one-year agreements seen in the past." Micron Technology also noted that customers are now highly willing to sign long-term supply agreements to secure memory supply for future years. Broadcom CEO Hock Tan mentioned on last week's earnings call that he has secured supply through 2028.
Today's AI workloads require a fundamentally different architecture, with memory demands far exceeding what the industry's historical designs anticipated. When Meta Platforms announced its own custom AI chip on Wednesday, it also voiced concerns about the supply of the high-bandwidth memory (HBM) the chip requires. A Meta engineering vice president stated, "We are indeed very concerned about HBM supply. But we believe we have secured the necessary supply for the infrastructure we plan to build."
As demand from hyperscale cloud companies for HBM squeezes out consumer-grade supply, and significant new capacity relief is not expected until at least 2027, AI infrastructure development may have propelled the memory industry into a new era.
**A Historic Memory Chip Shortage** By 2025, with global AI infrastructure construction entering a period of explosive growth, the memory industry is experiencing a "super cycle." The rationale is twofold: first, AI servers demand far greater memory capacity and bandwidth than standard servers; second, industry capacity is shifting towards high-end memory products like HBM, squeezing capacity for traditional memory products and triggering a broad-based wave of price increases across the memory sector.
Currently, AI infrastructure build-out continues to accelerate, with large tech companies projected to spend a staggering $650 billion in 2026, approximately 80% higher than last year's record level. According to industry research, data center demand for DRAM accounted for about 50% of global consumption in 2025, compared to just 32% five years ago. This proportion is expected to keep rising. By 2030, AI servers are forecast to constitute over 60% of global memory demand.
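The figures above imply a few derived numbers worth making explicit. A minimal back-of-the-envelope sketch, using only the article's stated inputs (the derived values are illustrative, not independently sourced):

```python
# Back-of-the-envelope check of the article's figures.
# All inputs are the article's numbers; derived values are illustrative.

spend_2026_bn = 650          # projected big-tech AI spend in 2026, $bn
growth_vs_prior = 0.80       # "approximately 80% higher than last year"

# Implied prior-year (record) spend level
implied_prior_bn = spend_2026_bn / (1 + growth_vs_prior)

share_2025 = 0.50            # data-center share of global DRAM demand, 2025
share_2020 = 0.32            # share five years earlier

# Compound annual growth of the data-center share over five years
cagr = (share_2025 / share_2020) ** (1 / 5) - 1

print(f"implied prior-year spend: ~${implied_prior_bn:.0f}bn")
print(f"data-center DRAM share CAGR, 2020-2025: {cagr:.1%}")
```

So the 2026 projection implies last year's record was roughly $360 billion, and the data-center share of DRAM demand grew about 9% per year over the five-year span.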
AI demand is triggering a historic memory chip shortage. Meeting the exponential growth in chip demand will be extremely costly and may even be unattainable. Although companies like Micron Technology, SK Hynix, and Samsung Electronics are expanding capacity through new or upgraded manufacturing facilities and advanced packaging plants, such projects often require multiple years and billions of dollars in investment to yield significant output.
HBM presents additional challenges – it is exceptionally difficult to manufacture at scale. HBM is created by stacking multiple memory dies (each thinner than a human hair) with micron-level precision. Any single defect can ruin the entire stack, making production slower and yields lower than for conventional DRAM. Some HBM versions also incorporate small logic chips to manage and route data, further adding complexity and consuming substantial manufacturing capacity.
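The yield penalty from stacking can be made concrete. A minimal sketch with hypothetical numbers (not actual fab data): if every die in a stack must be defect-free and defects occur independently, stack yield falls roughly as per-die yield raised to the power of the stack height.

```python
# Why stacking hurts yield: if any one defective die ruins the whole
# stack, and defects are independent, stack yield decays exponentially
# with stack height. The 95% per-die yield below is hypothetical.

def stack_yield(per_die_yield: float, layers: int) -> float:
    """Probability that all `layers` dies in a stack are good,
    assuming independent per-die defects."""
    return per_die_yield ** layers

# A 12-high stack at a hypothetical 95% per-die yield yields only ~54%
# of stacks intact, far below the single-die figure.
print(f"{stack_yield(0.95, 12):.1%}")
```

This multiplicative effect, before even counting the precision-bonding and logic-die steps the article mentions, is why HBM output per wafer lags conventional DRAM.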
Market research firm IDC stated plainly that as the AI boom pressures supply, the memory chip shortage is becoming "an unprecedented crisis." Executives from companies including Apple, Google, and Tesla have already discussed the impact of the memory chip shortage on profitability and even on AI development timelines. The head of Google DeepMind called it an industry "bottleneck." On Tesla's late-January earnings call, CEO Elon Musk even floated the idea of producing memory chips in-house.
**Is the "Memory Super Cycle" Still Valid?** Previous earnings guidance from Micron Technology and assessments from institutions like Nomura and Citi suggest the current memory chip super cycle could last into 2026 or 2027. However, recent actions by "Wall Street's big short seller," Citron Research, against SanDisk have "poured cold water" on the currently fervent memory chip sector. Citron's targeting of SanDisk represents a reassessment of the prevailing "memory super cycle" logic in the market. The short-selling firm bluntly stated that the market's pricing rationale for SanDisk is fundamentally flawed, and the current memory chip supply tightness is merely a "mirage," with the cycle peak imminent.
Citron cited three main reasons, chief among them the memory cycle curse. The memory chip industry, Citron noted, is strongly cyclical, having peaked during high-profitability periods in 2008, 2012, and 2018, and the current scenario is replicating history. The firm even claimed there is now twice as much capacity ready to come online as at the 2018 peak; once released, it would completely reverse the supply-demand dynamic. Citron's short report has raised doubts about the "memory super cycle" and triggered sharp short-term volatility in related stocks.
Some analysts counter that, unlike past cycles, the current memory cycle that began in 2024 is driven primarily by AI, making it essentially an "AI memory chip cycle." Under this AI narrative, companies with a moat in high-end products such as HBM have a basis for transitioning from cyclical stocks to growth stocks, while standard memory products benefit indirectly from capacity constraints at the major manufacturers. Traditional memory products, still used mainly in smartphones and PCs, have nonetheless not escaped their strong cyclicality. If the overseas memory majors pivot to capacity expansion by 2027, the industry still faces the risk of a cyclical reversal, though the explosion in AI demand has significantly delayed that inflection point.