AI CPU Demand Drives DDR5 Premiums, Extending Memory Chip Super Cycle Potentially to 2027

Deep News · 05-02 13:18

The shift in AI inference architecture is reshaping the demand landscape for memory chips, with the duration of this supply-demand imbalance likely exceeding previous market expectations.

According to recent industry reports, the launch of AI CPUs by companies like Intel, supporting up to 400GB of memory, is causing a sharp increase in demand for DDR5 on the server side. Analysts note that Samsung Electronics and SK Hynix's current production capacity is struggling to keep pace with the combined demand from both GPUs and CPUs. The DRAM supply shortage is now projected to persist until 2027.

Market signals are already visible in spot prices. Data from South Korean securities firms show that in April, spot prices for DDR5 (16GB) increased by 2.8% month-over-month, while traditional DDR4 prices fell by 16% during the same period, leading to a widening price gap between the two.

Industry insiders indicate that the current supply gap in the DRAM market is approximately 10% of total demand. As demand for general-purpose DRAM rises alongside High Bandwidth Memory (HBM) requirements, the end of the memory chip "super cycle" may be delayed from the previously expected 2026 to 2027.

**CPU Emerges as "AI Coordinator," Doubling Memory Needs**

The core driver of this demand expansion is the strategic shift in the AI industry from training to inference.

Previously, AI data centers were built around GPUs as the core computing infrastructure, with server configurations typically pairing 8 GPUs with 1 CPU, focusing on large-scale parallel training tasks. However, as inference scenarios grow more complex, the CPU's role is evolving from a supporting processor to an "AI coordinator"—responsible for managing multiple AI agent systems, overseeing module outputs, and coordinating overall workflows.

A key aspect of this role change is "context memory." CPUs need to store and reference outputs from various AI agents in real-time to coordinate complete inference processes, making large memory capacity a critical requirement. An Intel executive recently stated in an earnings call that the computing power ratio between CPUs and GPUs in AI inference infrastructure has shifted from 1:8 to 1:4 and is narrowing further toward 1:1.

In response, CPU manufacturers are increasing DRAM configurations in AI CPUs to 300–400GB, up to roughly four times the 96–256GB configurations of traditional CPU products.
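As a rough sanity check on the "four times" multiple, the article's two capacity ranges can be compared directly (the endpoints below are the ranges reported above, not specific product specifications):

```python
# Capacity ranges (GB) as reported in the article, not product specs.
traditional_cpu_dram = (96, 256)   # typical DRAM configs in traditional server CPUs
ai_cpu_dram = (300, 400)           # reported DRAM configs in AI CPUs

# Best case: new high end vs. old low end; worst case: new low end vs. old high end.
max_multiple = ai_cpu_dram[1] / traditional_cpu_dram[0]  # 400 / 96
min_multiple = ai_cpu_dram[0] / traditional_cpu_dram[1]  # 300 / 256

print(f"{min_multiple:.2f}x to {max_multiple:.2f}x")  # prints "1.17x to 4.17x"
```

So "up to four times" describes the extreme of the comparison (a 400GB AI CPU versus a 96GB traditional configuration); against the top of the old range the increase is far smaller.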

**Combined GPU and CPU Demand Widens DDR5 Supply Gap**

The competition for memory capacity is spreading from GPUs to CPUs, leading to snowballing demand.

On the GPU side, NVIDIA's next-generation AI chip "Vera Rubin" incorporates 288GB of memory through 8 HBM stacks, while AMD's upcoming MI400 GPU will feature 432GB. Google's recently released eighth-generation Tensor Processing Unit (TPU 8i) also includes 288GB of HBM.

On the CPU side, once Intel's "Xeon" and AMD's "Epyc" series AI CPUs begin large-scale adoption of DDR5 with capacities up to 400GB, the supply-demand imbalance for general-purpose DRAM will intensify further. Unlike HBM, which is supplied by a limited number of manufacturers like SK Hynix, the expansion in DDR5 demand will directly impact the supply balance across the entire general-purpose DRAM market.

Price divergence in the spot market clearly reflects this structural shift: DDR5 prices are strengthening against the trend, while DDR4 prices remain under pressure. The contrasting performance of these two product types highlights an accelerating migration toward the new generation standard.

**Samsung and SK Hynix Face Capacity Strain, Super Cycle Outlook Revised Upward**

Supply-side constraints make it difficult to alleviate the memory shortage in the short term.

As the leading global DRAM suppliers, Samsung Electronics and SK Hynix face limits on capacity expansion due to long fab construction cycles and the time needed to ramp yields on advanced process nodes. With HBM capacity already largely allocated, the effective production capacity available for general-purpose DDR5 is relatively limited, making it difficult to respond quickly to the incremental demand driven by AI CPUs.

Industry experts point out that the overall DRAM market is currently facing a supply shortfall of about 10%. General-purpose DRAM prices have more than doubled from their lows, driving historic profits for memory manufacturers like Samsung and SK Hynix. As demand from both GPUs and CPUs continues to compound, market expectations for the duration of the super cycle are being revised upward, from an initial projection of 2026 to 2027, indicating that the memory chip industry's upcycle may last longer than previously anticipated.

