GF Securities released a research report highlighting that AI advancements, model innovation, and capital expenditures (CAPEX) are driving the coordinated development of the AI industry chain. AI inference is fueling sustained growth in the storage cycle, with capacity expansion and technological upgrades working in tandem.
AI-driven demand continues to push storage prices upward, significantly improving manufacturers' gross margins. DRAM and NAND architecture upgrades are creating new opportunities for equipment demand, while the foundry model in storage is undergoing an industrial transformation. New module and companion-chip categories, such as MRDIMM and SPD/VPD EEPROMs, are unlocking fresh potential. GF Securities recommends focusing on related targets in the storage industry chain.
Key insights from the report include:
1. **Storage as the Foundation for Tokens; AI Inference Drives Rapid Growth**

   AI servers rely on storage tiers such as HBM, DRAM, and SSDs, which trade off performance, capacity, and cost. AI inference is accelerating storage demand:

   - **Memory benefits from ultra-long-context and multimodal inference needs**: high-bandwidth, large-capacity memory reduces access latency and improves parallel efficiency (a back-of-envelope sizing sketch follows this list).
   - **SSDs and HDDs serve as integral carriers for tokens**: lightweight model deployment, driven by surging AI inference demand, is rapidly increasing storage capacity requirements, with projections reaching hundreds of exabytes (EB).
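To make the memory point concrete, here is a back-of-envelope KV-cache sizing sketch in Python. The model dimensions (layer count, KV heads, head size) are illustrative assumptions for a large dense transformer, not figures from the report; the takeaway is that KV-cache footprint grows linearly with context length, which is what pushes long-context inference toward high-bandwidth, large-capacity memory.

```python
# Back-of-envelope KV-cache sizing. The model dimensions below are
# illustrative assumptions (roughly a 70B-class dense transformer with
# grouped-query attention), not figures from the report.

def kv_cache_bytes(context_len: int,
                   batch_size: int = 1,
                   num_layers: int = 80,
                   num_kv_heads: int = 8,
                   head_dim: int = 128,
                   dtype_bytes: int = 2) -> int:  # FP16/BF16 elements
    """Bytes needed to hold the K and V tensors for every layer and token."""
    per_token = 2 * num_layers * num_kv_heads * head_dim * dtype_bytes
    return per_token * context_len * batch_size

for ctx in (8_192, 131_072, 1_048_576):
    gib = kv_cache_bytes(ctx) / 2**30
    print(f"context={ctx:>9,} tokens -> KV cache per request ≈ {gib:8,.1f} GiB")
```

Under these assumptions a single 1M-token request needs roughly 320 GiB of KV cache, far beyond any single accelerator's HBM, which is the capacity pressure the report describes.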
2. **AI Inference Expands Storage Industry Opportunities**

   - **AI & Storage Servers Boost eSSD Demand**: Long-context inference, RAG databases, and growing token volumes are intensifying the need for high-bandwidth, large-capacity eSSDs, expanding market opportunities in AI and storage servers.
   - **MRDIMM for Large-Model Inference**: MRDIMM enhances efficiency in KV Cache scenarios, offering higher concurrency, longer context retention, lower latency, and better CPU-GPU memory coordination.
   - **Growth in SPD & VPD Chips**: DDR5 adoption is driving demand for higher-spec SPD chips, while SSD upgrades present growth potential for VPD EEPROMs.
   - **CXL Memory Pooling Enhances AI Inference**: CXL technology improves computational efficiency and total cost of ownership (TCO) in KV Cache-intensive workloads. Companies such as NVIDIA and Alibaba Cloud are investing in CXL capabilities, boosting demand for interconnect chips (see the tiering sketch after this list).
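The CXL bullet above describes keeping colder KV-cache data in a larger, slower memory pool. The sketch below is a minimal, hypothetical illustration of that pattern: an LRU policy spills least-recently-used KV blocks from a fast local tier (standing in for HBM/DRAM) to a pooled tier (standing in for CXL-attached memory) and promotes them back on reuse. Capacities, block granularity, and the eviction policy are all assumptions for illustration, not details from the report.

```python
# Minimal sketch of KV-cache tiering across a fast local tier and a larger
# pooled tier (e.g., CXL-attached memory). All policies and capacities here
# are illustrative assumptions.

from collections import OrderedDict

class TieredKVCache:
    def __init__(self, fast_capacity: int, pooled_capacity: int):
        self.fast = OrderedDict()    # hot KV blocks, limited capacity
        self.pooled = OrderedDict()  # colder blocks spilled to the pool
        self.fast_capacity = fast_capacity
        self.pooled_capacity = pooled_capacity

    def put(self, block_id: str, kv_block: bytes) -> None:
        self.fast[block_id] = kv_block
        self.fast.move_to_end(block_id)
        while len(self.fast) > self.fast_capacity:
            # Spill the least-recently-used block to the pooled tier.
            cold_id, cold_block = self.fast.popitem(last=False)
            self.pooled[cold_id] = cold_block
            while len(self.pooled) > self.pooled_capacity:
                self.pooled.popitem(last=False)  # evict entirely

    def get(self, block_id: str) -> bytes | None:
        if block_id in self.fast:
            self.fast.move_to_end(block_id)
            return self.fast[block_id]
        if block_id in self.pooled:
            # Promote on reuse, e.g., a multi-turn chat revisiting a prefix.
            block = self.pooled.pop(block_id)
            self.put(block_id, block)
            return block
        return None  # miss: the KV must be recomputed via prefill

cache = TieredKVCache(fast_capacity=2, pooled_capacity=4)
for turn in range(5):
    cache.put(f"prefix-{turn}", b"kv")
print(cache.get("prefix-0") is not None)  # True: served from the pool
```

The pooled tier trades higher access latency for much larger capacity; avoiding recomputation of spilled KV blocks is where the TCO argument for CXL in inference comes from.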
**Risks**: AI industry growth, server shipments, or domestic technological progress falling short of expectations.