Micron Technology reported extraordinary results for its fiscal 2026 second quarter on March 18. Revenue nearly tripled year over year to approximately $23.9 billion, while gross margin soared to a record 75%. The company also guided third-quarter gross margin as high as 81%. Concurrently, Micron announced that capital expenditures for fiscal 2026 will exceed $25 billion, with construction-related spending in fiscal 2027 expected to increase by more than $10 billion over the prior year. Following the report, Micron's stock initially rose but then fell as much as nearly 5% after hours.
The core market concern revolves around capital allocation. During the subsequent earnings call, investor questions centered on three areas: the rationale for launching such a massive capital expenditure plan now, the sustainability of the 81% gross margin guidance, and Micron's precise position in NVIDIA's AI chip supply chain.
The scale of the capital budget sparked immediate apprehension. "We anticipate a significant increase in capital expenditures for fiscal 2027," stated Micron CEO Sanjay Mehrotra. Guidance for fiscal 2026 capital spending exceeds $25 billion, well above the roughly $22.4 billion analysts had expected. This rapid spending pace raised investor concerns about future free cash flow and potential overcapacity, with the high costs overshadowing strong sales projections.
Management laid out the industry logic behind the spending: in its view, the investment is imperative. Mehrotra explained that the increase is largely driven by capital expenditures for cleanroom facilities, including expansion of the Tongluo facility in Taiwan and US-based fabs. Insatiable AI-driven demand for High Bandwidth Memory (HBM) is consuming existing capacity. He further noted that bit growth per wafer from node transitions is decelerating, so technology upgrades alone can no longer meet market demand; substantial investment in new fabs and Extreme Ultraviolet (EUV) lithography equipment is required. With new capacity taking years to build, Micron must invest now to meet the AI demand it anticipates in 2027 and 2028.
To mitigate risks associated with massive investments, Micron is altering its business model. Sanjay Mehrotra announced the signing of the company's first five-year Strategic Customer Agreement (SCA), a significant shift from the typically one-year Long-Term Agreements (LTA) of the past. He explained that these multi-year agreements include specific commitments, providing greater visibility and stability for Micron's business model while ensuring supply guarantees for customers in an extremely tight environment.
The market is highly focused on Micron's progress in HBM and its partnership with NVIDIA. Sanjay Mehrotra confirmed that Micron began volume production and shipment of its HBM4 36GB 12-High (12H) product in the first quarter of fiscal 2026, specifically designed for NVIDIA's Vera Rubin architecture. "We expect to reach mature yields for HBM4 faster than we did for HBM3E," Mehrotra emphasized. Additionally, Micron is developing the next-generation HBM4E, expected to enter volume production in 2027, further solidifying its position in the high-end AI accelerator supply chain.
Micron's third-quarter gross margin guidance of up to 81% astonished Wall Street, and analysts repeatedly probed its sustainability and the underlying pricing logic during the Q&A session. CFO Mark Murphy's response pointed to a fundamental shift in the memory market: "AI is a transformative, long-term driver. AI requires not just more memory, but higher-performance memory." This challenges the traditional memory cycle, in which supply-demand mismatches drive price spikes that overcapacity then crashes. Mehrotra added an emphatic statement: "AI is not just increasing demand for memory; it is fundamentally reshaping memory into a decisive strategic asset for the AI era."
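The margin guidance implies a steep pricing move. As a back-of-the-envelope check (illustrative arithmetic only, not a company-disclosed figure): at a constant unit cost, price equals cost divided by (1 − gross margin), so moving from a 75% to an 81% gross margin implies roughly 32% higher selling prices.

```python
# Back-of-the-envelope check: what price change does a gross-margin move
# imply if unit cost stays constant? price = cost / (1 - gm).
# Illustrative arithmetic only; not a disclosed company calculation.
def implied_price_ratio(gm_from: float, gm_to: float) -> float:
    """Ratio of new price to old price at a fixed unit cost."""
    return (1 - gm_from) / (1 - gm_to)

ratio = implied_price_ratio(0.75, 0.81)
print(f"implied price multiple: {ratio:.3f}")  # ~1.316, i.e. ~32% higher
```

In practice cost per bit also moves with node transitions and mix, so the true price change differs; the point is that a six-point margin jump at these levels requires a pricing shift far larger than six percent.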
A "bucket effect" (a weakest-link dynamic) now governs the market: the computational bottleneck for large AI models has shifted from compute chips to memory bandwidth and capacity. Without faster, larger memory, AI cannot further reduce per-token computation costs or support more complex agentic AI reasoning. Demand is surging not only for HBM but also for traditional data center SSDs. The call revealed that Micron's data center NAND revenue doubled sequentially in the second quarter, driven by growing adoption of AI use cases such as vector databases and KV cache offload, with demand far exceeding available supply.
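KV cache offload, one of the NAND-demand drivers cited on the call, can be illustrated with a minimal sketch: recent attention key/value blocks stay in RAM while older ones spill to SSD-backed files and are reloaded on a miss. All names below are hypothetical; production inference engines are far more sophisticated, but the storage pattern is the same.

```python
# Minimal sketch of KV-cache offload: keep recent key/value blocks in
# RAM and spill the oldest blocks to SSD-backed files when a RAM budget
# is exceeded. Hypothetical illustration, not any vendor's actual code.
import os
import tempfile
import numpy as np

class KVCacheWithOffload:
    def __init__(self, max_in_ram: int):
        self.max_in_ram = max_in_ram              # block budget in RAM
        self.ram: dict[int, np.ndarray] = {}      # block_id -> KV tensor
        self.disk_dir = tempfile.mkdtemp(prefix="kv_offload_")

    def put(self, block_id: int, kv: np.ndarray) -> None:
        self.ram[block_id] = kv
        # Evict the oldest block to SSD once the RAM budget is exceeded.
        if len(self.ram) > self.max_in_ram:
            oldest = min(self.ram)
            np.save(os.path.join(self.disk_dir, f"{oldest}.npy"),
                    self.ram.pop(oldest))

    def get(self, block_id: int) -> np.ndarray:
        if block_id in self.ram:
            return self.ram[block_id]
        # Cache miss: reload the spilled block from SSD.
        return np.load(os.path.join(self.disk_dir, f"{block_id}.npy"))
```

Every cache miss turns into an SSD read, which is why this workload shifts the bottleneck to storage bandwidth and helps explain the surge in data center NAND demand described above.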
Regarding the severity of the shortage, Mehrotra was direct: "Supply is extremely tight, and it is tight across end markets." When pressed on actual customer allocation, he confirmed that the situation from last quarter persists: "For some key customers, in the medium term, we can only meet 50% to two-thirds of their demand." Supply-side constraints were repeatedly cited, including cleanroom limitations, long construction cycles, increased HBM wafer consumption ratio (trade ratio), slowing bit growth per wafer from advanced node transitions, and low inventory levels. Addressing margin sustainability, Murphy stated plainly, "We have indicated that market conditions are expected to remain very tight beyond 2026."
The company also provided a 2026 industry bit shipment outlook to support the "tight balance" narrative: DRAM bit shipments are expected to grow in the low-20% range (slightly above prior expectations), while NAND bit shipments are projected to grow approximately 20%.
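The bit outlook above connects to management's capacity argument: total bit growth is the product of wafer-start growth and bits-per-wafer growth, so when node transitions yield less bit growth per wafer, the gap must be closed with new wafer capacity. A small sketch (the 10% bits-per-wafer gain is an assumed figure for illustration; the ~22% bit growth reflects the article's "low-20% range" DRAM outlook):

```python
# If bit demand grows faster than bits-per-wafer from node transitions,
# the remainder must come from new wafer capacity:
#   (1 + bit_growth) = (1 + wafer_growth) * (1 + bpw_growth)
# The 10% bits-per-wafer gain below is an assumption for illustration;
# ~22% is the article's "low-20% range" DRAM bit-growth outlook.
def required_wafer_growth(bit_growth: float, bpw_growth: float) -> float:
    return (1 + bit_growth) / (1 + bpw_growth) - 1

print(f"{required_wafer_growth(0.22, 0.10):.1%}")  # ~10.9% more wafer starts
```

Under these assumed numbers, nearly half the bit growth would have to come from new wafer starts, which is the logic behind the cleanroom and fab spending management described.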
When asked whether high prices could suppress demand in consumer electronics such as PCs and smartphones, Mehrotra acknowledged a potential impact but affirmed that the configuration trend remains intact. He said price-sensitive markets "may see some demand affected by higher prices," but overall demand remains "quite strong." The company aims to maintain a diversified supply across end markets rather than concentrating solely on data centers. However, Micron offered a specific near-term assessment: in 2026, due to constrained DRAM and NAND supply, unit shipments of PCs and smartphones could decline by a low-double-digit percentage.
The medium-to-long-term mitigating factor is rising memory content per device. Micron highlighted several configuration upgrades driven by on-device AI: PCs with agentic AI capabilities recommend "at least 32GB" of memory, roughly double the current PC average; emerging personal AI workstations feature "128GB configurations"; and the share of flagship smartphones with 12GB or more of DRAM surged to nearly "80%" in the recent quarter, up from "less than 20%" a year ago.
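The smartphone share shift alone implies meaningful content growth even with flat unit shipments. A rough blend using the article's share figures (the per-tier capacities of 12GB and 8GB are assumptions chosen for illustration, not disclosed averages):

```python
# Blended flagship DRAM content implied by the share shift the article
# cites (>=12GB share: ~20% a year ago -> ~80% now). The per-tier
# capacities (12GB high tier, 8GB low tier) are illustrative
# assumptions, not disclosed averages.
def blended_content(share_high: float, high_gb: float = 12.0,
                    low_gb: float = 8.0) -> float:
    return share_high * high_gb + (1 - share_high) * low_gb

before = blended_content(0.20)  # ~8.8 GB average a year ago
after = blended_content(0.80)   # ~11.2 GB average now
print(f"content uplift: {after / before - 1:.0%}")  # ~27%
```

Under these assumptions, per-device DRAM content rises roughly a quarter year over year, which is how content growth can offset the unit-shipment declines discussed above.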