Samsung Electronics reported a staggering 756% year-on-year surge in operating profit for the first quarter of 2026, driven by robust demand for memory chips fueled by the artificial intelligence boom. The company not only achieved a record high but also disclosed during its earnings call that conventional DRAM is currently more profitable than HBM (High Bandwidth Memory).
Boosted by sustained demand for generative AI and supercomputing capabilities, Samsung's Q1 2026 results showed record total revenue of 133.9 trillion Korean won (approximately $90.1 billion) and operating profit of 57.23 trillion won, up 756% year on year.
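The dollar figures can be sanity-checked with simple arithmetic. The exchange rate below (about 1,485 won per dollar) is an assumption for illustration; the article reports only the won amounts.

```python
# Sanity-check the won-to-dollar conversion.
# The exchange rate is an assumed figure, not from the earnings release.
krw_per_usd = 1485.0       # assumed average KRW/USD rate
revenue_krw = 133.9e12     # 133.9 trillion won, from the release
op_profit_krw = 57.23e12   # 57.23 trillion won, from the release

print(f"Revenue: ${revenue_krw / krw_per_usd / 1e9:.1f} billion")          # ~$90.2 billion
print(f"Operating profit: ${op_profit_krw / krw_per_usd / 1e9:.1f} billion")  # ~$38.5 billion
```

At any plausible recent exchange rate, 133.9 trillion won is on the order of $90 billion, not $900 billion.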
The standout performance was primarily driven by the memory business riding the AI wave. According to the conference call, the semiconductor division (DS) saw operating profit skyrocket to 53.7 trillion won, an extraordinary 48-fold year-on-year increase, contributing 93.9% of the company's total operating profit. Memory product sales surged 292% from a year earlier.
Beyond these impressive figures, the key takeaway from the call was the unusual "profit margin inversion" between HBM and conventional DRAM, alongside significant customer anxiety over future production capacity.
Following the earnings release, Samsung's stock experienced volatility, closing down over 2%.
"Conventional DRAM is currently more profitable than HBM," emerged as a central point of discussion during the call. When analysts questioned whether shifting the product mix toward conventional DRAM could yield higher short-term profits, management provided a rare direct response: "It is true that the profit margin for conventional DRAM is indeed higher than for HBM."
Management elaborated on the structural reasons behind this profit inversion. HBM product pricing is pre-determined annually, whereas conventional DRAM prices are negotiated quarterly. With conventional DRAM prices rising substantially each quarter and HBM annual contract prices fixed, a margin disparity has emerged.
"According to industry practice, for HBM, considering the lead time required for backend capacity preparation, we pre-negotiate expected pricing on an annual basis. Conventional DRAM is priced quarterly, and as its price continues to rise significantly each quarter, this creates a profit margin inversion between HBM and conventional DRAM," management explained.
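The mechanics management described can be sketched numerically. All prices and costs below are invented for illustration; the call disclosed no such figures. The only inputs are the two stated facts: HBM pricing is fixed annually, while conventional DRAM is repriced quarterly against sharply rising prices.

```python
# Hypothetical illustration of the HBM vs. conventional DRAM margin inversion.
# Every number here is an assumption for illustration, not a disclosed figure.

hbm_price = 100.0   # fixed for the whole year under the annual contract
hbm_cost = 70.0     # assumed per-unit cost -> 30% margin all year

dram_cost = 40.0                          # assumed per-unit cost, held flat
dram_prices = [55.0, 65.0, 78.0, 94.0]    # assumed ~20% quarterly price increases

hbm_margin = (hbm_price - hbm_cost) / hbm_price
for quarter, price in enumerate(dram_prices, start=1):
    dram_margin = (price - dram_cost) / price
    print(f"Q{quarter}: DRAM margin {dram_margin:.0%} vs. HBM margin {hbm_margin:.0%}")
```

Under these assumed numbers, conventional DRAM starts the year below HBM's margin but overtakes it from Q2 onward, since the HBM price cannot adjust until the next annual negotiation.
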
However, management clearly stated that Samsung would not significantly shift focus to conventional DRAM for short-term gains: "Concentrating our product mix on conventional DRAM solely for short-term performance could potentially constrain the construction of AI infrastructure itself—this is why we believe a supply balance between HBM and conventional DRAM is necessary."
A timeline was provided: the profit margin gap between HBM and conventional DRAM is expected to narrow significantly by 2027, with the proliferation of inference services and agentic AI.
The call also highlighted intense capacity concerns triggered by AI infrastructure build-out, with predictions that the supply-demand gap will "widen further" by 2027. Samsung described the current supply chain as "extremely tight," with allocation rates at historical lows. Unlike previous years, customers concerned about shortages are already bringing forward their 2027 demand. Based on bookings alone, the 2027 supply-demand gap is projected to be larger than this year's.
In response, hyperscalers are seeking mid- to long-term supply commitments. Samsung revealed it is pursuing multi-year supply agreements, which involve a higher level of binding commitment compared to previous arrangements based on mutual trust.
On the product front, Samsung is accelerating its high-end memory roadmap. The company announced it is the first globally to begin commercial shipments of HBM4 and has secured actual pricing premiums from customers for its 1c-nanometer process products. The HBM4 capacity it has prepared is fully sold out. Substantial expansion of HBM4 supply is expected in the second half of 2026, with HBM4 projected to exceed 50% of total HBM sales from Q3 onward. Sampling of next-generation HBM4e, offering bandwidth of up to 4.0 TB/s, will begin in Q2.
Notably, AI's impact on memory is extending from DRAM to NAND. As large language models require ever-greater data capacity, relying solely on expensive HBM and DRAM creates significant cost burdens. Citing the architecture Nvidia presented at its recent GTC, which extends AI inference data storage to NAND-based solutions, Samsung emphasized growing demand for high-performance PCIe Gen 6 SSDs. The company said its ultra-high-capacity and Gen 6 product lines are ready, and that it aims to secure leadership in the early Gen 6 market in the second half of the year.
In contrast to the semiconductor division's strong performance, Samsung's mobile experience (MX) division faces severe cost pressures due to soaring memory chip prices. Despite strong sales of the Galaxy S26 series, management acknowledged that cost pressures for key components are expected to intensify in Q2, making a decline in profitability seem inevitable. With the 2026 smartphone market shipment volume forecast to contract, Samsung will focus on super-premium products to sustain revenue growth.
Within the foundry business, Samsung is attempting to reduce its reliance on mobile. Management disclosed active negotiations with several major AI and HPC clients for 2-nanometer projects, expecting clearer outcomes with certain customers soon. Additionally, mass production for a major silicon photonics optical communication module manufacturer will begin in the second half of the year, marking a significant step into low-latency data center transmission technology.