AMD Seeks Strategic Alliance with Samsung to Secure Critical HBM Supply for AI Chips

Stock News · 03-11

According to media reports citing informed sources, Advanced Micro Devices (AMD.US) CEO Lisa Su is scheduled to meet Samsung Electronics Chairman Lee Jae-yong in South Korea next week. The discussions will focus on establishing a partnership to secure the supply of High Bandwidth Memory (HBM), a critical component of artificial intelligence (AI) chips. For AMD, which is striving to capture share in the trillion-dollar AI chip market that Nvidia (NVDA.US) currently dominates with roughly 90% share, such a partnership may not immediately challenge Nvidia's CUDA ecosystem moat. But in an environment where demand for memory chips far outstrips supply, it offers a more immediate and practical advantage: supply chain certainty.

South Korean media reported on Wednesday, citing unnamed industry sources, that Lisa Su will arrive in South Korea on March 18 and plans to meet with key partners, including Lee Jae-yong and Naver CEO Choi Soo-yeon. Naver confirmed that a meeting between CEO Choi and AMD management is scheduled but declined to disclose the specific agenda. Samsung Electronics declined to comment.

The meeting between Su and Lee is highly anticipated against a backdrop of exploding demand for memory chips, including HBM, DRAM, and NAND. AMD, Nvidia, and other major tech companies are racing to build hyperscale AI data centers to supply the immense computational resources that AI training and inference systems require, and all of that capacity relies heavily on these memory chips. From Google's massive TPU compute clusters to Nvidia's vast AI GPU clusters, modern AI accelerators depend on HBM tightly integrated with the chips themselves. Beyond HBM, tech giants such as Google and OpenAI are also accelerating the construction or expansion of AI data centers, necessitating large-scale purchases of server-grade DDR5 memory and enterprise-grade high-performance SSD/HDD storage.

Unlike Seagate and Western Digital, which focus on nearline high-capacity HDDs, or SanDisk, which concentrates on high-performance eSSDs, the three major memory chip manufacturers—Samsung Electronics, SK Hynix, and Micron—hold key positions across multiple core memory segments: HBM, server DRAM (including DDR5/LPDDR5X), and high-end datacenter enterprise SSDs (eSSD). They are the most direct beneficiaries of the "AI memory and storage stack," collectively reaping the outsized dividends of the AI infrastructure build-out.

Informed sources indicated that Lisa Su is also expected to discuss broader prospects for AI compute infrastructure cooperation with Naver, South Korea's largest internet portal and search engine provider. South Korean media reported that the key areas of discussion between Su and Samsung's leader Lee will include expanding the supply of data center semiconductors, advancing Samsung's development of sovereign AI infrastructure built around AMD GPUs, and fostering active collaboration on next-generation computing technologies and advanced process chip manufacturing.

Her visit to Samsung is expected to coincide with the week of Nvidia's annual GTC developer conference. This highly anticipated technology event, hosted by Nvidia, will take place from March 16 to 19 in San Jose, California.

What does this "potential alliance" between AMD and Samsung Electronics mean for each party? In the current climate of memory chip shortages, this potential partnership is crucial for AMD to ensure the timely delivery of its complete AI systems. AMD has already begun to elevate its competitive focus to the level of entire rack-scale systems, similar to Nvidia's NVL72. Its Helios AI server cluster, slated for 2026, is essentially a direct competitor to Nvidia's "rack-scale AI infrastructure." At the CES (Consumer Electronics Show) in early 2026, a focal point for global technology investors, AMD is expected to launch its MI440X and MI455X AI GPUs for large enterprise data centers and showcase its high-end Helios rack-scale system. AMD has even previewed that its MI500 series AI GPUs, due in 2027, will deliver performance 1000 times that of its 2023 flagship products, aiming to further break Nvidia's dominance in the AI compute infrastructure market and compete for orders worth tens or even hundreds of billions of dollars.

For rack-scale AI compute products from AMD and Nvidia, HBM is not just ordinary memory; it is a core component that determines memory capacity, bandwidth, power consumption limits, and the overall system deployment timeline. By deepening long-term cooperation with Samsung, AMD would essentially be securing a more advanced HBM roadmap and supply certainty for its Helios rack-scale compute platform beyond the MI350. The ability of AMD's Helios to achieve true large-scale deployment largely depends on the continuous, stable, and timely availability of HBM.
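To see why HBM so directly caps what a rack-scale system can do, a back-of-envelope roll-up from per-stack specs to accelerator- and rack-level totals is useful. All figures below are illustrative assumptions (stack count, per-stack capacity and bandwidth, accelerators per rack), not confirmed AMD or Samsung specifications:

```python
# Back-of-envelope: per-stack HBM figures rolled up to accelerator and
# rack level. Every constant here is an illustrative assumption.

STACKS_PER_GPU = 8       # assumed HBM stacks per accelerator
GB_PER_STACK = 36        # assumed capacity per HBM3E-class stack (GB)
TBPS_PER_STACK = 1.2     # assumed bandwidth per stack (TB/s)
GPUS_PER_RACK = 72       # assumed accelerators per rack-scale system

# Per-accelerator totals scale linearly with the number of stacks the
# package can physically and thermally accommodate.
gpu_capacity_gb = STACKS_PER_GPU * GB_PER_STACK        # 288 GB
gpu_bandwidth_tbps = STACKS_PER_GPU * TBPS_PER_STACK   # 9.6 TB/s

# Rack totals: the memory footprint of every deployed rack is a fixed
# multiple of HBM stack supply, which is why stack availability gates
# the deployment timeline.
rack_capacity_tb = GPUS_PER_RACK * gpu_capacity_gb / 1024
rack_bandwidth_tbps = GPUS_PER_RACK * gpu_bandwidth_tbps

print(f"Per GPU:  {gpu_capacity_gb} GB, {gpu_bandwidth_tbps:.1f} TB/s")
print(f"Per rack: {rack_capacity_tb:.2f} TB, {rack_bandwidth_tbps:.1f} TB/s")
```

Under these assumptions, one 72-accelerator rack consumes 576 HBM stacks, which is the arithmetic behind the article's point: every rack shipped is a fixed, large multiple of stack supply.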

For Samsung Electronics, the strategic value of partnering with AMD could be even greater. AMD represents one of Samsung's best potential "breakthrough clients" in its challenge to SK Hynix's dominance of the HBM market. According to Macquarie statistics, SK Hynix held a 61% share of the HBM market in 2025, while Samsung trailed significantly at 19%. Although Samsung has begun shipping HBM4 to customers and claims its HBM4 delivers 22% higher performance than HBM3E, the market still largely views it as a follower to SK Hynix. As Samsung strives to close that share gap, AMD's value extends beyond mere orders: it provides a critical validation scenario. AMD has already adopted Samsung's HBM3E for its MI350, and if Samsung is involved more deeply in subsequent products, it can leverage AMD's high-performance AI GPU platform to demonstrate its yield rates, packaging integration capabilities, and mass-production stability under real-world, large-scale data center workloads.

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products; any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is for general information purposes only and does not consider your own investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may seek professional advice before investing.
