On November 17, 2025, Intel senior packaging engineer Shripad Gokhale demonstrated Xeon server chips to CNBC reporter Katie Tarasov at Intel's advanced packaging facility in Chandler, Arizona.
Over the past month, chip stocks have surged: Micron Technology rose 80%, Western Digital gained 52%, and Intel soared 85%, and those are just a few of the companies in the rally.
The core driver behind this chip stock surge is the evolution of artificial intelligence system architecture towards an "orchestration" model: AI computing workloads are no longer concentrated in a few large, centralized chip clusters but are distributed and scheduled across many parallel processing channels.
Under this new architecture, the market needs relatively more traditional central processing units (CPUs) compared with powerful graphics processing units (GPUs). It was precisely the surge in GPU demand during the first phase of the AI boom that fueled NVIDIA's steep stock price rise.
Although GPUs remain indispensable for core AI tasks such as model training and intelligent Q&A, Wall Street believes that as AI evolves towards Agentic AI, in which systems can better understand and execute generalized instructions, the chip demand structure will keep shifting towards the orchestration model.
**Institutional Perspective: Agentic AI Reshapes CPU-to-GPU Ratio**
Morgan Stanley analyst Shawn Kim and his team stated in a Monday investor research report: "Agentic AI will increase workloads related to system orchestration, memory scheduling, and tool invocation, raising the configuration ratio of CPUs relative to GPUs in AI systems. This does not diminish GPU demand but increases overall system complexity, while redirecting new infrastructure spending more towards CPUs, networking equipment, and memory chips."
**Industry Buzzword: Orchestration**
Tech giants are optimistic about the orchestration architecture, emphasizing that optimizing coordination and adaptation within existing infrastructure, rather than simply upgrading chip architecture, is the key to boosting AI computing power.
Meta Platforms, Inc. said in an April announcement that it would lease tens of millions of custom Graviton CPUs from Amazon Web Services: "No single chip architecture can efficiently handle all computing workloads. As Agentic AI development advances, computing demand is shifting towards more CPU resources."
Chipmaker Advanced Micro Devices (AMD) also made CPUs central to its orchestration strategy through a February partnership with Meta. According to reports, the $60 billion cooperation agreement stipulates that Meta will purchase $6 billion worth of AMD chips over the next five years and can acquire up to a 10% stake in AMD.
AMD said in a statement: "As AI infrastructure grows in scale and complexity, the CPU has become a strategic pillar of the AI computing stack, working in concert with GPUs to achieve efficient computing, elastic scaling, and intelligent orchestration."
**Cybersecurity Field Validates Orchestration Value**
The orchestration model has been shown to enhance AI capabilities at lower cost.
Last month, Anthropic's release of the Mythos large language model caused a stir in the cybersecurity field, after which the company restricted open access to the model. However, several research institutions have replicated similar capabilities by orchestrating and integrating multiple open-source mid-tier models.
Researchers from the Vdoc Security Lab said: "Using the open-source GPT-5.4 and Claude Opus 4.6 models combined with a standardized, chunked security-audit process, we replicated and improved on the results of their public cases without relying on Anthropic's internal technology stack, and the approach proved more practical."
The research team pointed out: "The focus is not on whether Mythos is more powerful, but that existing open-source models can already achieve comparable capabilities through orchestration."
Cybersecurity firm Aisle also adopted a similar approach: by orchestrating multiple small, low-cost models to run collaboratively, it successfully identified similar security vulnerabilities.
The institution stated: "Ordinary small and medium-sized models, with the help of a professional orchestration framework, can produce research results highly valued by the industry."
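The workflow these labs describe, splitting a target into chunks and fanning each chunk out to several inexpensive models whose findings are then merged on the CPU side, can be sketched in miniature. Everything below is illustrative: the function names (`chunk_source`, `audit`) are hypothetical, and the "models" are trivial stubs standing in for real LLM calls.

```python
# Minimal sketch of multi-model orchestration over a chunked security audit.
# All names are hypothetical; stub_model_a/b stand in for real model APIs.
from concurrent.futures import ThreadPoolExecutor

def chunk_source(source: str, max_lines: int = 2) -> list[str]:
    """Split a code listing into fixed-size chunks for independent review."""
    lines = source.splitlines()
    return ["\n".join(lines[i:i + max_lines]) for i in range(0, len(lines), max_lines)]

def stub_model_a(chunk: str) -> list[str]:
    # Stand-in for one small model: flags hard-coded credentials.
    return [f"hardcoded secret: {ln.strip()}" for ln in chunk.splitlines() if "password" in ln]

def stub_model_b(chunk: str) -> list[str]:
    # Stand-in for a second small model: flags shell-command sinks.
    return [f"possible injection: {ln.strip()}" for ln in chunk.splitlines() if "os.system" in ln]

def audit(source: str, models) -> list[str]:
    """Fan every chunk out to every model in parallel, then merge findings.

    The merge/dedupe step is the CPU-side 'orchestration' work: scheduling
    calls, collecting results, and reconciling them into one report.
    """
    chunks = chunk_source(source)
    findings: list[str] = []
    with ThreadPoolExecutor() as pool:
        for model in models:
            for result in pool.map(model, chunks):
                findings.extend(result)
    return sorted(set(findings))

code = 'password = "hunter2"\nimport os\nos.system(user_input)\nx = 1'
print(audit(code, [stub_model_a, stub_model_b]))
```

The point of the sketch matches the article's claim: the individual "models" can each be small and cheap, because the value comes from the orchestration layer that schedules them, a workload that runs on ordinary CPUs.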
An industry consultant put it bluntly to CNBC: equating AI computing power with GPUs alone is itself a misconception.
David Linthicum, former Chief Cloud Officer at Deloitte, said: "A common market misunderstanding is that doing AI necessarily requires GPUs, which is not the case. That perception may stem from NVIDIA's marketing. When training architects, I have always emphasized choosing the simplest viable technical solution and substituting CPUs for redundant GPU compute wherever possible."
**Other Beneficial Sectors**
The transition to an orchestration architecture is also bringing dividends to other parts of the data center supply chain, particularly the intermediate supporting fields that connect the various processing channels: electronic design automation, baseboard management controllers, chip substrate materials, and storage such as DRAM memory and NAND flash.
Morgan Stanley's Agentic AI research report on May 11 listed downstream beneficiaries including: KLA Corporation, Cadence Design Systems, Taiwan's Circuit Electronics, as well as global leading memory manufacturers such as Samsung, SK Hynix, Micron Technology, Western Digital, and Kioxia.