On January 20, Hugging Face, the world's largest AI open-source community, published an in-depth article titled "One Year After the 'DeepSeek Moment'," detailing how Chinese AI players have reshaped the global open-source ecosystem over the past year.
The article points out that the release of DeepSeek-R1 in January 2025 became a watershed moment for the industry: by lowering technical and adoption barriers, it marked not only a turning point for China's AI development but also a profound shift in the global open-source model, driving a broad rise in the downloads and influence of Chinese models.
Over the past year, giants such as Baidu, Alibaba, and Tencent, along with startups such as Moonshot AI, have substantially increased their open-source investment, and downloads of Chinese models on Hugging Face now surpass those of US models. Although the West is seeking alternatives, a growing number of global startups and researchers build on open-source models developed in China, indicating that Chinese AI has become deeply embedded in the global supply chain.
The "DeepSeek Moment": Breaking Three Barriers As the most crucial collaboration platform for global AI developers, Hugging Face's article first revisits the market conditions prior to the release of DeepSeek R-1. The article describes, "Before R1, China's AI industry was primarily focused on closed-source models... for most companies, open-source was not the default choice." However, the emergence of R-1 changed this landscape.
The article's author argues that the true significance of R1 lies not in whether it was the most powerful model at the time, but in how it lowered three critical barriers. First is the technical barrier: R1 turned advanced reasoning capabilities into downloadable, fine-tunable engineering assets, making "reasoning begin to behave like a reusable module." Second is the adoption barrier: the MIT license allowed the model to move quickly into production environments, shifting community discussions from "which model scores higher" to "how do we deploy it." Most importantly, it removed the psychological barrier. The article states, "When the question shifted from 'Can we do this?' to 'How do we do this well?', the decisions of many companies changed." Hugging Face credits the release with having "given China's AI development something extremely precious: time," proving that even with limited resources, rapid progress can still be achieved through open source and fast iteration.
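To make the "reusable module" point concrete, here is a minimal sketch (not from the Hugging Face article) of how a developer might pull an open-weight reasoning model from the Hub with the transformers library. The repo id, a small distilled R1 variant, and the generation settings are illustrative assumptions, not the article's example.

```python
# Minimal sketch: loading an MIT-licensed, open-weight reasoning model from the
# Hugging Face Hub and running it locally. Repo id and settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # small distilled variant, chosen for illustration

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Because the weights are downloadable, the same checkpoint can be fine-tuned,
# quantized, or deployed behind an internal API rather than accessed only via a
# closed-source vendor.
messages = [{"role": "user", "content": "Explain why the sky is blue in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```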
The release of DeepSeek-R1 made open source no longer merely a tactical choice but a long-term strategy for Chinese tech companies. The article emphasizes that over the past year, China's AI development model has undergone a fundamental shift, moving rapidly from an early focus on closed source to being dominated by open source.

Giants Enter the Arena and Strategic Reconfiguration

As open source entered the mainstream, the strategies of Chinese tech companies changed significantly. Hugging Face points out that, compared with 2024, the period following R1's release saw a new pattern in China's AI landscape: "Large tech companies took the lead, startups followed closely, and companies in vertical industries also increasingly entered this field."
Giants like Baidu, ByteDance, and Tencent, along with startups like Moonshot AI, have entered the fray, and the top-ranked models on Hugging Face are no longer monopolized by US developers. The article cites data to support this trend: Baidu's releases on Hugging Face increased from zero in 2024 to over 100 in 2025, while releases from ByteDance and Tencent grew eight- to ninefold. Moonshot AI's release of Kimi K2 is regarded as "another DeepSeek moment." The article also argues that the focus of competition has shifted from pure model performance to the ecosystem. Using Zhipu AI's GLM and Alibaba's Qwen as examples, it notes that these companies not only release model weights but also build engineering systems and ecosystem interfaces, stating bluntly, "At this stage, winning is no longer just about comparing raw model performance. Competition is increasingly focused on ecosystems, application scenarios, and infrastructure."
"Coordination Under Constraints" and Market Dominance Hugging Face's article further proposes a compelling viewpoint: the collective rise of Chinese AI players stems not from coordinated agreements, but from shared constraints. The article writes, "What looks like cooperation is best understood as alignment under shared technical, economic, and regulatory pressures." Under the common pressures of limited computing power and cost control, companies began competing on similar technical foundations and engineering paths. This isomorphism gives the ecosystem the ability to self-replicate and expand.
This strategy has yielded significant results in market data. The article discloses, "Among new models (<1 year old), the download volume of Chinese models has surpassed that of any other country, including the United States." Hugging Face's heatmap data also shows that from February to July 2025, the open-source releases from Chinese companies became noticeably more active.
For the market, this signifies that China's AI industry has evolved from a simple competition over model parameters into a contest of system-level engineering capabilities with greater potential for commercial deployment.

Global Reaction: A Mix of Dependence and Catch-Up

The article concludes by analyzing the global market's reaction to the rise of Chinese AI. Although players in the US and France (such as Mistral) are accelerating open-source releases to stay competitive, the influence of Chinese models has already penetrated to the foundational level.
Hugging Face reveals a key fact: "Globally, startups and researchers using open-weight models often default to, or even depend on, models developed in China." The article illustrates this by pointing out that Deep Cogito v2.1, a leading US open-weight model released in November 2025, is actually a fine-tuned version of DeepSeek-V3.
Simultaneously, the article mentions that DeepSeek has been widely adopted in global markets like Southeast Asia and Africa, with its multilingual support and cost advantages being key factors for enterprise use. Facing this situation, the US launched the ATOM (American Truly Open Model) project, which explicitly cites the momentum of DeepSeek and Chinese models as motivation, calling for a coordinated effort in open-weight model development. The article concludes, "The world is still reacting, kicking off a new wave of open-source fervor." Looking ahead to 2026, Hugging Face anticipates more major releases from both China and the US, with architectural trends and hardware choices becoming the focus of the next phase.