Nvidia vs. Everybody Else: Competition Mounts Against the Top AI Chip Company -- WSJ

Dow Jones 11:00

By Robbie Whelan

For a decade, one company has maintained a near-total stranglehold on the business of selling the advanced computer chips that power machine learning and artificial intelligence: Nvidia.

Armed with the most advanced blueprints for graphics processing units, or GPUs, and helped by the rapid pace of innovation at Taiwan Semiconductor Manufacturing, the contract fabricator that makes 90% of the world's advanced AI chips, Nvidia has become synonymous with AI processors.

That's starting to change. New entrants to the AI chip-design business, including Google and Amazon, are talking about selling their most advanced chips, which rival Nvidia's GPUs in power and efficiency, to an array of outside customers.

Smaller rivals like Advanced Micro Devices, Qualcomm and Broadcom are introducing products aimed squarely at AI data-center computing. Even some of Nvidia's biggest customers, like ChatGPT-maker OpenAI and Meta Platforms, are beginning to design their own custom chips, presenting a fresh challenge to the company's ubiquity.

While it's unlikely that Nvidia will see a mass exodus of customers, efforts by AI firms to diversify their suppliers could make it harder for the market leader to generate the superlative sales growth investors have grown accustomed to seeing.

The landscape is shifting rapidly. Each passing week seems to come with a new, massive tech infrastructure deal or the release of a new generation of powerful AI chips. Here's a rundown of the major companies jostling for position in the fast-growing market for AI chips.

The Top Dog

Nvidia's dominance in AI computing power has made it the most valuable company in the world and propelled its leather-jacket-clad Chief Executive Jensen Huang to celebrity status. Investors parse Huang's every word and look to the company's quarterly earnings as a barometer of the overall AI boom.

Nvidia likes to describe its business as more than just chips, emphasizing that it offers "rack-scale server solutions" and calls the data centers that use them "AI factories." But the basic product that Nvidia offers -- accelerated computing -- is the same one that all AI firms want.

From February through October, Nvidia sold $147.8 billion worth of chips, network connections and other hardware underpinning the explosive growth of AI. That's up from $91 billion in the same period a year earlier.

In July, Nvidia surpassed $4 trillion in market value, the first company on the planet to do so. Five months later, it briefly topped $5 trillion, before fears of a bubble swept through the AI industry. Nvidia's share price, like that of most of its rivals, fell a bit closer to earth. Even with the correction, the company is worth more than twice as much as its nearest competitor, Broadcom, which is valued at $1.8 trillion.

Nvidia had humble beginnings. In what is now the stuff of corporate legend, Huang, Curtis Priem and Chris Malachowsky -- three friends, all of them electrical engineers -- founded the company in 1993 over Grand Slam breakfast plates at a Denny's in San Jose, Calif.

Their original goal was to develop chips that could produce more realistic 3-D graphics for personal computers. Unlike the central processing units, or CPUs, that power most PCs, GPUs are capable of parallel computing: They can perform millions or billions of simple tasks simultaneously. Originally used by videogame developers, Nvidia's GPUs were perfect for deep learning and AI, the company later realized.
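The parallel-computing idea can be sketched in a few lines of code. The sketch below is purely illustrative (the function names are invented, and real GPU programs would use CUDA or a similar toolkit): the "kernel" function is the kind of per-element operation a GPU runs simultaneously across thousands of threads, one thread per data element, while a CPU-style loop works through the elements one at a time.

```python
def serial_scale(values, factor):
    # CPU style: one element at a time, as a single processor core would.
    out = []
    for v in values:
        out.append(v * factor)
    return out

def gpu_style_kernel(thread_id, values, factor, out):
    # On a GPU, this body runs concurrently on thousands of threads,
    # each handling the single element identified by its thread_id.
    out[thread_id] = values[thread_id] * factor

def parallel_scale(values, factor):
    # Simulate launching one "thread" per element. (Here the launches run
    # sequentially; real GPU hardware executes them at the same time.)
    out = [0] * len(values)
    for tid in range(len(values)):
        gpu_style_kernel(tid, values, factor, out)
    return out

data = [1.0, 2.0, 3.0, 4.0]
assert serial_scale(data, 2.0) == parallel_scale(data, 2.0) == [2.0, 4.0, 6.0, 8.0]
```

Graphics workloads fit this pattern because each pixel can be computed independently; deep learning fits it for the same reason, since training a neural network boils down to enormous numbers of independent multiply-and-add operations.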

In 2006, Nvidia released CUDA, its proprietary software library that allows developers to build applications using the company's chips and make them run faster. As the AI gold rush took hold, thousands of developers became locked into Nvidia's ecosystem of hardware and software.

Nvidia has quickened its cadence for releasing each new generation of advanced AI chips. Late last year, it started shipping its Grace Blackwell series of servers -- its most powerful AI processors yet, employing its most advanced chips -- and sold out almost instantly. At an October conference in Washington, D.C., Huang said the company had sold six million Blackwell chips so far in 2025 and had orders for 14 million more, in total representing half a trillion dollars in sales.

Challenges remain. Nvidia has been effectively banned from selling its chips in China for the last three years, a problem because Huang insists the rival superpower is home to half the world's AI developers. Without the billions in sales that Chinese customers represent, the company's growth will be constrained, and China's tech sector will likely become accustomed to working with homegrown chips instead.

Nvidia faces increased pressure at home now, too.

The Rival Designers

AMD made a critical change of course three years ago, setting up a classic David-vs.-Goliath challenge to Nvidia.

As it became clear that demand for advanced AI processors was skyrocketing, AMD CEO Lisa Su told her board that she planned to reorient the entire company around AI. She predicted the "insatiable demand for compute" would continue. The wager has so far paid off handsomely: AMD's market cap has nearly quadrupled to more than $350 billion, and the company recently inked major deals to supply chips to OpenAI and Oracle.

Another chip designer, Broadcom, once a division of Hewlett-Packard, has also emerged as a formidable competitor. It expanded into a $1.8 trillion leviathan through a series of big-ticket mergers. Broadcom now produces custom chips called XPUs, which are designed for specific computing tasks, and networking hardware that helps data centers stitch together huge racks of servers.

Intel, one of the original Silicon Valley titans, has fallen on hard times. It mostly missed out on the AI revolution because of a series of strategic errors, but it recently invested heavily in both its design and manufacturing businesses and is courting customers for its advanced data-center processors.

Qualcomm, which is best known for designing chips for mobile devices and cars, saw its stock jump 20% after its October announcement that it would launch two new AI accelerator chips. The company said the new AI200 and AI250 are distinguished by their high memory capacity and energy efficiency.

The Giant Interlopers

In recent weeks, competition intensified. Armed with mountains of cash from other business lines, Alphabet's Google unit and Amazon's cloud-computing segment, Amazon Web Services, have invested in AI chips and are seeing increased demand for them from third-party customers as well.

For more than a decade, Google has designed chips known as tensor processing units, or TPUs, for internal use. The company first made them available to third parties in 2018, but for several years they weren't widely sold to large customers. Now, giants including Meta, Anthropic and Apple either buy access to TPUs to train and run their models, or are in talks to do so.

In late November, Dylan Patel, founder of influential AI infrastructure consulting firm SemiAnalysis, mused that the growing popularity of Google's chips might mean "the end of Nvidia's dominance."

Amazon, meanwhile, is expanding a data-center cluster for Anthropic that will eventually have more than one million of Amazon's Trainium chips, and AWS just launched broader sales of chips that it says are faster and use far less energy than Nvidia's equivalents.

The Do-It-Yourselfers

Even Nvidia's customers are starting to eat into its dominance by developing their own application-specific integrated circuits, or ASICs. This class of chips, co-designed by AI companies and the big silicon firms, is optimized for highly specific computing tasks.

OpenAI and Broadcom recently struck a multibillion-dollar partnership to develop custom chips to serve the ChatGPT-maker's computing needs. A few months ago, Meta announced it would acquire chip startup Rivos to boost its efforts to develop in-house AI training chips.

Microsoft's chief technology officer said in October that the company plans to rely more heavily on its own custom accelerator chips in its data-center business. And over the summer, Elon Musk's xAI posted job listings for chip designers to help with "designing and refining new hardware architectures" to assist in AI model training.

Most industry watchers say it's unlikely Nvidia will lose its dominant market position, and Nvidia argues that its computing systems are more flexible and have broader uses than custom chips. But with demand rising rapidly, it's no longer the only game in town.

Write to Robbie Whelan at robbie.whelan@wsj.com

(END) Dow Jones Newswires

December 05, 2025 22:00 ET (03:00 GMT)

Copyright (c) 2025 Dow Jones & Company, Inc.
