By Sean McLain and Sebastian Herrera
Sell the thing everyone wants, and charge less for it: For Amazon.com, that's been a winning formula for many years.
Amazon's new artificial intelligence czar, Peter DeSantis, thinks the company can use it in the AI race, where it needs a win after falling far behind competitors in developing cutting-edge models and popular applications for consumers and businesses.
"AI has a cost problem," he said. "If we ultimately want AI to transform everything, the costs have to be different."
Amazon, he said, has the expertise and the infrastructure to achieve that. Under DeSantis, Amazon plans to use its in-house chips to develop AI models more cheaply than competitors. It's betting on strong demand for enterprise AI products that make up for their lack of all-purpose power with task-specific customization. And it's still hoping that the large language model technology that powers chatbots can supercharge its Alexa-branded smart home offerings.
In a fiercely competitive -- and expensive -- race for AI superiority, OpenAI, Anthropic and Google are considered by some observers to be closing in on humanlike artificial intelligence. In creating its AI unit in 2023, Amazon signaled similar ambitions, choosing "Amazon AGI" as its name in a nod to the concept of artificial general intelligence. Amazon's flagship Nova model has lagged behind others in capability, according to independent benchmarking firms.
In December, Amazon said its chief AI scientist, Rohit Prasad, would leave the company and his responsibilities would be handed over to DeSantis, a widely respected nearly 28-year veteran who spearheaded cloud computing and silicon chip-making operations, among other accomplishments.
AI is hitting the mainstream: In a recent National Bureau of Economic Research survey of 6,000 executives, 69% said their companies use it in some form. Amazon aims to focus on building AI products that are useful to business customers as more CEOs push workers to adopt these tools.
In his first interview since starting the new job, DeSantis said Nova is at a level where he can speed up its development and make it cheaper. He argues that will persuade companies to choose Amazon for AI, because the high cost of chips and model training leaves many business customers feeling the price outweighs the benefits.
A low-cost approach worked for batteries and surge protectors. Why not Amazon Basics for AI?
Inside Amazon, DeSantis is seen as a larger-than-life figure, a product of his reputation for technical acumen and his status as one of the last remaining members of the company's old guard, said people who have worked with him.
He has been with Amazon since 1998, four years after Jeff Bezos founded the company in his Bellevue, Wash., garage. He is credited as one of the leaders who helped launch the major infrastructure underpinning the vast network of Amazon Web Services data centers.
Earlier in his Amazon tenure, DeSantis would at times get into shouting matches in meetings with senior executives over the direction of the cloud business, according to people who witnessed the confrontations.
Such heated disagreements historically have been considered healthy at Amazon, where "Have Backbone; Disagree and Commit" is among its leadership principles.
In announcing the leadership change, Amazon Chief Executive Andy Jassy referred to DeSantis's "track record of solving problems at the edge of what's technically possible."
"I enjoy thinking big about what we can do, and then forcing myself and the team to iterate on balancing like the biggest thing we can think of with what we know we can achieve, and finding that happy middle point," DeSantis said. "The art is to tiptoe up to the edge of what's possible."
He has his work cut out for him.
Amazon had a head start in AI with the release of the Echo smart speaker with Alexa in 2014.
Still, when ChatGPT arrived in late 2022, Amazon was caught flat-footed. Prasad, the former AI leader, held several emergency meetings with staff to formulate a plan to compete, people familiar with the matter said.
Google and Microsoft, Amazon's competitors in the cloud-computing and enterprise software businesses, had been investing billions in large language model development for years -- Google in its own models, and Microsoft through its partnership with OpenAI.
"Amazon was slower to realize the importance of generative AI," said Lloyd Walmsley, a senior analyst at Mizuho covering the tech industry.
Amazon in 2023 appointed Prasad, an executive who was instrumental in the creation of Alexa, to oversee the AGI organization. It rolled out the first version of Nova in December 2024.
But his group, which largely consisted of several thousand employees moved over from the Alexa organization, struggled to create breakthroughs in AI models and meet internal deadlines, The Wall Street Journal previously reported.
The company has released several AI-powered products, including its Rufus shopping chatbot on Amazon.com, Amazon Q workplace assistant, and most notably, its revamped Alexa+ assistant. Some are powered by its Nova models, though the most technical requests for Alexa, for example, are handled by Anthropic's models. An Amazon spokeswoman said more than 70% of Alexa queries are handled by variants of the company's Nova model. Over 300 million people used Rufus in 2025, the company has said.
Amazon's latest AI model, Nova 2, performs better against competitors, the company says.
The war for AI talent is fierce. Amazon's average base pay for software engineers and research scientists is lower than at Meta, OpenAI, Apple and Anthropic, according to market-research firm Levels.fyi. Amazon is also contending with the aftermath of two rounds of job cuts in six months, in which around 30,000 white-collar workers were laid off.
Over the years, Amazon has lost some top AI personnel as the market for talent has heated up. On Tuesday, the head of Amazon's AGI Lab, David Luan, announced he was leaving the company. The AGI Lab, which works on AI agents, will continue to operate and report to DeSantis, an Amazon spokeswoman said.
DeSantis said he was confident in Amazon's current staff and believed the company can continue to attract top talent.
He laid out a strategy that is less about shipping new bleeding-edge AI models every few months, as OpenAI and Anthropic have been doing, than about giving customers more cost-effective ways to meet their AI needs while keeping their technology up-to-date. Splashy releases, he said, are "kind of how you stay in the news, how you get to headlines but possibly doesn't really move the needle for any one particular use case."
An Amazon spokeswoman said the company aimed to drive down the cost of developing AI models and that Nova combined state-of-the-art technology with industry-leading cost efficiency.
The strategy starts with Amazon's homegrown AI chips, called Trainium and Inferentia, which are specialized, respectively, for training models and querying them for results. Amazon says that because the chips are purpose-built for those tasks, they are up to 50% cheaper than comparable offerings from competitors.
"If we can build our models on our chips, we can build them at a fraction of the cost of a pure-play AI model provider," DeSantis said.
Enterprise customers that require specialized capabilities can build their own, customized gen-AI models through Amazon's Nova Forge product rather than pay for premium versions of ChatGPT, Claude or Gemini, Amazon says.
Some enterprises are increasingly drawn to more specialized models like Nova because they are cheaper, faster and can be tailored to specific workloads for industry-specific tasks such as automating threat detection in cybersecurity, said Tim Crawford, a tech consultant and longtime chief information officer who counts Amazon, Google and IBM as clients.
Some CIOs are asking " 'what's the outcome and what's the price I'm paying for that?' Because it's more about value," he said.
Boston-based drug-discovery firm Nimbus Therapeutics has been experimenting with Nova to help the company discover molecules that could become future medicines. The company tested a number of AI models before settling on Amazon's, in part because it was easier to train, cheaper than competitors and returned results quickly, said Leela Dodda, the company's director of computational chemistry.
"One thing that really struck us was how cheap Nova was," Dodda said. In the company's tests, the results were as accurate as a version of Anthropic's Claude at one-tenth the price, he said.
The amount Amazon itself spends on AI has come into sharp focus after the company said it would spend $200 billion on capital outlays this year, roughly what the company spent in the two preceding years, mostly on AI infrastructure.
The acceleration in spending has alarmed some investors who worry that Amazon is shipping money out more quickly than it can bring in customers to pay for AI-related services, causing the company to start burning cash. Analysts predict Amazon will burn around $9 billion in the first quarter alone.
Amazon's share price has declined around 8% since January as investors questioned whether the spending was resulting in enough new business for the tech giant.
DeSantis said he has heard those concerns before.
In Amazon's early days, critics thought it would be too expensive for the company to compete with big-box stores, he said. The same thing happened when Amazon started investing in data centers for AWS.
"It's going to sink Amazon, it's not going to work," DeSantis recalled analysts saying. "I don't think anybody's thinking either of those two things anymore."
Write to Sean McLain at sean.mclain@wsj.com and Sebastian Herrera at sebastian.herrera@wsj.com
(END) Dow Jones Newswires
February 27, 2026 05:30 ET (10:30 GMT)
Copyright (c) 2026 Dow Jones & Company, Inc.