By Greg Ip
It was only a matter of time before the AI apocalypse theory went mainstream. Last weekend, a sensational report posited a future in which AI unleashes enough disruption and job destruction to bring on a deep recession and financial crisis. In response, the entire stock market sold off.
AI disruption makes news almost daily. On Thursday, payments company Block said it was laying off 4,000 employees, 40% of its workforce, because AI has "changed what it means to build and run a company," founder Jack Dorsey told shareholders. "Within the next year, I believe the majority of companies will reach the same conclusion."
Is this just the beginning? No one should dismiss any scenario, even the most dystopian, with high conviction. Certainly not journalists, whose way of life is in AI's crosshairs.
But I keep stumbling over one small problem with the doomsday vision: It requires a breakdown in how the market economy functions. Nothing like it has happened in the U.S. before, and there is no evidence it is happening now.
The thesis
Technology enables us to produce more or better products with fewer hours of work. Over time, this makes us richer. It's why we produce many times more food with far fewer farmers than 150 years ago and our factories crank out more products with a smaller workforce than in 1979.
Technological advancements always cost some people their jobs -- those whose skills can be easily substituted by tech. But their loss is more than offset through three other channels. The new technology enhances the skills of some survivors, who become more productive and better paid; it helps create new businesses and new jobs; and it makes some stuff cheaper, increasing consumers' incomes, adjusted for inflation, which can be spent on other stuff, generating yet more jobs.
These offsets explain why, through the sweep of U.S. history, technological advance hasn't, by itself, raised unemployment for the country as a whole.
The AI doomers claim this time is different. AI is happening faster and does far more than past technological revolutions. It could one day exceed human intelligence. "AI isn't replacing one specific skill. It's a general substitute for cognitive work...Whatever you retrain for, it's improving at that too," AI investor Matt Shumer wrote in a viral X post two weeks ago.
Citrini Research, in its fictional dispatch from 2028 that rocked the markets Monday, wrote: "It should have been clear all along that a single GPU cluster in North Dakota generating the output previously attributed to 10,000 white-collar workers in Midtown Manhattan is more economic pandemic than economic panacea...The human-centric consumer economy, 70% of GDP at the time, withered."
The evidence
If such a revolution were upon us, we should see some sign of it. We don't, at least not yet. The ranks of software developers, widely assumed to be acutely vulnerable to AI, were up 5% in January from a year earlier, a pace largely consistent with the past 23 years. That's according to Labor Department data analyzed by James Bessen, executive director of the Technology and Policy Research Initiative at Boston University.
The number of computer programmers, who assist developers in ensuring code runs properly, was down slightly in the last year, in line with a secular decline in place for decades. Neither trend shifted much after ChatGPT's arrival in late 2022. Competition from AI isn't forcing computer scientists to take pay cuts, either. In 2024, the median young computer science graduate earned 63% more than the typical young graduate, up from 47% in 2009, data from Connor O'Brien at the Institute for Progress shows.
Meanwhile, business spending on software leapt 11% in the fourth quarter of last year from a year earlier, the fastest pace in nearly three years. Bessen sees this as evidence that software demand is elastic, meaning that as the price per unit of performance falls, demand rises more than proportionally, so total spending grows.
This, Bessen notes, is in line with previous technological advances that drive prices down and demand up enough to offset direct job displacement. His examples include textile manufacturing in the 19th century, and the spread of ATMs in the 1980s.
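The elasticity argument is easy to check with arithmetic. Here is a minimal sketch, using made-up numbers (a halved price and an elasticity of -1.5 are illustrative assumptions, not figures from the article or from Bessen's data), of a constant-elasticity demand curve where a price drop raises total spending:

```python
def quantity_demanded(base_qty: float, price_ratio: float, elasticity: float) -> float:
    """Constant-elasticity demand: quantity scales with price_ratio ** elasticity."""
    return base_qty * price_ratio ** elasticity

base_qty, base_price = 100.0, 1.0
new_price = 0.5 * base_price  # assume AI halves the effective price per unit of performance

# With elasticity -1.5 (elastic, since |e| > 1), quantity more than doubles.
new_qty = quantity_demanded(base_qty, new_price / base_price, -1.5)

old_spending = base_qty * base_price  # 100.0
new_spending = new_qty * new_price    # quantity gain outweighs the price cut

print(round(new_qty, 1))       # ~282.8 units demanded, up from 100
print(round(new_spending, 1))  # ~141.4, so total spending rises ~41%
```

When demand is elastic, the quantity response outweighs the price decline, so total spending on software (and, by extension, demand for the people who build it) rises even as each unit gets cheaper. With inelastic demand (|e| < 1), the same price cut would shrink spending, which is the doomsday case.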
My favorite example: As the number of bookkeepers shrank with the introduction of spreadsheet software in the early 1980s, the number of accountants and financial analysts newly empowered by Lotus 1-2-3 and Excel rose even more.
A study by Erik Brynjolfsson of Stanford University and two co-authors has found early signs of an AI impact: employment of 22- to 25-year-olds in the most AI-exposed occupations, such as software developers and customer service agents, fell 6% in the three years after the introduction of ChatGPT, while that of older workers and workers in unexposed occupations rose.
But some critics say the drop could be explained by other factors, such as rising interest rates, that predate ChatGPT. Job postings for software developers jumped in the wake of the pandemic, then started to fall in early 2022, according to Indeed Hiring Lab.
Perhaps the advanced AI tools only now coming to market will change behavior in a way their predecessors didn't. The doomsday scenario envisions businesses ditching legacy systems and consumers turning over many of their tasks to AI "agents" almost overnight.
In reality, businesses are risk-averse and consumers creatures of habit. Radiologists were supposed to lose their jobs to offshoring, and then to AI. They didn't, because patients and providers like having humans around to explain their medical images. Since Google Translate launched in 2006, the number of human translator and interpreter employees in the U.S. has risen 73%.
Assume, though, that AI does destroy more jobs than it creates. Could the spillovers sink the entire economy? Almost certainly not. The money employers or consumers save as AI eliminates jobs doesn't disappear; it gets spent on something else. This is why a sector can be in recession while the overall economy grows.
China's entry into the World Trade Organization in 2001 cost the U.S. hundreds of thousands of manufacturing jobs in the following years. Oil and gas production jobs fell by a quarter after oil prices collapsed in 2014. And amid a spasm of bricks-and-mortar bankruptcies driven in part by e-commerce, retail employment fell by a quarter-million between 2017 and late 2019. In all three episodes, overall employment grew.
The real risk
Imagine a recession starts for some other reason. Employers could respond with AI-driven job cuts they were contemplating anyway, deepening the downturn.
Another possibility: Tech investment gets ahead of demand, precipitating a bust. Tech workers lost jobs in droves after 2001, not because the internet had made them obsolete, but because the internet-stock bubble had burst.
Today, the sums being plowed into data centers far exceed the revenue AI is currently generating. A bust that brings down the economy isn't my baseline. But at least it has a precedent, unlike the AI apocalypse that preoccupies folks now.
Write to Greg Ip at greg.ip@wsj.com
(END) Dow Jones Newswires
February 27, 2026 05:30 ET (10:30 GMT)
Copyright (c) 2026 Dow Jones & Company, Inc.