Chicago Fed President Austan Goolsbee has cast doubt on the prevailing narrative around AI-driven productivity gains, challenging the logic for interest rate cuts promoted by the Trump administration and the incoming Federal Reserve Chair.
Speaking at the Hoover Institution's annual monetary policy conference at Stanford University on Friday, Goolsbee warned that the widespread expectation of a productivity boom from artificial intelligence could itself push interest rates higher. Should this technological revolution disappoint, the outcome could be even worse: stagflation.
"The bigger the hype, the bigger the downside risk," Goolsbee stated.
Citing Chicago Fed survey data, he noted that economists, technology professionals, and the general public all expect AI to add roughly an extra percentage point of productivity growth per year over the next decade.
This widespread expectation, however, poses a risk of economic overheating. His remarks directly challenge the "AI-driven rate cuts" narrative advanced by incoming Fed Chair nominee Kevin Warsh and the Trump administration.
Warsh, who is expected to be confirmed by the Senate as the 17th Fed Chair on Monday, has previously stated that AI will usher in "the most productive wave in our lifetimes" and characterized it as a "structurally disinflationary" factor, suggesting it would give the Fed more room to cut rates.
U.S. Treasury Secretary Scott Bessent holds a similar view, comparing the current situation to "the budding stage of a productivity boom, not unlike the 1990s."
**Expectations Themselves Are the Risk**
Goolsbee's core argument is that the macroeconomic impact of productivity gains depends on whether they arrive as a "surprise" or are "already anticipated."
He explained that when productivity improves more than expected, inflation falls and interest rates can move lower. But when markets have already fully priced in the technological dividend, as the current AI enthusiasm reflected in financial markets and corporate balance sheets suggests, households and businesses may rush to increase spending and investment *before* the productivity gains materialize.
This "borrowing from the future" behavior would cause current economic overheating, thereby pushing interest rates higher.
Using the 1990s tech boom as an example, he pointed out that the Federal Reserve under then-Chairman Alan Greenspan actually raised interest rates six consecutive times between 1999 and 2000, precisely to counter the pressure from demand being pulled forward by anticipated productivity gains.
Goolsbee said he finds it "somewhat difficult to understand" the reasoning of Warsh and others who cite the 1990s analogy as a basis for cutting rates.
**If AI Fails, Stagflation Risk Emerges**
When pressed by former St. Louis Fed President James Bullard on what would happen if AI productivity expectations fail to materialize, Goolsbee offered a more severe assessment.
He stated that if the market continues to expect a boom, persistently over-consuming and over-investing, and the technological dividend ultimately fails to arrive, the economy would then enter a recession against a backdrop of overheated demand and persistently high inflation.
"You could easily get stagflation. This is not a bubble; this is fundamentals," he said.
Goolsbee also listed several leading indicators he is monitoring:

- The wealth effect from housing prices boosting consumer spending.
- The surge in data center construction driving up land and chip costs, with spillovers already reaching industries unrelated to AI.
- Changes in the number of workers exiting the labor market in anticipation of increased future wealth.
**Internal Divergence: Other Voices Challenge This Logic**
Goolsbee's assessment is not without dissent. At the same forum, Fed Governor Christopher Waller challenged his core argument.
Waller stated that the wealth effect channel described by Goolsbee "has been in a lot of models for a long time" but "has not consistently shown up in the data." He added that if real-world factors—such as households' difficulty in easily mortgaging future income or more gradual spending adjustments—are incorporated into models, the effect is significantly diminished.
Atlanta Fed visiting scholar Steven Davis raised concerns from another angle. Citing recent Atlanta Fed analysis, he noted that the average AI investment spending by firms is 14 times the median, indicating this investment boom is highly concentrated among a few companies, not widely diffused.
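The gap Davis describes is the signature of a heavily right-skewed distribution: a few very large spenders pull the mean far above the median. A toy illustration (the figures below are hypothetical, chosen only to reproduce a 14x ratio, and are not the Atlanta Fed data):

```python
from statistics import mean, median

# Hypothetical AI-investment outlays for ten firms, in millions of dollars.
# One outsized spender dominates the total; the typical firm spends little.
spending = [1, 1, 2, 2, 3, 3, 4, 5, 6, 393]

avg = mean(spending)    # pulled up by the single large spender
mid = median(spending)  # reflects the typical firm

print(f"mean = {avg:.1f}, median = {mid:.1f}, ratio = {avg / mid:.0f}x")
```

A mean-to-median ratio this large says the "boom" in the average is driven by a handful of firms rather than broad diffusion, which is exactly the concentration Davis flags.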
University of Chicago economist Luigi Zingales offered yet another perspective: a New York Fed survey shows a growing share of respondents expect to lose their jobs to AI, which could raise the savings rate rather than pull consumption forward. That dynamic points in the opposite direction from Goolsbee's concern, and Goolsbee acknowledged it could indeed lead to the opposite conclusion.