A top Federal Reserve official has cast doubt on the popular narrative surrounding AI-driven productivity gains, challenging the logic for interest rate cuts promoted by the Trump administration and the Fed's incoming chair. On Friday, Chicago Federal Reserve Bank President Austan Goolsbee warned that the widespread expectation of AI-driven productivity gains could itself push interest rates higher. Should this technological revolution disappoint, the outcome could be even worse: stagflation.
Speaking at the Hoover Institution's annual monetary policy conference at Stanford University, Goolsbee stated, "The bigger the hype, the bigger the hazard." He cited Chicago Fed survey data indicating that economists, technology professionals, and the general public all expect to gain roughly an extra percentage point of productivity growth annually over the next decade. This widespread expectation, he argued, itself creates a risk of economic overheating.
This stance directly challenges the "AI-driven rate cut" narrative being pushed by incoming Fed Chair Warsh and the Trump administration. Warsh, who is reportedly expected to be confirmed by the Senate on Monday as the 17th Fed Chair, has previously stated that AI will unleash "the most productive wave in our lifetimes" and characterized it as a "structurally disinflationary" factor, implying the Fed would have more room to cut rates. U.S. Treasury Secretary Bessent holds a similar view, likening the current situation to "the budding stage of a productivity boom, not unlike the 1990s."
**Expectations Themselves Are the Risk**
Goolsbee's core argument is that the macroeconomic effect of productivity gains depends on whether they arrive as a "surprise" or are "already expected." He explained that when productivity improves more than expected, inflation falls, allowing interest rates to follow lower. However, when the market has already fully priced in the technology dividend—as the current AI enthusiasm is broadly reflected in financial markets and corporate balance sheets—households and businesses will rush to increase spending and investment *before* the productivity gains actually materialize. This "borrowing from the future" behavior will cause current economic overheating, thereby pushing interest rates *higher*.
Using the 1990s tech boom as an example, he noted that the Fed under then-Chairman Alan Greenspan actually raised rates six consecutive times between 1999 and 2000 precisely to counter the pressure from demand being pulled forward by expected productivity gains. Goolsbee said he found it "somewhat difficult to understand" the reasoning of Warsh and others who cite the 1990s analogy as a basis for cutting rates.
**If AI Fails, Stagflation Risks Emerge**
When pressed by former St. Louis Fed President James Bullard on what would happen if AI productivity expectations were not met, Goolsbee offered a more severe assessment. He stated that if the market continues to expect a boom and persistently borrows future consumption and investment, but the technological dividend ultimately fails to materialize, the economy could then fall into a recession against a backdrop of overheated demand and persistently high inflation. "You could easily get stagflation," he said. "This isn't a bubble; this is fundamentals."
Goolsbee also listed several leading indicators he is monitoring: the wealth effect from housing prices boosting consumer spending; the surge in data center construction driving up land and chip costs, with this spillover already affecting industries unrelated to AI; and changes in the number of workers leaving the labor force in anticipation of increased future wealth.
**Internal Divisions: Other Voices Question the Logic**
Goolsbee's judgment is not without dissent. At the same forum, Fed Governor Waller challenged his core premise. Waller stated that the wealth effect channel Goolsbee described has "existed in many models for a long time" but has "not consistently shown up" in the actual data. He added that incorporating real-world factors, such as households' difficulty in easily mortgaging future income or more gradual spending adjustments, would significantly weaken the effect.
Atlanta Fed visiting scholar Steven Davis raised concerns from another angle. Citing recent Atlanta Fed analysis, he noted that the average AI investment spending by firms is 14 times the median, indicating this investment boom is highly concentrated among a few companies and not widely diffused.
University of Chicago economist Luigi Zingales presented another perspective, pointing to a New York Fed survey showing that a growing share of households expect to lose their jobs to AI. That fear could raise the savings rate rather than pull consumption forward, pointing in the opposite direction of Goolsbee's concern. Goolsbee himself acknowledged that this dynamic could indeed lead to the opposite conclusion.