
‘Overly data-dependent’ — how the Fed and the markets keep getting it wrong

Federal Reserve officials have in recent months emphasized that interest rate decisions will be increasingly data-dependent. Fed Chair Jerome Powell has repeatedly noted that policy changes will be determined by incoming data, and he has avoided making explicit commitments.

Other major central banks have also jumped on the data-dependent monetary policy bandwagon. Earlier this year, European Central Bank President Christine Lagarde observed: “In current conditions, a robust strategy calls for a data-dependent approach to making policy and a clear reaction function so that the public understands the sources of information that will be important to us.”

Such pronouncements by central bankers lead forecasters to eagerly await every fresh batch of economic data. Bond market volatility has risen, as trading strategies are frequently revised based on expectations of how incoming data may influence central bank actions at upcoming meetings. To the general public, it may appear as if central bankers are undertaking ad-hoc monetary policymaking and essentially making it up as they go along.

Mohamed El-Erian captured the current state of affairs when he observed: “I think the major issue is that the market has become overly data-dependent, just like our central banks have become overly data-dependent. So, we’re not looking beyond the next data release because we’re worried about what will the Fed do in September, what will the ECB do in September.”

Central bankers and markets should be aware of the risks associated with an overly data-dependent approach to monetary policymaking. In 2019, Powell himself highlighted the basic challenge: “We must sort out in real time, as best we can, what the profound changes underway in the economy mean for issues such as the functioning of labor markets, the pace of productivity growth, and the forces driving inflation.”

Yet economic data are often unreliable. Official statistics undergo multiple, often substantial revisions. Consider, for instance, the widely watched nonfarm payroll numbers. Initial estimates of monthly job gains or losses are widely reported by the media, and financial markets and the Fed often react sharply to this major data point.

Less widely appreciated is that the payroll numbers undergo subsequent revisions, sometimes large ones that contradict the initial reading. The annual benchmark revision of 2022 data, for instance, indicated more robust employment figures than previously reported. In contrast, this year's early rounds of revisions have consistently been downward.

GDP data also undergo multiple revisions, which makes it hard to quickly ascertain overall macroeconomic conditions. For instance, the Bureau of Economic Analysis will soon release a comprehensive annual update of the national accounts, revising GDP data going all the way back to 2013. Furthermore, when estimates of aggregate economic activity provided by GDP and by gross domestic income (GDI) diverge sharply, an additional layer of uncertainty is introduced.

Richard Fisher, a former Dallas Fed president, once offered an example of the dangers involved. Data pointing to excessively low levels of inflation had prompted the Fed to keep rates low in 2002 and 2003. Subsequent revisions, he acknowledged, showed that “inflation had actually been a half point higher than first thought. In retrospect, the real fed funds rate turned out to be lower than what was deemed appropriate at the time and was held lower longer than it should have been.”
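The arithmetic behind Fisher's point is simple: the real policy rate is the nominal fed funds rate minus inflation, so an upward revision to inflation mechanically lowers the ex-post real rate by the same amount. The sketch below uses hypothetical numbers, not the actual 2002-03 figures, purely to illustrate the mechanism.

```python
# Illustrative sketch of Fisher's point: a 0.5-point upward revision to
# inflation lowers the ex-post real policy rate by exactly 0.5 points.
# All figures below are hypothetical, not the actual 2002-03 data.

nominal_fed_funds_rate = 1.25    # percent, assumed policy rate
inflation_first_estimate = 1.0   # percent, as initially reported
inflation_revised = inflation_first_estimate + 0.5  # revised up half a point

real_rate_perceived = nominal_fed_funds_rate - inflation_first_estimate
real_rate_actual = nominal_fed_funds_rate - inflation_revised

print(f"Real rate as perceived at the time: {real_rate_perceived:+.2f}%")
print(f"Real rate after the revision:       {real_rate_actual:+.2f}%")
# A perceived +0.25% real rate turns out to be -0.25%:
# policy was looser than intended.
```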

The result? Some have argued that the Fed's loose monetary policy in the early 2000s drove the U.S. housing bubble, whose collapse ultimately triggered the 2008 global financial crisis.

Another major dilemma associated with an overly data-dependent approach to monetary policymaking is that three of the most important parameters central bankers use to determine the long-run destination of the economy (r-star, the neutral real interest rate; u-star, the natural rate of unemployment; and potential GDP) are unobservable and time-varying. New York Fed researchers, for instance, have highlighted post-pandemic swings in both the short-run and long-run r-star.
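To see why an unobservable parameter matters so much, consider a standard Taylor-type policy rule, in which the prescribed rate moves one-for-one with the assumed r-star. The sketch below is a minimal illustration with hypothetical inputs, not the Fed's actual procedure.

```python
# A minimal sketch of parameter sensitivity: under a Taylor (1993)-style rule,
# the prescribed policy rate shifts one-for-one with the assumed r-star.
# All inputs are hypothetical.

def taylor_rule_rate(r_star, inflation, inflation_target=2.0, output_gap=0.0):
    """Prescribed nominal policy rate under a Taylor-style rule (percent)."""
    return r_star + inflation + 0.5 * (inflation - inflation_target) + 0.5 * output_gap

inflation = 3.0    # percent, assumed current inflation
output_gap = 0.5   # percent of potential GDP, assumed

for r_star in (0.5, 1.0, 1.5):  # a plausible range of r-star estimates
    rate = taylor_rule_rate(r_star, inflation, output_gap=output_gap)
    print(f"r-star = {r_star:.1f}% -> prescribed policy rate = {rate:.2f}%")

# A one-point disagreement over an unobservable quantity translates directly
# into a one-point difference in the "appropriate" policy rate.
```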

Structural shifts involving labor’s bargaining power, globalization, technology, and demographics can also play havoc with fundamental parameter values that are crucial for setting policy.

Central bankers also have to deal with the fact that monetary policy affects the real economy with a lag of uncertain duration. Intuitively, a reactive Fed that is too focused on backward-looking data will be prone to policy errors, since today's rate changes will affect the economy only with some lag.

The Fed’s poor forecasting track record does not inspire much confidence in its ability to undertake effective forward-looking monetary policy actions.

Monetary authorities should be more aware of the margin of uncertainty arising from real-world data limitations. Broadening the inflation target (adopting a range of, say, 1 percent to 3 percent instead of the current narrow point target of 2 percent) would offer some flexibility and account for the inherent problems associated with imperfect data. Additionally, an explicit focus on financial stability would contribute to overall economic health and potentially ease market volatility.
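A toy illustration of the flexibility argument, using made-up numbers: when measurement error and revisions can move a reading by a few tenths of a point, a 1 to 3 percent range absorbs noise that a 2 percent point target would treat as a miss.

```python
# Hypothetical illustration: a reading that looks like a meaningful miss of a
# 2 percent point target can still sit comfortably inside a 1-3 percent range.

measured_inflation = 2.4   # percent, as initially reported (assumed)
noise_band = 0.3           # plausible revision/measurement noise, assumed

point_target = 2.0
range_target = (1.0, 3.0)

misses_point_target = abs(measured_inflation - point_target) > noise_band
within_range = range_target[0] <= measured_inflation <= range_target[1]

print(f"Point target missed (beyond noise band): {misses_point_target}")  # True
print(f"Within 1-3 percent target range:         {within_range}")         # True
```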

Vivekanand Jayakumar is an associate professor of economics at the University of Tampa.

