Finance and Economics Discussion Series (FEDS)
Staff working papers in the Finance and Economics Discussion Series (FEDS) investigate a broad range of issues in economics and finance, with a focus on the U.S. economy and domestic financial markets.
Using a new consumer survey dataset, we document a new dimension of heterogeneity in inflation expectations that has implications for consumption and saving decisions as well as monetary policy transmission. We show that German households with the same inflation expectations assess differently whether the levels of expected inflation and nominal interest rates are appropriate, too high, or too low. The "hidden heterogeneity" in expectations stemming from these opinions is related to demographic characteristics and affects current and planned spending in addition to the Euler equation effect of the perceived real interest rate. Furthermore, these differences in opinions affect German households differently depending on whether they are renters or homeowners.
A Crisis of Missed Opportunities? Foreclosure Costs and Mortgage Modification During the Great Recession
We investigate the impact of Great Recession policies in California that substantially increased lender pecuniary and time costs of foreclosure. We estimate that the California Foreclosure Prevention Laws (CFPLs) prevented 250,000 California foreclosures (a 20% reduction) and created $300 billion in housing wealth. The CFPLs boosted mortgage modifications and reduced borrower transitions into default. They also mitigated foreclosure externalities via increased maintenance spending on homes that entered foreclosure. The CFPLs had minimal adverse side effects on the availability of mortgage credit for new borrowers. Altogether, findings suggest that policy interventions that keep borrowers in their homes may be broadly beneficial during times of widespread housing distress.
Banks add value by monitoring borrowers. High funding costs make banks reluctant to lend. A central bank can ease funding by purchasing loans, but cannot distinguish which loans require more or less monitoring, exposing it to adverse selection. A multi-tier loan pricing facility arises as the optimal institutional design, setting both the purchase price and banks' risk retention for given loan characteristics. This design dominates a uniform (flat) structure for loan purchases, provides the right incentives to banks, and achieves maximum lending at lower rates to businesses. Both the multi-tier and flat structures deliver welfare gains compared to no intervention, but the relative gain between the two depends on three sufficient statistics: the share of loans requiring monitoring, the risk-retention ratio, and the liquidity premium.
We study the evolution of the price discovery process in the euro-dollar and dollar-yen currency pairs over a ten-year period on the EBS platform, a global trading venue used by both manual and automated traders. We find that the importance of market orders decreases sharply over that period, owing mainly to a decline in the information share from manual trading, while the information share of market orders from algorithmic and high-frequency traders remains fairly constant. At the same time, there is a substantial, but gradual, increase in the information share of limit orders. Price discovery also becomes faster, suggesting improvements in market efficiency. The results are consistent with theoretical predictions that in more efficient markets, informed traders tend to use more limit orders.
Keywords: High-frequency trading, Limit orders, Price discovery
The Effect of the Central Bank Liquidity Support during Pandemics: Evidence from the 1918 Spanish Influenza Pandemic
The coronavirus outbreak raises the question of how central bank liquidity support affects financial stability and promotes economic recovery. Using newly assembled data on cross-county flu mortality rates and state-charter bank balance sheets in New York, we investigate the effects of the 1918 Influenza Pandemic on the banking system and the role of the Federal Reserve during the pandemic. We find that banks located in more severely affected areas experienced deposit withdrawals. Banks that were members of the Federal Reserve were able to access central bank liquidity and so continued or even expanded lending. Banks that were not members, however, did not borrow on the interbank market but rather curtailed lending, suggesting there was little-to-no pass-through of central bank liquidity. Further, in the counties most affected by the 1918 Influenza, even banks with direct access to the discount window liquidated assets so as to meet large deposit withdrawals, suggesting limits to the effectiveness of the liquidity provision by the Federal Reserve. Finally, we show that the pandemic caused only a short-term disruption to the financial sector. Over the long term, deposits returned and banks restored their asset portfolios.
Keywords: 1918 influenza, pandemics, financial stability, bank lending, economic recovery
We extract aggregate demand and supply shocks for the US economy from real-time survey data on inflation and real GDP growth using a novel identification scheme. Our approach exploits non-Gaussian features of macroeconomic forecast revisions and imposes minimal theoretical assumptions. After verifying that our results for U.S. post-World War II business cycle fluctuations are largely in line with the prevailing consensus, we proceed to study output and price fluctuations during the COVID-19 pandemic. We attribute two-thirds of the decline in 2020:Q1 GDP to a negative shock to aggregate demand. In contrast, for the staggeringly large decline in GDP in 2020:Q2, we estimate that two-thirds of the shock was due to a reduction in aggregate supply. Statistical analysis suggests a slow recovery due to persistent effects of the supply shock, but surveys suggest a somewhat faster rebound with a recovery in aggregate supply leading the way.
We develop a model where persistent trade shocks create demand for a basket-backed stablecoin, such as Mark Carney's "synthetic hegemonic currency" or Facebook's recent proposal for Libra. In numerical simulations, we find four main results. First, because of general equilibrium effects of the basket currency on the volatility of currency values, overall demand for that currency is small. Second, despite scant holdings of the basket, its global reach may contribute to substantial increases in welfare if the basket is widely accepted, allowing it to complement holdings of sovereign currencies. Third, we calculate the welfare maximizing composition of the basket, finding that optimal weights depend on the pattern of international acceptance, but that basket composition does not significantly affect welfare. Fourth, despite potential welfare improvements, low demand for the basket currency from buyers limits sellers' incentives to invest in accepting it, suggesting that fears of a so-called global stablecoin replacing domestic sovereign currencies may be overstated.
We examine the macroeconomic effect of large-scale asset purchases (LSAPs) and forward guidance (FG) using a proxy structural VAR estimated on data through 2015, where the stance of the LSAP policy is measured using primary dealer expectations of the Federal Reserve's asset holdings. Monetary policy shocks are identified using instruments constructed from event study yield changes, and additional assumptions are employed to separately identify LSAP and FG shocks. We find that unexpected expansions in the Federal Reserve's asset holdings during the ZLB period between 2008 and 2015 had significant expansionary effects on the macroeconomy, with real activity and inflation rising and unemployment declining notably following the shock. The policy accommodation appears to be transmitted to the economy both through financial markets—including Treasury yields, credit spreads and equity prices—and through bank lending. The effects on Treasury yields and term premiums appear to be longer-lived than previously documented, while the effects on credit spreads and especially bank lending also appear persistent. These results appear fairly robust to alternative identification and econometric methodologies, alternative policy indicators and instruments, and controlling for any possible Federal Reserve information effect. A counterfactual analysis shows that absent the LSAP3 program implemented between late 2012 and 2014, CPI inflation would have been about 1 percentage point lower, while the unemployment rate would have been about 4 percentage points higher, by the end of 2015.
We present the ivcrc command, which implements an instrumental variables (IV) estimator for the linear correlated random coefficients (CRC) model. This model is a natural generalization of the standard linear IV model that allows for endogenous, multivalued treatments and unobserved heterogeneity in treatment effects. The proposed estimator uses recent semiparametric identification results that allow for flexible functional forms and permit instruments that may be binary, discrete, or continuous. The command also allows for the estimation of varying coefficients regressions, which are closely related in structure to the proposed IV estimator. We illustrate this IV estimator and the ivcrc command by estimating the returns to education in the National Longitudinal Survey of Young Men.
Can Forecast Errors Predict Financial Crises? Exploring the Properties of a New Multivariate Credit Gap
Yes, they can. I propose a new method to detect credit booms and busts from multivariate systems -- monetary Bayesian vector autoregressions. When observed credit is systematically higher than credit forecasts justified by real economic activity variables, a positive credit gap emerges. The methodology is tested for 31 advanced and emerging market economies. The resulting credit gaps fit historical evidence well and detect turning points earlier, outperforming the credit-to-GDP gaps in signaling financial crises, especially at longer horizons. The results survive in real time and can shed light on the drivers of credit booms.
We examine how housing supply constraints affect housing affordability, which we define as the quality-adjusted price of housing services. In our dynamic model, supply constraints increase the price of housing services by only half as much as the purchase price of a home, since the purchase price responds to expected future increases in rent as well as contemporaneous rent increases. Households respond to changes in the price of housing services by altering their housing consumption and location choices, but only by a small amount. We evaluate these predictions using common measures of housing supply constraints and data from US metropolitan areas from 1980 to 2016. We find sizeable effects of supply constraints on house prices, but modest-to-negligible effects on rent, lot size, structure consumption, location choice within metropolitan areas, sorting across metropolitan areas, and housing expenditures. We conclude that housing supply constraints distort housing affordability, and therefore housing consumption and location decisions, by less than their estimated effects on house prices suggest.
This paper develops a new combined wealth measure using data from the Survey of Consumer Finances, by augmenting data on net worth with estimates of defined benefit (DB) pension wealth and expected Social Security wealth. We use this combined wealth concept to explore retirement preparation among groups of households in their pre-retirement years (40-49 and 50-59) and also to explore the concentration of wealth. We find evidence of moderate, but rising, shortfalls in retirement preparation. We also show that including DB pension and Social Security wealth results in markedly lower measures of wealth concentration. Trends toward higher concentration over time are also somewhat moderated.
We find significant evidence of asymmetric information and signaling in post-crisis offerings in the auto asset-backed securities (ABS) market. Using granular regulatory reporting data, we are able to directly measure private information and quantify its effect on signaling and pricing. We show that lenders "self-finance" unobservably higher-quality loans by holding these loans for longer periods to signal private information. This signal is priced in initial offerings of auto ABS and accurately predicts ex-post loan performance. We also demonstrate that our results are robust to exogenous shifts in the demand and supply of auto loans. Despite an environment of post-crisis enhanced transparency and securitization standards, signaling may be motivated by inattentive investors and regulations enforcing "no adverse selection" in constructing ABS.
We use an expected utility framework to examine how living standards vary across the United States and how each state's living standards have evolved over time. Our welfare measure accounts for cross-state variations in mortality, consumption, education, inequality, and cost of living. We find that per capita income is a good indicator of living standards, with a correlation of 0.80 across states. Living standards in most states, however, appear closer to those in the richest states than their difference in per capita income would suggest. Whereas high-income states benefit from higher life expectancy, consumption, and college attainment, low-income states benefit from lower cost of living. All states experienced positive welfare growth, and hence rising living standards, between 1999 and 2015. The annual welfare growth rate, however, varied from 1.38 to 3.76 percent across states due to varying gains in life expectancy, consumption, and college attainment, with life expectancy accounting for 50.3 percent of the variation. Finally, the growth rate of per capita income is a poor proxy for how fast living standards are rising in a particular state since the correlation between welfare growth and per capita income growth is only 0.38, and deviations are often large.
We study the composition of bank loan portfolios during the transition of the real sector to a knowledge economy where firms increasingly use intangible capital. Exploiting heterogeneity in bank exposure to the compositional shift from tangible to intangible capital, we show that exposed banks curtail commercial lending and reallocate lending to other assets, such as mortgages. We estimate that the substantial growth in intangible capital since the mid-1980s explains around 30% of the secular decline in the share of commercial lending in banks' loan portfolios. We provide suggestive evidence that this reallocation increased the riskiness of banks' mortgage lending.
We zero in on the expected returns of long-short portfolios based on 120 stock market anomalies by accounting for (1) effective bid-ask spreads, (2) post-publication effects, and (3) the modern era of trading technology that began in the early 2000s. Net of these effects, the average anomaly's expected return is a measly 8 bps per month. The strongest anomalies return only 10-20 bps after accounting for data-mining with either out-of-sample tests or empirical Bayesian methods. Expected returns are negligible despite cost optimizations that produce impressive net returns in-sample and the omission of additional trading costs like price impact.
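The netting of trading costs against gross anomaly returns can be sketched with hypothetical numbers (the figures below are illustrative, not the paper's estimates): each unit of portfolio turnover pays the effective half-spread on both the position sold and the position bought to replace it.

```python
# Hypothetical monthly figures for a long-short anomaly portfolio.
gross_return = 0.0030   # 30 bps per month before costs (illustrative)
turnover     = 0.50     # fraction of the portfolio replaced each month
half_spread  = 0.0011   # effective half-spread paid per trade leg

# Each unit of turnover involves two legs: a sale and an offsetting
# purchase, each paying the effective half-spread.
trading_cost = 2 * turnover * half_spread
net_return = gross_return - trading_cost
```

With these illustrative inputs, a seemingly healthy 30 bps gross return shrinks to 19 bps net, showing how modest spreads and turnover can erase most of an anomaly's apparent premium.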
We use machine learning methods to examine the power of Treasury term spreads and other financial market and macroeconomic variables to forecast US recessions, vis-à-vis probit regression. In particular we propose a novel strategy for conducting cross-validation on classifiers trained with macro/financial panel data of low frequency and compare the results to those obtained from standard k-folds cross-validation. Consistent with the existing literature we find that, in the time series setting, forecast accuracy estimates derived from k-folds are biased optimistically, and cross-validation strategies which eliminate data "peeking" produce lower, and perhaps more realistic, estimates of forecast accuracy. More strikingly, we also document rank reversal of probit, Random Forest, XGBoost, LightGBM, neural network and support-vector machine classifier forecast performance over the two cross-validation methodologies. That is, while k-folds cross-validation indicates that the forecast accuracy of tree methods dominates that of neural networks, which in turn dominates that of probit regression, the more conservative cross-validation strategy we propose indicates the exact opposite: probit regression should be preferred over machine learning methods, at least in the context of the present problem. This latter result stands in contrast to a growing body of literature demonstrating that machine learning methods outperform many alternative classification algorithms, and we discuss some possible reasons for our result. We also discuss techniques for conducting statistical inference on machine learning classifiers using Cochran's Q and McNemar's tests, and use the SHapley Additive exPlanations (SHAP) framework to decompose US recession forecasts and analyze feature importance across business cycles.
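The "peeking" problem can be illustrated with a minimal sketch (plain Python, hypothetical fold counts; the paper's actual scheme is more elaborate): standard k-folds lets the training set include observations that come after the test window, whereas a forward-chaining scheme trains only on the past.

```python
def kfold_splits(n, k):
    """Standard k-folds on an ordered sample: each contiguous fold serves
    as the test set once, and the training set draws on observations both
    before AND after the test window -- the time-series 'peeking' problem."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    splits, start = [], 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        test_set = set(test)
        train = [i for i in range(n) if i not in test_set]
        splits.append((train, test))
        start += size
    return splits

def forward_chaining_splits(n, k):
    """Time-series CV: the model is trained only on observations strictly
    preceding the test window, eliminating look-ahead bias.  For
    simplicity the last n mod (k+1) observations are dropped."""
    size = n // (k + 1)
    boundaries = [size * (i + 1) for i in range(k + 1)]
    splits = []
    for j in range(1, k + 1):
        train = list(range(boundaries[j - 1]))
        test = list(range(boundaries[j - 1], boundaries[j]))
        splits.append((train, test))
    return splits
```

In the k-folds scheme, every fold except the last trains on future observations; forward chaining never does, which is one reason its accuracy estimates tend to be lower and arguably more honest.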
A risk factor linked to aggregate equity issuance conditions explains the empirical performance of investment factors based on the asset growth anomaly of Cooper, Gulen, and Schill (2008). This new risk factor, dubbed the equity financing risk (EFR) factor, subsumes investment factors in leading linear factor models. Most importantly, when substituted for investment factors, the EFR factor improves the overall pricing performance of linear factor models, delivering a significant reduction in absolute pricing errors and their associated t-statistics for several anomalies, including the ones related to R&D expenditures and cash-based operating profitability.
We study the effects of enrichment activities such as reading, homework, and extracurricular lessons on children's cognitive and non-cognitive skills. We take into consideration that children forgo alternative activities, such as play and socializing, in order to spend time on enrichment. Our study controls for selection on unobservables using a novel approach which leverages the fact that many children spend zero hours per week on enrichment activities. At zero enrichment, confounders vary but enrichment does not, which gives us direct information about the effect of confounders on skills. Using time diary data available in the Panel Study of Income Dynamics (PSID), we find that the net effect of enrichment is zero for cognitive skills and negative for non-cognitive skills, which suggests that enrichment may be crowding out more productive activities on the margin. The negative effects on non-cognitive skills are concentrated in higher-income students in high school, consistent with elevated academic competition related to college admissions.
I estimate a medium-scale New-Keynesian model and relax the conventional assumption that the central bank adopted an active monetary policy by pursuing inflation and output stability over the entire post-war period. Even after accounting for a rich structure, I find that monetary policy was passive prior to the Volcker disinflation. Sunspot shocks did not represent quantitatively relevant sources of volatility. By contrast, such passive interest rate policy accommodated fundamental productivity and cost shocks that de-anchored inflation expectations, propagated via self-fulfilling inflation expectations and constituted the primary sources of the run-up in inflation from the 1960s through the late 1970s.
Keywords: Monetary policy, business cycle, expectations, indeterminacy, Bayesian methods
This paper studies the role of banks' branching networks in propagating oil shocks. Banks that were exposed to the oil shocks through their operations in oil-concentrated counties experienced a liquidity drain in the form of declining demand deposit inflows as well as an increasing percentage of troubled loans. Banks were forced to sell liquid assets, and contracted lending to small businesses and mortgage borrowers in counties that were not directly affected by the oil shocks. The effect is magnified when banks do not have strong community ties, but is mitigated if banks' branching network is sufficiently dispersed. I also find that the decline in local credit supply cannot be completely offset by healthy competing banks' increased lending, providing fresh evidence from the perspective of bank competition.
The United States government spends billions on public health insurance and has funded a number of programs to build health care facilities. However, the government runs these two types of programs separately: in different places, at different times, and for different populations. We explore whether access to both health insurance and hospitals can improve health outcomes and access to health care. We analyze a coal mining union health insurance program in 1950s Appalachia with and without a complementary hospital construction program. Our results show that the union insurance alone increased hospital births and reduced infant mortality. Once the union hospitals opened, however, the insurance and the hospitals together substantially increased the net number of hospital beds and health care employees, with limited crowd-out of existing private hospitals. Our results suggest that hospitals can complement health insurance in underserved areas.
Monetary policy uncertainty affects the transmission of monetary policy shocks to longer-term nominal and real yields. For a given monetary policy shock, the reaction of yields is more pronounced when the level of monetary policy uncertainty is low. Primary dealers and other investors adjust their interest rate positions more when monetary policy uncertainty is low than when uncertainty is high. These portfolio adjustments likely explain the larger pass-through of a monetary policy shock to bond yields when uncertainty is low. These findings shed new light on the role that monetary policy uncertainty plays in the transmission of monetary policy to financial markets.
Keywords: Monetary policy surprises, monetary policy uncertainty, interest rates, primary dealers
Drastic public health measures such as social distancing or lockdowns can reduce the loss of human life by keeping the number of infected individuals from exceeding the capacity of the health care system, but are often criticized because of the social and economic costs they entail. We question this view by combining an epidemiological model, calibrated to capture the spread of the COVID-19 virus, with a multisector model, designed to capture key characteristics of the U.S. Input-Output Tables. Our two-sector model features a core sector that produces intermediate inputs not easily replaced by inputs from the other sector, subject to minimum-scale requirements. We show that, by affecting workers in this core sector, the high peak of an infection not mitigated by social distancing may cause very large upfront economic costs in terms of output, consumption and investment. Social distancing measures can reduce these costs, especially if skewed towards non-core industries and occupations with tasks that can be performed from home, helping to smooth the surge in infections among workers in the core sector.
Keywords: Infectious disease, epidemic, recession, COVID-19.
Many traditional official statistics are not suitable for measuring high-frequency developments that evolve over the course of weeks, not months. In this paper, we track the labor market effects of the COVID-19 pandemic with weekly payroll employment series based on microdata from ADP. These data are available essentially in real-time, and allow us to track both aggregate and industry effects. Cumulative losses in paid employment through April 4 are currently estimated at 18 million; just during the two weeks between March 14 and March 28 the U.S. economy lost about 13 million paid jobs. For comparison, during the entire Great Recession less than 9 million private payroll employment jobs were lost. In the current crisis, the most affected sector is leisure and hospitality, which has so far lost or furloughed about 30 percent of employment, or roughly 4 million jobs.
Banks in the United States originate $100 billion in community development loans every year and hold a similar amount of community development investments on their balance sheets. A number of federal place-based policies encourage the provision of these loans and investments to promote growth, employment and the availability of affordable housing to disadvantaged communities. Research into the effectiveness of privately supplied community development financing has been hampered, however, by the lack of comprehensive data on banks' community development activities at a local level. Hand collected data from thousands of Community Reinvestment Act performance evaluations fill this gap. Using these data, the effect of the supply of community development funding on local economic outcomes is estimated. Endogeneity of community development financing to local demand factors is addressed, exploiting the fact that banks exhibit fixed tendencies to engage in community development financing across markets. Shifts in the share of local deposit markets toward banks with a greater tendency to supply community development loans are associated with subsequent expansion in total employment and wages paid. Estimates suggest $56,000 in community development lending is required to create one job, on net. There is no measurable effect on the supply of affordable housing or the growth of house prices. Counties experiencing a shift in local deposit market shares toward community development intensive banks were on similar pre-trends as the rest of the country in the years prior to the shift, as measured across a range of economic and credit market outcomes.
The seeds of financial imbalances are sown in times of buoyant economic growth. We study the link between macroeconomic performance and financial imbalances, focusing on the experience of the United States since the 1960s. We first follow a narrative approach to review historical episodes of significant financial imbalances and find that the onset of financial disturbances typically occurs when the economy is running hot. We then look for evidence of a statistical link between measures of macroeconomic conditions and financial imbalances. In our in-sample analysis, we find that strong economic growth is followed by a build-up of financial imbalances across all dimensions of the National Financial Conditions Index. In our out-of-sample analysis, we find that the link between strong economic performance and increases in nonfinancial leverage is particularly strong and robust. Using a structural VAR identified with narrative sign restrictions, we also demonstrate that business cycle shocks are important drivers of nonfinancial leverage.
Keywords: Economic Performance; Nonfinancial Leverage; Financial Imbalances; Financial Stability; Forecasting; VARs; Sign Restrictions; Narrative.
We develop a dynamic decomposition of the empirical Beveridge curve, i.e., the level of vacancies conditional on unemployment. Using a standard model, we show that three factors can shift the Beveridge curve: reduced-form matching efficiency, changes in the job separation rate, and out-of-steady-state dynamics. We find that the shift in the Beveridge curve during and after the Great Recession was due to all three factors, and each factor taken separately had a large effect. Comparing the pre-2010 period to the post-2010 period, a fall in matching efficiency and out-of-steady-state dynamics both pushed the curve upward, while the changes in the separations rate pushed the curve downward. The net effect was the observed upward shift in vacancies given unemployment. In previous recessions changes in matching efficiency were relatively unimportant, while dynamics and the separations rate had more impact. Thus, the unusual feature of the Great Recession was the deterioration in matching efficiency, while separations and dynamics have played significant, partially offsetting roles in most downturns. The importance of these latter two margins contrasts with much of the literature, which abstracts from one or both of them. We show that these factors affect the slope of the empirical Beveridge curve, an important quantity in recent welfare analyses estimating the natural rate of unemployment.
We synthesize the recent, at times conflicting, empirical literature regarding whether fiscal policy is more effective during certain points in the business cycle. Evidence of state dependence in the multiplier depends critically on how the business cycle is defined. Estimates of the fiscal multiplier do not change when the unemployment rate is above or below its trend. However, we find that the multiplier is higher when the unemployment rate is increasing relative to when it is decreasing. This result holds using both a long time-series at the U.S. national level and for a panel of U.S. states.
Even though workers in the UK spent just 1,000 pounds on commuting in 2017, the economic loss may be far higher because of the congestion externality arising from the way in which one worker's commute affects the commuting time of others. I provide empirical evidence that commuting time affects job acceptance, pointing to large indirect costs of congestion. To interpret the empirical facts and quantify the costs of congestion, I build a model featuring a frictional labor market within a metropolitan area. By endogenizing commuting congestion in a labor search model, the model connects labor market responses to urban policies. Workers evaluate job offers based on their productivity and commuting costs, taking congestion as given, but by accepting and commuting to distant jobs, affect other workers' labor market outcomes. Through this mechanism, equilibrium moving decisions, housing rent, and wages are tightly linked to congestion. Calibrating the model to the local labor market around London, I show that the effect of the congestion externality is to significantly decrease welfare and increase wage inequality. I quantify the effects of a congestion tax on labor market outcomes, and show that the welfare-maximizing tax has substantial negative effects on inequality, but comes at a cost of higher unemployment.
We use a dynamic factor model to disentangle changes in prices due to economy-wide (common) shocks from changes in prices due to idiosyncratic shocks. Using 146 disaggregated individual price series from the U.S. PCE price index, we find that most of the fluctuations in core PCE prices observed since 2010 have been idiosyncratic in nature. Moreover, we find that common core inflation responds to economic slack, while the idiosyncratic component does not. That said, even after filtering out idiosyncratic factors, the estimated Phillips curve is extremely flat post-1995. Therefore, our results suggest that the flattening of the Phillips curve is the result of macroeconomic, rather than idiosyncratic, forces.
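The common/idiosyncratic split can be illustrated with a much-simplified static analogue of a dynamic factor model (principal components on a synthetic panel, not the paper's PCE data or estimator): the first principal component stands in for the common shock, and the residual for the idiosyncratic part.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 30                       # periods, disaggregated price series
f = rng.standard_normal(T)           # latent common shock (synthetic)
lam = rng.uniform(0.5, 1.5, N)       # factor loadings across series
eps = rng.standard_normal((T, N))    # idiosyncratic shocks
x = np.outer(f, lam) + eps           # observed price changes

# Static-factor analogue of the decomposition: first principal
# component of the standardized panel as the common component.
z = (x - x.mean(0)) / x.std(0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
common = z @ vt[0]                   # estimated common component (T,)
idio = z - np.outer(common, vt[0])   # residual idiosyncratic part (T, N)

corr = abs(np.corrcoef(common, f)[0, 1])
```

With many series loading on one factor, the estimated common component tracks the true latent shock closely; a dynamic factor model adds lag structure and parametric shock processes on top of this basic idea.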
This paper illustrates the usefulness of sequential Monte Carlo (SMC) methods in approximating DSGE model posterior distributions. We show how the tempering schedule can be chosen adaptively, document the accuracy and runtime benefits of generalized data tempering for "online" estimation (that is, re-estimating a model as new data become available), and provide examples of multimodal posteriors that are well captured by SMC methods. We then use the online estimation of the DSGE model to compute pseudo-out-of-sample density forecasts and study the sensitivity of the predictive performance to changes in the prior distribution. We find that making priors less informative (compared to the benchmark priors used in the literature) by increasing the prior variance does not lead to a deterioration of forecast accuracy.
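The adaptive tempering idea can be sketched in a few lines (hypothetical particle log-likelihoods; the paper tempers an actual DSGE likelihood with a richer rule): choose the next tempering exponent by bisection so the effective sample size (ESS) of the reweighted particles hits a target fraction.

```python
import numpy as np

def next_temperature(loglik, phi_old, ess_frac=0.5):
    """Pick the next tempering exponent phi in (phi_old, 1] so that the
    ESS of particles reweighted by exp((phi - phi_old) * loglik) equals
    ess_frac * n -- a simplified adaptive tempering step."""
    n = len(loglik)

    def ess(phi):
        w = np.exp((phi - phi_old) * (loglik - loglik.max()))
        w /= w.sum()
        return 1.0 / np.sum(w ** 2)

    if ess(1.0) >= ess_frac * n:
        return 1.0                      # can jump straight to the posterior
    lo, hi = phi_old, 1.0               # ESS shrinks as phi rises: bisect
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if ess(mid) >= ess_frac * n:
            lo = mid
        else:
            hi = mid
    return lo

rng = np.random.default_rng(1)
loglik = 50.0 * rng.standard_normal(1000)   # dispersed log-likelihoods
phi = next_temperature(loglik, phi_old=0.0)
```

Dispersed likelihoods force a small tempering step (the particles would degenerate otherwise), while concentrated likelihoods allow large steps; this is what makes the schedule adaptive rather than fixed in advance.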
We describe the Federal Reserve's (the Fed's) approach to implementing monetary policy in an ample-reserves regime. We use a stylized model to explain the factors the Fed considers and the tools it uses to ensure interest rate control when the quantity of reserves is ample. Then, we take a close look at the Fed's experience operating in this regime in the post-crisis period, both as it has raised and lowered its policy rate. Looking ahead, we highlight some considerations relevant for maintaining a level of reserves consistent with the efficient and effective implementation of monetary policy, and conclude with an overview of the benefits of an ample-reserves regime. This primer is intended to enhance discussions and understanding of the Fed's actions and communications regarding monetary policy implementation, as many resources on this topic may be out of date given the recent evolution of the policy environment.
Note: This paper was reposted on March 6, 2020, to correct typos on pages 22 and 30.
Consumer digital access services—internet, mobile phone, cable TV, and streaming—accounted for over 2 percent of U.S. household consumption in 2018. We construct prices for these services using direct measures of volume (data transmitted, talk time, and hours of programming). Our price index fell 12 percent per year from 1988 to 2018 while official prices moved up modestly. Using our digital services index, we estimate total personal consumption expenditure (PCE) prices have risen nearly 1/2 percentage point slower than the official index since 2008. Importantly, the spread between alternative and official PCE price inflation has increased noticeably over time.
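Pricing a service from direct volume measures, as the digital-access abstract describes, amounts to a unit-value calculation: expenditure divided by the quantity delivered, turned into an index. All numbers below are invented for illustration:

```python
# Hypothetical annual household expenditure (dollars) and volume
# (terabytes transmitted) for an internet-access service; the figures
# are made up for illustration, not taken from the paper.
years = [2015, 2016, 2017, 2018]
expenditure = [600.0, 620.0, 640.0, 660.0]
volume = [1.0, 1.6, 2.5, 4.0]

# Unit value = spending per unit of service actually delivered.
unit_value = [e / v for e, v in zip(expenditure, volume)]

# Index the unit value to 100 in the first year.
index = [100.0 * uv / unit_value[0] for uv in unit_value]

# Average annual rate of change over the sample.
annual_change = (index[-1] / index[0]) ** (1 / (len(years) - 1)) - 1
```

Note the mechanism: expenditure rises modestly (as official prices did), but because the volume delivered grows much faster, the volume-based price falls steeply.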
Methods of monetary policy implementation continue to change. The level of reserve supply—scarce, abundant, or somewhere in between—has implications for the efficiency and effectiveness of an implementation regime. The money market events of September 2019 highlight the need for an analytical framework to better understand implementation regimes. We discuss major issues relevant to the choice of an implementation regime, using a parsimonious framework and drawing from the experience in the United States since the 2007-09 financial crisis. We find that the optimal level of reserve supply likely lies somewhere between scarce and abundant reserves, thus highlighting the benefits of implementation with what could be called "ample" reserves. The Federal Reserve's announcement in October 2019 that it would maintain a level of reserve supply greater than the one that prevailed in early September is consistent with the implications of our framework.
Firms often make production decisions before meeting a buyer. We incorporate this often-overlooked fact into an otherwise standard monetary search model and show that it has important implications for the set of equilibria, efficiency, and the cost of inflation. Our model features a strategic complementarity between buyers' ex ante choice of money balances and sellers' ex ante choice of productive capacity. When the resale value of unsold inventories is high, sellers carry excess capacity and the equilibrium is unique. But when the resale value is low, there is a continuum of equilibria, all of which are inefficient and welfare-ranked. The effects of inflation are highly nonlinear. When inflation is high, the buyer's money constraint binds, and inflation therefore reduces trade through a standard real-balance channel. When inflation is low, the seller's capacity constraint binds, real balances have no effect at the margin, and inflation has no effect on output or welfare.
Technology has changed how discrimination manifests itself in financial services. Replacing human discretion with algorithms in decision-making roles reduces taste-based discrimination, and new modeling techniques have expanded access to financial services to households who were previously excluded from these markets. However, algorithms can exhibit bias from human involvement in the development process, and their opacity and complexity can facilitate statistical discrimination inconsistent with antidiscrimination laws in several aspects of financial services provision, including advertising, pricing, and credit-risk assessment. In this chapter, we provide a new amalgamation and analysis of these developments, identifying five gateways whereby technology induces discrimination to creep into financial services. We also consider how these technological changes in finance intersect with existing discrimination and data privacy laws, leading to our contribution of four frontlines of regulation. Our analysis concludes that the net effect of innovation in technological finance on discrimination is ambiguous and depends on the future choices made by policymakers, the courts, and firms.
Keywords: Discrimination, fair lending, statistical discrimination, FinTech, taste-based preferences, algorithmic decision-making, proxy variables, big data
We confront two seemingly contradictory observations about the US labor market: the rate at which workers change employers has declined since the 1980s, yet there is a commonly expressed view that long-term employment relationships are more difficult to attain. We reconcile these observations by examining how the distribution of employment tenure has changed in aggregate and for various demographic groups. We show that the fraction of workers with short tenure (less than a year) has been falling since the 1980s, consistent with the decline in job changing. Meanwhile, the fraction of workers with long tenure (20 years or more) has been rising modestly owing to an increase in long tenure for women and the aging of the population. Long tenure has declined markedly among older men; this trend may have spurred popular perceptions that long-term employment is less common than in the past. The decline in long tenure for men appears due to an increase in mid-career separations that reduce the likelihood of reaching long tenure, rather than an increase in late-career separations. Nevertheless, survey evidence indicates that these changes in employment relationships are not associated with heightened concerns about job insecurity or decreases in job satisfaction as reported by workers. The decline in short tenure is widespread, associated with fewer workers cycling among briefly held jobs, and coincides with an increase in perceived job security among short-tenure workers.
We evaluate how a country’s governance structure for macroprudential policy affects its implementation of Basel III macroprudential capital buffers. We find that the probabilities of using the countercyclical capital buffer (CCyB) are higher in countries that have financial stability committees (FSCs) with stronger governance mechanisms and fewer agencies, which reduces coordination problems. These higher probabilities are more sensitive to credit growth, consistent with the CCyB being used to mitigate systemic risk. A country’s probability of using the CCyB is even higher when the FSC or ministry of finance has direct authority to set the CCyB, perhaps because setting the CCyB involves establishing a new macro-financial analytical process to regularly assess systemic risks and allows these new entities to influence the process. These results are consistent with elected officials creating the FSCs with the strongest governance and fewer agencies for functional delegation reasons, but most FSCs are created for symbolic political reasons.
This technical note describes the Forward-Looking Analysis of Risk Events (FLARE) model, which is a top-down model that helps assess how well the banking system is positioned to weather exogenous macroeconomic shocks. FLARE estimates banking system capital under varying macroeconomic scenarios, time horizons, and other systemic shocks.
Note: On February 14, 2020, this paper was updated to include additional acknowledgements.
Standard structural VAR models and estimation using Romer and Romer (2004) monetary policy shocks show that, in samples after the 1980s, a contractionary conventional monetary policy shock generates smaller and sometimes perversely signed impulse responses compared to earlier samples. Using insights from the central bank information effects literature, we show that the analyses producing these results suffer from an omitted variables problem related to forward-looking information emanating from Federal Reserve forecasts. Transmission of conventional monetary policy shocks takes on the standard signs, and is typically significant, once Fed forward-looking information is taken into account. This reconciliation does not follow from adding private sector forecasts to the estimation frameworks.
We investigate how macroeconomic drivers affect the predictive inflation distribution as well as the probability that inflation will run above or below certain thresholds over the near term. This is what we refer to as Inflation-at-Risk, a measure of the tail risks to the inflation outlook. We find that the recent muted response of the conditional mean of inflation to economic conditions does not convey an adequate representation of the overall pattern of inflation dynamics. Analyzing data from the 1970s reveals ample variability in the conditional predictive distribution of inflation that remains even when focusing on the post-2000 period of stable and low mean inflation. We also document that in the United States and in the Euro Area tight financial conditions carry substantial downside inflation risks, a feature overlooked by much of the literature. Our paper offers a new empirical perspective to existing macroeconomic models, showing that changes in credit conditions are also key to understanding the dynamics of the inflation tails.
Keywords: Quantile Regression, Inflation Risks.
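A toy version of an Inflation-at-Risk exercise: simulate inflation whose downside tail widens when financial conditions tighten, then trace the tails with quantile regressions fit by minimizing the check (pinball) loss. The data-generating process and the subgradient-descent fitter below are illustrative simplifications, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated inflation whose downside widens when a financial-conditions
# index rises (tight = positive, by convention); the DGP is invented
# purely for illustration.
T = 2000
fci = rng.standard_normal(T)
infl = 2.0 + 0.2 * fci + (1.0 + 1.2 * np.maximum(fci, 0.0)) * rng.standard_normal(T)

X = np.column_stack([np.ones(T), fci])

def fit_quantile(X, y, tau, steps=20000, lr=0.01):
    # Minimize the check (pinball) loss by subgradient descent; the loss
    # derivative at residual r is tau for r > 0 and tau - 1 for r < 0.
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        resid = y - X @ beta
        psi = np.where(resid > 0, tau, tau - 1.0)
        beta += lr * X.T @ psi / len(y)
    return beta

b10 = fit_quantile(X, infl, 0.10)   # lower-tail (downside) quantile line
b90 = fit_quantile(X, infl, 0.90)   # upper-tail quantile line
```

At the fitted coefficients roughly tau of the residuals are negative, which is exactly the quantile-regression first-order condition; the interesting object here is the gap between the two slopes, which measures how much the tails fan out with financial conditions.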
What is the output gap and when do we know it? A factor stochastic volatility model estimates the common component to forecasts of the output gap produced by the staff of the Federal Reserve, its time-varying volatility, and time-varying, horizon-specific forecast uncertainty. The common factor to these forecasts is highly procyclical, and unexpected increases to the common factor are associated with persistent responses in other macroeconomic variables. However, output gap estimates are very uncertain, even well after the fact. Output gap uncertainty increases around business cycle turning points. Lastly, increased macroeconomic uncertainty, as measured by the output gap's time-varying volatility, produces pronounced negative responses in other macroeconomic variables.
Evaluating the Success of President Johnson's War on Poverty: Revisiting the Historical Record Using a Full-Income Poverty Measure
We evaluate progress in President Johnson's War on Poverty. We do so relative to the scientifically arbitrary but policy-relevant 20 percent baseline poverty rate he established for 1963. No existing poverty measure fully captures poverty reductions based on the standard that President Johnson set. To fill this gap, we develop a Full-income Poverty Measure with thresholds set to match the 1963 Official Poverty Rate. We include cash income, taxes, and major in-kind transfers and update poverty thresholds for inflation annually. While the Official Poverty Rate fell from 19.5 percent in 1963 to 12.3 percent in 2017, our Full-income Poverty Rate based on President Johnson's standards fell from 19.5 percent to 2.3 percent over that period. Today, almost all Americans have income above the inflation-adjusted thresholds established in the 1960s. Although expectations for minimum living standards evolve, this suggests substantial progress combating absolute poverty since the War on Poverty began.
Local projections (LPs) are a popular tool in applied macroeconomic research. We survey the related literature and find that LPs are often used with very small samples in the time dimension. With small sample sizes, given the high degree of persistence in most macroeconomic data, impulse responses estimated by LPs can be severely biased. This is true even if the right-hand-side variable in the LP is iid, or if the data set includes a large cross-section (i.e., panel data). We derive a simple expression to elucidate the source of the bias. Our expression highlights the interdependence between coefficients of LPs at different horizons. As a byproduct, we propose a way to bias-correct LPs. Using U.S. macroeconomic data and identified monetary policy shocks, we demonstrate that the bias correction can be large.
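A local projection estimates the horizon-h impulse response by regressing the outcome h periods ahead on the current shock. The sketch below runs this on simulated AR(1) data with an observed iid shock, the clean setting the abstract refers to; it illustrates the estimator itself, not the paper's bias expression or correction:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a persistent outcome hit by an observed iid shock:
# y_t = rho * y_{t-1} + e_t, so the true impulse response at
# horizon h is rho ** h.
T, rho = 200, 0.9
shock = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + shock[t]

def local_projection(y, shock, h):
    # Regress y_{t+h} on a constant and the period-t shock; the slope
    # is the LP estimate of the horizon-h impulse response.
    lhs = y[h:]
    rhs = np.column_stack([np.ones(len(lhs)), shock[: len(shock) - h]])
    beta, *_ = np.linalg.lstsq(rhs, lhs, rcond=None)
    return beta[1]

irf = [local_projection(y, shock, h) for h in range(13)]
true_irf = [rho ** h for h in range(13)]
```

Comparing `irf` with `true_irf` at longer horizons, where the persistence of `y` matters most, is where the finite-sample distortions the paper analyzes show up.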
By stepping between bilateral counterparties, a central counterparty (CCP) transforms credit exposure. CCPs generally improve financial stability. Nevertheless, large CCPs are by nature concentrated and interconnected with major global banks. Moreover, although they mitigate credit risk, CCPs create liquidity risks, because they rely on participants to provide cash. Such requirements increase with both market volatility and default; consequently, CCP liquidity needs are inherently procyclical. This procyclicality makes it more challenging to assess CCP resilience in the rare event that one or more large financial institutions default. Liquidity-focused macroprudential stress tests could help to assess and manage this systemic liquidity risk.
Treasury securities normally possess unparalleled safety and liquidity and, consequently, carry a money premium. We use recent debt limit impasses, which temporarily increased the riskiness of Treasuries, to investigate the relationship between the money premium, safety, and liquidity. Our results shed light on Treasury market dynamics specifically, and debt more generally. We first establish that a decline in the perceived safety of Treasuries erodes the money premium at all times. Meanwhile, changes in liquidity only affected the money premium during the impasses. Next, we show that Treasury safety and liquidity dynamics are generally consistent with the theory of the information sensitivity of debt.
We test for racial discrimination in the prices charged by mortgage lenders. We construct a unique dataset where we observe all three dimensions of a mortgage's price: the interest rate, discount points, and fees. While we find statistically significant gaps by race and ethnicity in interest rates, these gaps are offset by differences in discount points. We trace out point-rate schedules and show that minorities and whites face identical schedules, but sort to different locations on the schedule. Such sorting may reflect systematic differences in liquidity or preferences. Finally, we find no differences in total fees by race or ethnicity.
Global and local methods are widely used in international macroeconomics to analyze incomplete-markets models. We study solutions for an endowment economy, an RBC model and a Sudden Stops model with an occasionally binding credit constraint. First-order, second-order, risky-steady-state and DynareOBC solutions are compared with fixed-point-iteration global solutions in the time and frequency domains. The solutions differ in key respects, including measures of precautionary savings, cyclical moments, impulse response functions, financial premia and macro responses to credit constraints, and periodograms of consumption, foreign assets and net exports. The global method is easy to implement and faster than local methods for the endowment model. Local methods are faster for the RBC model, and the global and DynareOBC solutions are of comparable speed. These findings favor global methods except when prevented by the curse of dimensionality and urge caution when using local methods. Of the latter, first-order solutions are preferable because results are very similar to second-order methods.
This paper explores the microfoundations of consumption models and quantifies the macro implications of consumption heterogeneity. We propose a new empirical method to estimate the response of consumption to permanent and transitory income shocks for different groups of households. We then apply this method to administrative data from Denmark. The large sample size, along with detailed household balance sheet information, allows us to finely divide the population along relevant dimensions. We find that households that stand to lose from an interest rate hike are significantly more responsive to income shocks than those that stand to gain. Following a 1-percentage-point interest rate increase, we estimate that consumption growth decreases by a 1/4 percentage point through this interest rate exposure channel alone, making this channel substantially larger than the intertemporal substitution channel that is at the core of representative agent New Keynesian models.
Confidence, financial literacy and investment in risky assets: Evidence from the Survey of Consumer Finances
We employ recent Survey of Consumer Finances (SCF) microdata from the US to analyze the impacts of confidence in one's own financial knowledge, confidence in the economy, and objective financial literacy on investment in risky financial assets (equity and bonds) on both the extensive and intensive margins. Controlling for a rich set of covariates including risk aversion, we find that objective financial literacy is positively related to investment in risky assets as well as debt securities. Moreover, confidence in own financial skills additionally increases the probability of holding risky assets and bonds. While these relationships are rather robust for the extensive margin, they break down with regard to the conditional share of financial wealth in risky assets of those who actually hold them. The relevance of financial literacy as well as confidence varies considerably with the distribution of wealth as well as across several socio-economic dimensions such as age, education and race.
This paper uses aggregate data to estimate and evaluate a behavioral New Keynesian (NK) model in which households and firms plan over a finite horizon. The finite-horizon (FH) model outperforms rational expectations versions of the NK model commonly used in empirical applications as well as other behavioral NK models. The better fit of the FH model reflects that it can induce slow-moving trends in key endogenous variables which deliver substantial persistence in output and inflation dynamics. In the FH model, households and firms are forward-looking in thinking about events over their planning horizon but are backward-looking regarding events beyond that point. This gives rise to persistence without resorting to additional features such as habit persistence and price contracts indexed to lagged inflation. The parameter estimates imply that the planning horizons of most households and firms are less than two years, which considerably dampens the effects of expected future changes of monetary policy on the macroeconomy.
Keywords: Finite-horizon planning, learning, monetary policy, New Keynesian model, Bayesian estimation.
This paper examines whether monetary policy pass-through to mortgage interest rates affects household fertility decisions. Using administrative data on mortgages and births in the UK, our empirical strategy exploits variation in the timing of when families were eligible for a rate adjustment, coupled with the large reductions in the monetary policy rate that occurred during the Great Recession. We estimate that each 1 percentage point drop in the policy rate increased birth rates by 2 percent. In aggregate, this pass-through of accommodative monetary policy to mortgage rates was sufficiently large to outweigh the headwinds of the Great Recession and prevent a "baby bust" in the UK, in contrast to the US. Our results provide new evidence on the nature of monetary policy transmission to households and suggest a new mechanism via which mortgage contract structures can affect both aggregate demand and supply.
We apply textual analysis tools to the narratives that accompany Federal Reserve Board economic forecasts to measure the degree of optimism versus pessimism expressed in those narratives. Text sentiment is strongly correlated with the accompanying economic point forecasts, positively for GDP forecasts and negatively for unemployment and inflation forecasts. Moreover, our sentiment measure predicts errors in FRB and private forecasts for GDP growth and unemployment up to four quarters out. Furthermore, stronger sentiment predicts tighter than expected monetary policy and higher future stock returns. Quantile regressions indicate that most of sentiment's forecasting power arises from signaling downside risks to the economy and stock prices.
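A dictionary-based sentiment score of the general kind used in textual analysis of forecast narratives: the net share of optimistic minus pessimistic words. The word lists below are tiny illustrative stand-ins, not the dictionaries used in the paper:

```python
# Tiny illustrative word lists (hypothetical stand-ins, not the
# paper's sentiment dictionaries).
POSITIVE = {"strong", "robust", "improved", "gains", "expansion", "upbeat"}
NEGATIVE = {"weak", "sluggish", "declined", "losses", "contraction", "downbeat"}

def sentiment(text: str) -> float:
    # Net optimism: (positive hits - negative hits) per word.
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / max(len(words), 1)

upbeat = "Activity posted strong gains and hiring remained robust."
downbeat = "Output declined as demand stayed weak and sluggish."
```

Normalizing by document length keeps scores comparable across narratives of different lengths, which matters when correlating sentiment with point forecasts over time.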