FEDS 2000-53
Estimating the Value and Interest Rate Risk of Interest-Bearing Transactions Deposits


A valuation model is developed within an interest rate contingent claims framework to estimate NOW account and MMDA premiums and interest rate risk for a sample of commercial banks. As in previous work, bank deposit rate and balance dynamics are represented by autoregressive processes, but with attention given here to alternative specifications and to the deposit rent processes and dynamics implied by these specifications. The alternative deposit rate specifications studied include asymmetric adjustment to market rate changes. In examining the implied deposit rent processes, special attention is given to the importance of distant rent forecasts and forecast dynamics for the deposit premium and interest rate risk estimates.

Keywords: Transactions deposits, valuation, interest rate risk

FEDS 2000-52
Do the Rich Save More?

Karen E. Dynan, Jonathan Skinner, and Stephen P. Zeldes


The issue of whether higher lifetime income households save a larger fraction of their income is an important factor in the evaluation of tax and macroeconomic policy. Despite an outpouring of research on this topic in the 1950s and 1960s, the question remains unresolved and has since received little attention. This paper revisits the issue, using new empirical methods and the Panel Study of Income Dynamics, the Survey of Consumer Finances, and the Consumer Expenditure Survey. We first consider the various ways in which life cycle models can be altered to generate differences in saving rates by income groups: differences in Social Security benefits, different time preference rates, non-homothetic preferences, bequest motives, uncertainty, and consumption floors. Using a variety of instruments for lifetime income, we find a strong positive relationship between personal saving rates and lifetime income. The data do not support theories relying on time preference rates, non-homothetic preferences, or variations in Social Security benefits. Instead, the evidence is consistent with models in which precautionary saving and bequest motives drive variations in saving rates across income groups. Finally, we illustrate how models that assume a constant rate of saving across income groups can yield erroneous predictions.

Keywords: Saving, consumption, consumer spending, household behavior, tax policy

FEDS 2000-51
Monetary Policy When the Nominal Short-Term Interest Rate is Zero

James Clouse, Dale Henderson, Athanasios Orphanides, David Small, and Peter Tinsley


In an environment of low inflation, the Federal Reserve faces the risk that it has not provided enough monetary stimulus even when it has pushed the short-term nominal interest rate to its lower bound of zero. Assuming the nominal Treasury-bill rate has been lowered to zero, this paper considers whether further open market purchases of Treasury bills could spur aggregate demand through increases in the monetary base: by increasing liquidity for financial intermediaries and households; by affecting expectations of the future paths of short-term interest rates, inflation, and asset prices; or by stimulating bank lending through the credit channel. This paper also examines the alternative policy tools that are available to the Federal Reserve in theory, and notes the practical limitations imposed by the Federal Reserve Act. The tools the Federal Reserve has at its disposal include open market purchases of Treasury bonds and private-sector credit instruments (at least those that may be purchased by the Federal Reserve); unsterilized and sterilized intervention in foreign exchange; lending through the discount window; and, perhaps in some circumstances, the use of options.

Keywords: Monetary policy, liquidity trap, Federal Reserve Act, open market operations, discount window lending

FEDS 2000-50
Pre-Announcement Effects, News, and Volatility: Monetary Policy and the Stock Market

Antulio N. Bomfim


I examine pre-announcement and news effects on the stock market in the context of public disclosure of monetary policy decisions. The results suggest that the stock market tends to be relatively quiet--conditional volatility is abnormally low--on days preceding regularly scheduled policy announcements. Although this calming effect is routinely reported in anecdotal press accounts, it is statistically significant only over the past four to five years, a result that I attribute to changes in the Federal Reserve's disclosure practices in early 1994. The paper also looks at how the actual interest rate decisions of policy makers affect stock market volatility. The element of surprise in such decisions tends to boost stock market volatility significantly in the short run, and positive surprises--higher-than-expected values of the target federal funds rate--tend to have a larger effect on volatility than negative surprises. The implications of the results for broader issues in the finance literature are also discussed.

Keywords: Uncertainty, information, announcements, expectations, GARCH, risk

FEDS 2000-49
The Flypaper Effect Unstuck: Evidence on Endogenous Grants from The Federal Highway Aid Program

Brian Knight


Contrary to simple theoretical predictions, previous empirical research has found that state government public spending is increased far more, often dollar-for-dollar, by federal grant receipts than by equivalent increases in constituent private income. This anomaly has come to be known as the flypaper effect. First, a legislative bargaining model developed in this paper provides a critique of this empirical finding. The model demonstrates a positive correlation between constituent preferences for public goods and intergovernmental grant receipts, and this correlation has likely biased the existing literature towards finding a flypaper effect. The model also motivates using measures of the political power of state congressional delegations as an instrument for grant receipts. Second, after correcting for the endogeneity of grant receipts, the results demonstrate that constituent private income and grants have similar effects on public spending.

Keywords: Flypaper effect, federalism, political economy

FEDS 2000-48
Bankruptcy in General Equilibrium

Tarun Sabarwal


In this paper, I construct a model of an exchange economy in which bankruptcy arises in a manner similar to what we observe. This model is a more realistic representation of some markets in which intertemporal assets are traded. Using standard and natural assumptions, I show that every economy represented by this model has an equilibrium. Using examples, I highlight some welfare effects of bankruptcy.

Keywords: Bankruptcy, general equilibrium, incomplete markets, exemption, credit limit

FEDS 2000-47
Parameterizing Credit Risk Models With Rating Data

Mark Carey and Mark Hrycay


Estimates of average default probabilities for borrowers assigned to each of a financial institution's internal credit risk rating grades are crucial inputs to portfolio credit risk models. Such models are increasingly used in setting financial institution capital structure, in internal control and compensation systems, in asset-backed security design, and are being considered for use in setting regulatory capital requirements for banks. This paper empirically examines properties of the major methods currently used to estimate average default probabilities by grade. Evidence of potential problems of bias, instability, and gaming is presented. With care, and perhaps judicious application of multiple methods, satisfactory estimates may be possible. In passing, evidence is presented about other properties of internal and rating-agency ratings.

Keywords: Credit risk, value at risk, credit ratings, debt default, capital regulation

FEDS 2000-46
Empirical Evidence on Human Capital Spillovers


This paper examines whether the average level of human capital in a region affects the earnings of an individual residing in that region in a manner that is external to the individual's own human capital. I find little evidence of an external effect of human capital, which suggests that human capital spillovers of the form postulated by the new growth literature are unlikely to matter much in practice.

Keywords: Human capital externalities, endogenous growth

FEDS 2000-45
A Study of the Finite Sample Properties of EMM, GMM, QMLE, and MLE for a Square-Root Interest Rate Diffusion Model

Hao Zhou


This paper performs a Monte Carlo study on Efficient Method of Moments (EMM), Generalized Method of Moments (GMM), Quasi-Maximum Likelihood Estimation (QMLE), and Maximum Likelihood Estimation (MLE) for a continuous-time square-root model under two challenging scenarios--high persistence in mean and strong conditional volatility--that are commonly found in estimating the interest rate process. MLE turns out to be the most efficient of the four methods, but its finite sample inference and convergence rate suffer severely from approximating the likelihood function, especially in the scenario of highly persistent mean. QMLE comes second in terms of estimation efficiency, but it is the most reliable in generating inferences. GMM with lag-augmented moments has overall the lowest estimation efficiency, possibly due to the ad hoc choice of moment conditions. EMM shows an accelerated convergence rate in the high volatility scenario, while its overrejection bias in the mean persistence scenario is unacceptably large. Finally, under a stylized alternative model of U.S. interest rates, the overidentification test of EMM has the greatest power for detecting misspecification, while the GMM J-test is increasingly biased downward in finite samples.
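
A square-root diffusion of the kind studied here, dr = kappa(theta - r)dt + sigma*sqrt(r)dW, can be simulated with a simple Euler scheme for Monte Carlo work. The sketch below uses illustrative parameter values, not the paper's calibration, and a full-truncation fix for near-zero rates.

```python
import numpy as np

# Sketch: full-truncation Euler discretization of a square-root diffusion
#   dr_t = kappa * (theta - r_t) dt + sigma * sqrt(r_t) dW_t
# Parameter values below are illustrative, not the paper's Monte Carlo design.

def simulate_cir(r0, kappa, theta, sigma, dt, n_steps, n_paths, seed=0):
    rng = np.random.default_rng(seed)
    r = np.full(n_paths, r0, dtype=float)
    for _ in range(n_steps):
        shock = rng.standard_normal(n_paths) * np.sqrt(dt)
        # Full truncation: evaluate drift and diffusion at max(r, 0)
        r_pos = np.maximum(r, 0.0)
        r = r + kappa * (theta - r_pos) * dt + sigma * np.sqrt(r_pos) * shock
    return r

# A high-persistence scenario: small kappa means slow mean reversion
terminal = simulate_cir(r0=0.04, kappa=0.2, theta=0.06, sigma=0.1,
                        dt=1/12, n_steps=240, n_paths=20000)
print(f"mean terminal rate: {terminal.mean():.4f}")  # drifts toward theta
```

With kappa = 0.2 over twenty years, the exact conditional mean is theta + (r0 - theta)e^(-4), about 0.0596, so the simulated average should land close to theta.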

Keywords: Monte Carlo study, efficient method of moments, maximum likelihood estimation, square-root diffusion, quasi-maximum likelihood, generalized method of moments

FEDS 2000-44
Credit Scoring and Mortgage Securitization: Do They Lower Mortgage Rates?

Andrea Heuson, Wayne Passmore, and Roger Sparks


This paper develops a model of the interactions between borrowers, originators, and a securitizer in primary and secondary mortgage markets. In the secondary market, the securitizer adds liquidity and plays a strategic game with mortgage originators. The securitizer sets the price at which it will purchase mortgages and the credit score standard that qualifies a mortgage for purchase. We investigate two potential links between securitization and mortgage rates. First, we analyze whether a portion of the liquidity premium gets passed on to borrowers in the form of a lower mortgage rate. Somewhat surprisingly, we find plausible conditions under which securitization fails to lower the mortgage rate. Second, and consistent with recent empirical results, we derive an inverse correlation between the volume of securitization and mortgage rates. However, the causation is reversed from the standard rendering. In our model, a decline in the mortgage rate causes increased securitization rather than the other way around.

Keywords: Mortgages, mortgage securitization, credit scoring, mortgage rates

FEDS 2000-43
Factor Supplies and Specialization in the World Economy

James Harrigan and Egon Zakrajsek


A core prediction of the Heckscher-Ohlin theory is that countries specialize in goods in which they have a comparative advantage, and that the source of comparative advantage is differences in relative factor supplies. To examine this theory, we use the most extensive dataset available and document the pattern of industrial specialization and factor endowment differences in a broad sample of rich and developing countries over a lengthy period (1970-92). Next, we develop an empirical model of specialization based on factor endowments, allowing for unmeasurable technological differences, and estimate it using panel data techniques. In addition to estimating the effects of factor endowments, we also consider an alternative hypothesis that the level of aggregate productivity by itself can explain specialization. Our results clearly show the importance of factor endowments for specialization: relative endowments do matter.

Keywords: Heckscher-Ohlin theory, comparative advantage, relative factor supplies

FEDS 2000-42
Using Treasury STRIPS to Measure the Yield Curve

Brian Sack


Treasury STRIPS derived from coupon payments of notes and bonds provide an effective reading of the zero-coupon yield curve. Among their advantages, coupon STRIPS are zero-coupon securities, have a complete range of maturities, and are fungible, which appears to make the coupon STRIPS yield curve relatively smooth. Yields on coupon STRIPS are compared to the zero-coupon yield curves derived from notes and bonds under the Nelson-Siegel and the Fisher-Nychka-Zervos methods. The results point to some shortcomings of these approaches and indicate that the zero-coupon yield curve could be estimated more precisely from coupon STRIPS.

Keywords: Treasury STRIPS, yield curve, Treasury market

FEDS 2000-41
An Analysis of European Banks' SND Issues and Its Implications for the Design of a Mandatory Subordinated Debt Policy

Andrea Sironi


During the last twenty years, an increasing number of proposals to improve bank market discipline through the introduction of a mandatory subordinated debt policy have been drafted and critically discussed by academic economists and bank regulators. While theoretical issues are key in this debate, a proper understanding of the market for banks' subordinated notes and debentures (SND) and of the securities' main features is also relevant for the potential introduction, design, and goal setting of such a policy. This paper builds on information concerning issuers, investors, markets, and securities' technical features to critically discuss these aspects. Data on over 1,800 European banks' SND issues completed during the 1988-2000 period, together with information on primary and secondary market functioning, are presented.

Keywords: Capital regulation, bank, rating, subordinated debt

FEDS 2000-40
Testing for Market Discipline in the European Banking Industry: Evidence from Subordinated Debt Issues

Andrea Sironi


The question of whether private investors can rationally discriminate among the risks taken by banks is empirically investigated by testing the risk sensitivity of European banks' subordinated notes and debentures (SND) spreads. A unique dataset of issuance spreads, issue and issuer ratings, and accounting and market measures of bank risk is used for a sample of European banks' SND issued during the 1991-2000:Q1 period. Moody's Bank Financial Strength (MBFS) and FitchIBCA Individual (FII) ratings are used as proxies for bank risk and are found to perform better than accounting variables in explaining the cross-sectional variability of spreads. Empirical results support the hypothesis that SND investors are sensitive to bank risk. An exception to this conclusion is SND issued by public banks, i.e., government-owned or guaranteed institutions such as the German Landesbanks. Results also show that market discipline on European banks improved during the nineties, with the risk sensitivity of SND spreads increasing from the first to the second half of the decade.

Keywords: Market discipline, bank, rating, subordinated debt

FEDS 2000-39
Did U.S. Bank Supervisors Get Tougher During the Credit Crunch? Did They Get Easier During the Banking Boom? Did It Matter to Bank Lending?

Allen N. Berger, Margaret K. Kyle, and Joseph M. Scalise


We test three hypotheses regarding changes in supervisory "toughness" and their effects on bank lending. The data provide modest support for all three hypotheses: that there was an increase in toughness during the credit crunch period (1989-1992), that there was a decline in toughness during the boom period (1993-1998), and that changes in toughness, where they occurred, affected bank lending. However, all of the measured effects are small, with 1% or less of loans receiving harsher or easier classification, about 3% of banks receiving better or worse CAMEL ratings, and bank lending being changed by 1% or less of assets.

Keywords: Bank, lending, supervision, regulation, credit crunch

FEDS 2000-38
Do Minimum Wages Raise the NAIRU?

Peter Tulip


A high minimum wage (relative to average wages) raises nominal wage growth and hence inflation. This effect can be offset by extra unemployment, so the minimum wage increases the Non-Accelerating Inflation Rate of Unemployment, or NAIRU. This effect is clearly discernible and robust to variations in model specification and sample period. It is consistent with international comparisons and the behavior of prices. I estimate that the reduction in the relative level of the minimum wage over the last two decades accounts for a reduction in the NAIRU of about 1 1/2 percentage points. It can also account for the substantial reduction in the NAIRU in the United States relative to continental Europe.

Keywords: Minimum wage, NAIRU, unemployment

FEDS 2000-37
Efficiency Barriers to the Consolidation of the European Financial Services Industry

Allen N. Berger, Robert DeYoung, and Gregory F. Udell


Cross-border consolidation of financial institutions within Europe has been relatively limited, possibly reflecting efficiency barriers to operating across borders, including distance; differences in language, culture, currency, and regulatory/supervisory structures; and explicit or implicit rules against foreign competitors. EU policies such as the Single Market Programme and the European Monetary Union attenuate some but not all of these barriers. The evidence is consistent with the hypothesis that these barriers offset most of any potential efficiency gains from cross-border consolidation. Banks headquartered in other EU nations have slightly lower average measured efficiency than domestic banks and non-EU-based foreign banks.

Keywords: Banks, mergers, efficiency, Europe, financial institutions

FEDS 2000-36
The Integration of the Financial Services Industry: Where are the Efficiencies?

Allen N. Berger


We examine the efficiency effects of the integration of the financial services industry and suggest directions for future research. We also propose a relatively broad working definition of integration and employ U.S. and European data on financial service industry M&As to illustrate several types of integration. The analysis suggests that there is a large potential for efficiency gains from integration, but only a relatively small part of this potential may be realized. Integration appears to bring about larger revenue efficiency gains than cost efficiency gains, and most of the gains appear to be linked to benefits from risk diversification.

Keywords: Banks, insurance, securities firms, mergers, efficiency, international finance

FEDS 2000-35
A Guide to the Use of Chain Aggregated NIPA Data

Karl Whelan


In 1996, the U.S. Department of Commerce began using a new method to construct all aggregate "real" series in the National Income and Product Accounts (NIPA). This method employs the so-called "ideal chain index" pioneered by Irving Fisher. The new methodology has some extremely important implications that are unfamiliar to many practicing empirical economists; as a result, mistaken calculations with NIPA data have become very common. This paper explains the motivation for the switch to chain aggregation and then illustrates the usage of chain-aggregated data with three topical examples, each relating to a different aspect of how information technologies are changing the economy.

Keywords: NIPA Data, chain aggregation, information technologies

FEDS 2000-34
A Quantitative Defense of Stabilization Policy

Darrel Cohen


In an analysis of the value of growth and stabilization of consumption, Robert Lucas presents a stunning set of calculations implying that a permanent increase in the growth rate of consumption of only one-tenth of a percentage point per year is worth nearly 50 times as much to consumers as complete elimination of consumption variability. This is because the higher growth of consumption is worth a lot while the reduced variability is worth virtually nothing (at least in the post-war United States). Taken at face value, such a result supports the pursuit of feasible growth policies but calls into serious question the study and practice of macroeconomic stabilization policy even if complete elimination of variance were feasible and costless. Primarily by considering alternative meanings of stabilization, this paper establishes that the value of stabilization relative to the value of higher growth is about 100 times larger than the corresponding figure in Lucas. The new quantitative estimates suggest, assuming feasibility, that even a small permanent increase in the growth rate of consumption is worth a lot, but so too is stabilization in the alternative senses considered here.

Keywords: Stabilization policy, economic growth, truncated distributions

FEDS 2000-33
Deriving Inflation Expectations from Nominal and Inflation-Indexed Treasury Yields

Brian Sack


This paper derives a measure of inflation compensation from the yields of a Treasury inflation-indexed security and a portfolio of STRIPS that has similar liquidity and duration as the indexed security. This measure can be used as a proxy for inflation expectations if the inflation risk premium is small. The calculated measure suggests that the rate of inflation expected over the next ten years fell from just under 3% in mid-1997 to just under 1 3/4% by early 1999, before rising back to about 2 1/2% by the beginning of 2000. This variation is more extensive than would have been expected from a simple model of inflation dynamics or from a survey measure of long-run inflation expectations.
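
At its core, the inflation compensation measure described above is the spread between a nominal zero-coupon yield and a real (indexed) yield of matched maturity. A minimal sketch of the compounded breakeven calculation, using hypothetical yield values rather than the paper's data:

```python
# Sketch: inflation compensation as the compounded spread between a nominal
# zero-coupon (STRIPS portfolio) yield and a real (inflation-indexed) yield.
# The yields below are illustrative, not figures from the paper.

def breakeven_inflation(nominal_yield: float, real_yield: float) -> float:
    """Compounded breakeven pi solving (1 + i) = (1 + r) * (1 + pi)."""
    return (1.0 + nominal_yield) / (1.0 + real_yield) - 1.0

# Example: a 5.75% nominal ten-year yield against a 3.80% indexed yield
pi = breakeven_inflation(0.0575, 0.0380)
print(f"implied ten-year inflation compensation: {pi:.4%}")
```

This equals the expected inflation rate only up to an inflation risk premium and any liquidity differential between the two securities, which is why the paper matches the STRIPS portfolio's liquidity and duration to the indexed issue.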

Keywords: Inflation-indexed debt, inflation expectations, Treasury market

FEDS 2000-32
On Signal Extraction and Non-Certainty-Equivalence in Optimal Monetary Policy Rules

Eric T. Swanson


A standard result in the literature on monetary policy rules is that of certainty equivalence: given the expected values of all the state variables of the economy, policy should be set in a way that is independent of all higher moments of those variables. Some exceptions to this rule have been pointed out by Smets (1998), who restricts policy to respond to only a limited subset of state variables, and by Orphanides (1998), who restricts policy to respond to estimates of the state variables that are biased. In contrast, this paper studies unrestricted, fully optimal policy rules with optimal estimation of state variables. The rules in this framework exhibit certainty equivalence with respect to estimates of an unobserved, possibly complicated, state of the economy X, but are not certainty-equivalent when 1) a signal-extraction problem is involved in the estimation of X, and 2) the optimal rule is expressed as a reduced form that combines policymakers' estimation and policy-setting stages. In general, I show that it is optimal for policymakers to attenuate their reaction coefficient on a variable about which uncertainty has increased, while responding more aggressively to all other variables, about which uncertainty has not changed.

Keywords: Signal extraction, certainty equivalence, monetary policy rules, Taylor rule

FEDS 2000-31
Have the Doors Opened Wider? Trends in Homeownership by Race and Income

Raphael W. Bostic and Brian J. Surette


Homeownership among U.S. families increased notably in recent years, from 63.9% in 1989 to 66.2% in 1998. This paper examines this trend and the factors contributing to it. We find that (1) homeownership has risen for all racial, ethnic, and income groups, (2) the differences in homeownership between minority and non-minority families and between middle-income and lower-income families declined significantly, and (3) changes in family-related characteristics explain homeownership trends among only the top two income quintiles. Among the lower two income quintiles, family-related characteristics explain almost none of the increase in homeownership. This pattern of results suggests that changes in mortgage and housing markets and changes in the regulations that govern those markets, such as CRA and HMDA, account for the increase in homeownership among lower-income families.

Keywords: Homeownership, community reinvestment act, CRA, home mortgage disclosure act, HMDA, credit scoring

FEDS 2000-30
A Real Options Approach to Housing Investment

Chris Downing and Nancy Wallace


In this paper, we study investments by existing homeowners to improve their homes. The value of a house is modeled as the expected net present value of a perpetual stream of service flows emanating from the attributes of the house. An important innovation in our model is that the set of house attributes evolves over time according to the investment decisions of the homeowner. The homeowner's decisions to invest in house attributes are modeled as real options. Our model of investment embeds a multi-factor term structure model and a general model of the evolution of service flows. We employ numeric simulations to explore the properties of the investment model, and to motivate our empirical test of the model. Using a panel from the American Housing Survey, we test two implications of the real option theory. We test whether investment is more likely when the spread between the return to housing and the cost of capital is wide, and we test whether greater spread volatility depresses investment. The results indicate that homeowner investment behavior is consistent with the theory, even after controlling for business cycle, aging, tenure and for-sale influences.

Keywords: Housing, investment, real option, empirical, theory

FEDS 2000-29
Corporate Share Repurchases in the 1990s: What Role Do Stock Options Play?

Scott Weisbenner


This paper investigates how the growth of stock option programs has affected corporate payout policy. Given that earnings per share (EPS) is widely used in equity valuation, some corporations may opt to repurchase shares to avoid the dilution of EPS that results from past stock option grants. Executives may also prefer distributing cash by repurchasing shares or retaining more earnings, as opposed to increasing dividends, to enhance the value of their own stock options. This paper tests the importance of these two hypotheses using cross-sectional and panel data on stock option programs. I find that stock options granted to top executives affect payout policy differently than do stock options granted to other employees. Option grants in general are associated with increased share repurchases and increased total payouts. However, the larger the executives' holdings of stock options, the more apt the firm is to retain earnings and curtail cash distributions. Analysis of panel data for a sample of large firms suggests that firms conduct an ongoing repurchase of shares over the life of an option that undoes much of the dilution to EPS that results from past stock option grants.

Keywords: Share repurchase, stock option, payout policy

FEDS 2000-28
Robust Monetary Policy with Misspecified Models: Does Model Uncertainty Always Call for Attenuated Policy?

Robert J. Tetlow and Peter von zur Muehlen


This paper explores Knightian model uncertainty as a possible explanation of the considerable difference between estimated interest rate rules and optimal feedback descriptions of monetary policy. We focus on two types of uncertainty: (i) unstructured model uncertainty reflected in additive shock error processes that result from omitted-variable misspecifications, and (ii) structured model uncertainty, where one or more parameters are identified as the source of misspecification. For an estimated forward-looking model of the U.S. economy, we find that rules that are robust against uncertainty, the nature of which is unspecifiable, or against one-time parametric shifts, are more aggressive than the optimal linear quadratic rule. However, policies designed to protect the economy against the worst-case consequences of misspecified dynamics are less aggressive and turn out to be good approximations of the estimated rule. A possible drawback of such policies is that the losses incurred from protecting against worst-case scenarios are concentrated among the same business cycle frequencies that normally occupy the attention of policymakers.

Keywords: Model uncertainty, robust control, monetary policy, Stackelberg games

FEDS 2000-27
Has Compensation Become More Flexible?

Sandra A. Cannon, Bruce C. Fallick, Michael Lettau, and Raven Saks


In recent years, numerous observers have argued that global competition, increased reliance on contingent workers, and the breakdown of implicit contracts have made compensation practices in the United States more flexible; in particular, employers have become more concerned with how an employee's pay compares to that in other firms and less concerned with considerations of equity or relative pay within the firm. This paper uses establishment-level data from the Bureau of Labor Statistics' Employment Cost Index program to examine this claim by asking whether the variances of compensation within and between establishments have moved in a more "flexible" direction over the 1980s and 1990s. We find evidence consistent with increased flexibility.

Keywords: Wage inequality, wage compression, pay equity

FEDS 2000-26
Household Portfolios in the United States

Carol Bertaut and Martha Starr-McCluer


This paper investigates the composition of households' assets and liabilities in the United States. Using aggregate and survey data, we document major trends in household portfolios in the past 15 years. We show that, despite the broad array of financial products available, the portfolio of the typical household remains fairly simple and safe, consisting of a checking account, savings account, and tax-deferred retirement account; in 1998, less than half of all households owned some form of stock. We use pooled data from the Survey of Consumer Finances to investigate determinants of portfolio choice, finding significant effects of age, wealth, income risk, and entry/information costs.

Keywords: Household portfolios, portfolio choice, household wealth

FEDS 2000-25
Improving Grid-Based Methods for Estimating Value at Risk of Fixed-Income Portfolios

Michael S. Gibson and Matthew Pritsker


Jamshidian and Zhu (1997) propose a discrete grid method for simplifying the computation of Value at Risk (VaR) for fixed-income portfolios. Their method relies on two simplifications. First, the value of fixed income instruments is modeled as depending on a small number of risk factors chosen using principal components analysis. Second, they use a discrete approximation to the distribution of the portfolio's value. We show that their method has two serious shortcomings which imply it cannot accurately estimate VaR for some fixed-income portfolios. First, risk factors chosen using principal components analysis will explain the variation in the yield curve, but they may not explain the variation in the portfolio's value. This will be especially problematic for portfolios that are hedged. Second, their discrete distribution of portfolio value can be a poor approximation to the true continuous distribution. We propose two refinements to their method to correct these two shortcomings. First, we propose choosing risk factors according to their ability to explain the portfolio's value. To do this, instead of generating risk factors with principal components analysis, we generate them with a statistical technique called partial least squares. Second, we compute VaR with a "Grid Monte Carlo" method that uses continuous risk factor distributions while maintaining the computational simplicity of a grid method for pricing. We illustrate our points with several example portfolios where the Jamshidian-Zhu method fails to accurately estimate VaR, while our refinements succeed.

Keywords: Scenario simulation, principal components, partial least squares, Monte Carlo
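
The contrast between variance-maximizing and covariance-maximizing risk factors can be sketched in a few lines. This is an illustration of the general idea, not the authors' implementation: the two-factor yield-curve setup, loadings, and portfolio weights below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated yield-curve changes across 10 maturities: a dominant
# "level" factor and a smaller "slope" factor (hypothetical numbers).
n_obs, n_rates = 500, 10
level = rng.normal(0.0, 1.0, n_obs)
slope = rng.normal(0.0, 0.5, n_obs)
level -= level.mean()
slope -= slope.mean()
# Orthogonalize so the two factors are exactly uncorrelated in sample.
slope -= level * (level @ slope) / (level @ level)

load_level = np.ones(n_rates)                  # parallel shifts
load_slope = np.linspace(-1.0, 1.0, n_rates)   # tilts
X = np.outer(level, load_level) + np.outer(slope, load_slope)

# A level-hedged portfolio: weights sum to zero, so parallel shifts
# net out and the portfolio's value moves with the slope factor only.
weights = np.linspace(-1.0, 1.0, n_rates)
y = X @ weights

# First principal component: the direction explaining the most
# variation in the rates themselves (here, the level factor).
_, _, vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
pc1 = vt[0]

# First partial-least-squares weight vector: the direction of maximal
# covariance between the rate changes and the portfolio's value.
w = X.T @ (y - y.mean())
pls1 = w / np.linalg.norm(w)

def r2(score, y):
    """R^2 from regressing y on a single factor score."""
    f = score - score.mean()
    yc = y - y.mean()
    beta = (f @ yc) / (f @ f)
    resid = yc - beta * f
    return 1.0 - (resid @ resid) / (yc @ yc)

r2_pca = r2(X @ pc1, y)
r2_pls = r2(X @ pls1, y)
print(f"R^2 of portfolio value on first PC factor:  {r2_pca:.3f}")
print(f"R^2 of portfolio value on first PLS factor: {r2_pls:.3f}")
```

Because the portfolio is hedged against the dominant factor, the first principal component carries almost no information about its value, while the first partial-least-squares direction recovers it almost exactly.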

FEDS 2000-24
Microeconomic Inventory Adjustment: Evidence From U.S. Firm-Level Data

Egon Zakrajsek and Jonathan McCarthy


We examine inventory adjustment in the U.S. manufacturing sector using quarterly firm-level data over the period 1978-97. Our evidence indicates that the inventory investment process is nonlinear and asymmetric, results consistent with a nonconvex adjustment cost structure. The inventory adjustment process differs over the business cycle: for a given level of excess inventories, firms disinvest more in recessions than they do in expansions. The inventory adjustment process has changed little between the 1980s and 1990s, suggesting that recent advances in inventory control have had little effect on adjustment costs. Nevertheless, the optimal inventory-sales ratio in the durable goods sector has declined significantly during our sample period.

Full paper (892 KB Postscript)

Keywords: Inventories, (s,S) inventory policies, linear-quadratic model, business cycles

FEDS 2000-23
Assessing the Productivity of Public Capital with a Locational Equilibrium Model


This paper employs Roback's locational-equilibrium model of public-goods pricing, cross-sectional data from the Census of Population and Housing, and SMSA-level estimates of public capital stocks to examine the productive contribution of public capital. I find that public capital has a small positive impact on private output.

Keywords: Public capital, infrastructure, productivity

FEDS 2000-22
Purchasing Power Parity: Three Stakes Through the Heart of the Unit Root Null

Egon Zakrajsek and Matthew Higgins


We provide a comprehensive analysis of the purchasing power parity hypothesis, relying on a linear panel data framework. First, we consider two panel unit root tests, based on transformations of country-specific statistics, which allow for parameter heterogeneity across countries. Using GLS techniques, we modify the two tests to eliminate the upward size distortion induced by cross-sectional dependence among contemporaneous real exchange rate innovations. Second, we consider two tests based on a fixed-effects specification: these tests allow for cross-sectional dependence but impose parameter homogeneity. Three of the four tests provide emphatic support for real exchange rate stationarity during the post-Bretton Woods era among relatively open economies. Monte Carlo experiments indicate that the three tests have considerable power against the unit root null. One test allowing parameter heterogeneity provides mixed support for stationarity, but has only limited power against the null.

Keywords: Purchasing power parity, panel data unit root tests
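
The two testing frameworks contrasted in the abstract can be stated compactly. A typical ADF-style panel regression for real exchange rates (a standard formulation, not necessarily the paper's exact specification) is

```latex
\Delta q_{i,t} = \alpha_i + \rho_i\, q_{i,t-1}
  + \sum_{j=1}^{p_i} \phi_{i,j}\, \Delta q_{i,t-j} + \varepsilon_{i,t}
```

where $q_{i,t}$ is country $i$'s log real exchange rate. Tests allowing parameter heterogeneity leave $\rho_i$ free across countries and combine country-specific statistics; fixed-effects tests impose $\rho_i = \rho$ for all $i$. In both cases the unit root null is $\rho_i = 0$, against the stationary alternative $\rho_i < 0$.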

FEDS 2000-21
Does Multinationality Matter? Evidence of Value Destruction in U.S. Multinational Corporations

Reid W. Click and Paul Harrison


We document that capital markets penalize corporate multinationality by putting a lower value on the equity of multinational corporations than on otherwise similar domestic corporations. Using Tobin's q, the multinational discount is estimated to be in the range of 8.6% to 17.1%. The most important mechanism of value destruction is an asset channel in which multinationals have disproportionately high levels of assets in relation to the earnings they generate. Foreign assets are particularly associated with value destruction. In contrast, exporting from U.S. operations is associated with an export premium -- of approximately 3.9% -- resulting from both a higher market value and a lower asset size. Given these findings, we ask why firms become multinationals. Evidence reveals that the portion of a firm owned by management is inversely related to the likelihood that the firm is a multinational, so we conclude that managers who do not own much of the firm may be building multinational empires for private gains at the expense of the shareholders.

Keywords: Multinationality, Tobin's q, foreign assets
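
As a rough illustration of how such a discount is typically measured (a standard formulation, not necessarily the authors' exact specification), Tobin's q and the discount regression take the form

```latex
q_i = \frac{\text{market value of firm } i}{\text{replacement cost of its assets}},
\qquad
\ln q_i = \alpha + \delta\, \mathit{MNC}_i + \beta' X_i + \varepsilon_i
```

where $\mathit{MNC}_i$ indicates multinational status and $X_i$ collects firm-level controls. A negative estimate of $\delta$ -- roughly $-0.09$ to $-0.19$ in logs -- corresponds to the 8.6% to 17.1% discount reported above.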

FEDS 2000-20
The Resurgence of Growth in the Late 1990s: Is Information Technology the Story?

Stephen D. Oliner and Daniel E. Sichel


The performance of the U.S. economy over the past several years has been remarkable, including a rebound in labor productivity growth after nearly a quarter century of sluggish gains. To assess the role of information technology in the recent rebound, this paper re-examines the growth contribution of computers and related inputs with the same neoclassical framework that we have used in earlier work. Our results indicate that the contribution to productivity growth from the use of information technology -- including computer hardware, software, and communication equipment -- surged in the second half of the 1990s. In addition, technological advance in the production of computers appears to have contributed importantly to the speed-up in productivity growth. All in all, we estimate that the use of information technology and the production of computers accounted for about two-thirds of the 1 percentage point step-up in productivity growth between the first and second halves of the decade. Thus, to answer the question posed in the title of this paper, information technology largely is the story.

Keywords: Growth, productivity, computers, information technology, investment

FEDS 2000-19
Understanding Productivity: Lessons from Longitudinal Microdata

Mark E. Doms and Eric J. Bartelsman


This paper reviews research that uses longitudinal microdata to document productivity movements and to examine factors behind productivity growth. The research explores the dispersion of productivity across firms and establishments, the persistence of productivity differentials, the consequences of entry and exit, and the contribution of resource reallocation across firms to aggregate productivity growth. The research also reveals important factors correlated with productivity growth, such as managerial ability, technology use, human capital, and regulation. The more advanced literature in the field has begun to address the more difficult questions of the causality between these factors and productivity growth.

Full paper (1549 KB Postscript)

Keywords: Productivity, microdata

FEDS 2000-18
Dimensions of Credit Risk and Their Relationship to Economic Capital Requirements


Now in prospect is a major revision of international bank capital regulations that would embody recent advances in credit risk measurement and management. Previous regulations have been simpler in structure, with a primary goal of getting capital requirements right on average, and thus have largely ignored the difference between average and marginal. This paper presents evidence that explicit treatment in new regulations of several important dimensions of credit risk is necessary to limit banks' incentives to engage in capital arbitrage activities. Such activities, if unchecked, may lead to an increase in bank failure rates over time.

Keywords: Risk management, credit risk, bank regulation, capital requirements

FEDS 2000-17
Do Firms Share Their Success with Workers? The Response of Wages to Product Market Conditions

Marcello Estevao and Stacey Tevlin


We provide strong new evidence that industry financial conditions play an important role in wage determination in the U.S. manufacturing sector. Ordinary least squares estimates of the effect of rents per worker on wages are positive and significant, but quite small. However, using two standard bargaining models, we illustrate that this may stem from a variety of econometric difficulties that plague the OLS estimates. In this paper, we are able to overcome these issues and identify the effects of the industry financial situation on wages. We do this using the U.S. input-output tables to isolate exogenous variation in an industry's product market conditions. Our instrumental variable estimates reveal a substantial amount of rent sharing in U.S. manufacturing--much more than is consistent with a purely competitive labor market.

Full paper (378 KB Postscript)

Keywords: Wages, rent-sharing, profit-sharing

FEDS 2000-16
Heterogeneous Forecasts and Aggregate Dynamics

Antulio N. Bomfim


Motivated by issues raised in both the finance and economics literatures, I construct a dynamic general equilibrium model where agents use differing degrees of sophistication when forecasting future economic conditions. All agents solve standard dynamic optimization problems and face strategic complementarity in production, but some solve their inference problems based on simple forecasting rules of thumb. Assuming a hierarchical information structure similar to the one in Townsend's (1983) model of informationally dispersed markets, I show that even a minority of rule-of-thumb forecasters can have a significant effect on the aggregate properties of the economy. For instance, as agents try to forecast each other's behavior they effectively strengthen the internal propagation mechanism of the economy. The quantitative results are obtained by calibrating the model and running a battery of sensitivity tests on key parameters. The analysis highlights the role of strategic complementarity in the heterogeneous expectations literature and quantifies many qualitative claims about the aggregate implications of expectational heterogeneity.

Full paper (228 KB Postscript)

Keywords: Expectations, business cycles, strategic complementarity, propagation, bounded rationality

FEDS 2000-15
Anatomy of a Fair Lending Exam: The Uses and Limitations of Statistics

Paul S. Calem and Stanley D. Longhofer


In this paper, we consider the role of statistical analysis in fair lending compliance examinations. We present a case study of an actual examination of a large mortgage lender, demonstrating how statistical techniques can be a valuable tool for focusing examiner efforts to either uncover illegal discrimination or exonerate an institution so accused. Importantly, our case also highlights the limitations of such statistical techniques. The study suggests that statistical analysis combined with comparative file review offers a balanced and thorough approach to enforcement of fair lending laws.

Full paper (1217 KB Postscript)

Keywords: Banks, other depository institutions, mortgages

FEDS 2000-14
Making News: Financial Market Effects of Federal Reserve Disclosure Practices

Antulio N. Bomfim and Vincent R. Reinhart


As recently as early 1994, market participants had to infer the stance of U.S. monetary policy from the type and size of the open market operations conducted by the Federal Reserve's Trading Desk. Thus, investors were exposed to uncertainty about both the timing and the motivation for monetary policy actions. Since then, changes in disclosure practices regarding monetary policy decisions have potentially mitigated both types of uncertainty. We examine the effects of the greater openness and transparency of these new practices on the way a wide array of financial market instruments respond to unanticipated policy decisions. In general, the financial markets' response to policy does not seem to be related to what the Federal Reserve says after a surprise decision is announced or to when it decides to act. The invariance of the response of asset prices to policy across time and announcement regimes suggests that what the Federal Reserve says when it acts is of second-order importance to the act itself.

Keywords: Monetary policy transparency, expectations, asset prices

FEDS 2000-13
Activist Stabilization Policy and Inflation: The Taylor Rule in the 1970s

Athanasios Orphanides


A number of recent studies have suggested that activist stabilization policy rules responding to inflation and the output gap can attain simultaneously a low and stable rate of inflation as well as a high degree of economic stability. The foremost example of such a strategy is the policy rule proposed by Taylor (1993). In this paper, I demonstrate that the policy settings that would have been suggested by this rule during the 1970s, based on real-time data published by the U.S. Commerce Department, do not greatly differ from actual policy during this period. To the extent macroeconomic outcomes during this period are considered unfavorable, this raises questions regarding the usefulness of this strategy for monetary policy. To the extent the Taylor rule is believed to provide a reasonable guide to monetary policy, this finding raises questions regarding earlier critiques of monetary policy during the 1970s.

Full paper (1122 KB Postscript)

Keywords: Great Inflation, Taylor rule, output gap, real-time data
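
For reference, the rule proposed by Taylor (1993) sets the federal funds rate as

```latex
i_t = \pi_t + r^* + 0.5\,(\pi_t - \pi^*) + 0.5\, y_t,
\qquad r^* = \pi^* = 2
```

where $\pi_t$ is inflation over the previous four quarters, $y_t$ is the percent output gap, and $r^*$ and $\pi^*$ are the assumed equilibrium real rate and inflation target. The exercise in the paper evaluates this rule with $y_t$ and $\pi_t$ constructed from the real-time Commerce Department data that would have been available during the 1970s, rather than from subsequently revised data.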

FEDS 2000-12
The Growth of Consumer Credit and the Household Debt Service Burden

Dean M. Maki


Household debt is at a record high relative to disposable income. Some analysts are concerned that this unprecedented level of debt might pose a risk to the financial health of American households and ultimately lead them to curtail their spending. In this paper, I summarize some of the relevant facts concerning the growth of consumer credit and the household debt service burden, outline the results of the research that has been conducted in this area, and look at the questions that might be answered with additional research.

Keywords: Consumer credit, household debt, consumption, debt service burden

FEDS 2000-11
Explaining the Investment Boom of the 1990s

Stacey Tevlin and Karl Whelan


Real equipment investment in the United States has boomed in recent years, led by soaring investment in computers. We find that traditional aggregate econometric models completely fail to capture the magnitude of this recent growth--mainly because these models neglect to address two features that are crucial (and unique) to the current investment boom. First, the pace at which firms replace depreciated capital has increased. Second, investment has been more sensitive to the cost of capital. We document that these two features stem from the special behavior of investment in computers and therefore propose a disaggregated approach. This produces an econometric model that successfully explains the 1990s equipment investment boom.

Full paper (855 KB Postscript)

Keywords: Investment, cost of capital, depreciation, computers

FEDS 2000-10
The Timing of Debt Issuance and Rating Migrations: Theory and Evidence

Dan Covitz and Paul Harrison


This paper develops and tests a recursive model of debt issuance and rating migration. We examine a signaling game with firms that have private information about their probability distribution of future rating migration. A key assumption of the model is that rating agencies reveal information over time, creating a recursive information problem, which in turn generates an adverse selection problem in debt issuance similar to that for equity issuance in Myers and Majluf (1984). This adverse selection model predicts that debt issuance provides a negative signal of rating migration, and that the signal strengthens with economic downturns. Another prediction regarding the maturity of debt issuance is that long maturity debt sends a negative signal relative to short maturity debt (Flannery 1986). Using data from 1980 to 1998 on straight bond issuance and Moody's ratings, and controlling for firm and issue-specific factors, we find that debt issuance sends a negative signal of a firm's default probability, and that this signal intensifies with a decline in economic activity and with an increase in debt maturity.

Keywords: Debt issuance, rating migrations

FEDS 2000-09
Generational Aspects of Medicare

Louise Sheiner and David Cutler


This paper examines the generational aspect of the current Medicare system and some stylized reforms. We find that the rates of return on Medicare for today's workers are higher than those for Social Security and that the Medicare system shifts a greater share of its burden onto future workers than Social Security does. Nonetheless, the rates of return on Medicare, using the Medicare Trustees' assumptions, are still not that high--roughly 2 percent for today's youngest workers. But forecasting future Medicare expenditures is quite difficult. Under an alternative higher-cost baseline, which we consider plausible, rates of return for today's youngest workers will exceed 3 percent. Putting Medicare on a sustainable basis by raising the payroll tax or reducing benefits would greatly reduce the rate of return for today's workers. Under the Trustees' assumptions, for example, the payroll tax would have to be increased by 2.0 percent of payroll to put the Medicare system in balance in perpetuity. This policy would reduce the rate of return for today's youngest workers to about 1.3 percent.

Keywords: Medicare, generational
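
The rates of return cited here are internal rates of return: the discount rate $r$ that equates the present value of a cohort's lifetime Medicare taxes to the present value of its expected benefits. This is a stylized statement of the calculation, not the paper's exact implementation:

```latex
\sum_{t} \frac{T_t}{(1+r)^t} = \sum_{t} \frac{B_t}{(1+r)^t}
```

where $T_t$ and $B_t$ are the cohort's payroll taxes paid and benefits received in year $t$. Shifting more of the burden onto future workers raises $T_t$ for later cohorts without raising their $B_t$, lowering their internal rate of return.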

FEDS 2000-08
The Effects of Weather on Retail Sales

Martha Starr-McCluer


Monthly fluctuations in consumer spending are often attributed to the weather. This paper presents a model in which weather affects the productivity of time in nonmarket activities (such as shopping or recreation), and so, via time and budget constraints, may induce substitution in spending across goods and over time. Using monthly data on retail sales and weather data from the National Weather Service, I find that unusual weather has a modest but significant role in explaining monthly sales fluctuations. However, lagged effects often offset original effects, so that weather's influence tends to wash out at a quarterly frequency.

Keywords: Consumption, retail sales, weather

FEDS 2000-07
On Identification of Continuous Time Stochastic Processes

Jeremy Berkowitz


In this note we delineate conditions under which continuous time stochastic processes can be identified from discrete data. The identification problem is approached in a novel way. The distribution of the observed stochastic process is expressed as the underlying true distribution, f, transformed by some operator, T. Using a generalization of the Taylor series expansion, the transformed function T f can often be expressed as a linear combination of the original function f. By combining the information across a large number of such transformations, the original measurable function of interest can be recovered.

Keywords: Identification, continuous

FEDS 2000-06
Computers, Obsolescence, and Productivity

Karl Whelan


This paper examines the role that computers have played in boosting U.S. economic growth in recent years. The paper focuses on two effects--the effect of increased productivity in the computer-producing sector and the effect of investments in computing equipment on the productivity of those who use them--and concludes that together they account for almost all of the recent acceleration in U.S. labor productivity. In calculating the computer-usage effect, standard NIPA measures of the capital stock are inappropriate for growth accounting because they do not account for technological obsolescence; this occurs when a machine that is still productive is retired because it is no longer near the technological frontier. Using a theoretical framework that explicitly accounts for technological obsolescence, alternative estimates of the computer capital stock are developed that imply larger effects on growth of computer capital accumulation than are suggested by the NIPA stocks.

Keywords: Computer, productivity, obsolescence

FEDS 2000-05
Stock Prices and Fundamentals in a Production Economy


This paper compares the predictions for the market value of firms from the Gordon growth model with those from a dynamic general equilibrium model of production. The predictions for movements in the market value of firms in response to a decline in the required return or an increase in the growth rate of the economy are quantitatively and qualitatively different across the models. While previous research has illustrated how a drop in the required return or an increase in the growth rate of the economy can explain the runup in equity values in the 1990s in the Gordon growth model, the consideration of production overturns these results and illustrates that auxiliary implications of such shifts in fundamentals, such as a sharp increase in the investment intensity of the economy, are not supported by the data in the late 1990s. This tension between theory and data suggests that the skyrocketing market value of firms in the second half of the 1990s may reflect a degree of irrational exuberance.

Keywords: Asset pricing, investment
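
The Gordon growth model used as the benchmark values the firm as a growing perpetuity:

```latex
P = \frac{D_1}{r - g}
```

where $D_1$ is next period's payout, $r$ the required return, and $g$ the growth rate. When $r - g$ is small, a modest fall in $r$ or rise in $g$ raises $P$ sharply, which is how such shifts can rationalize the 1990s run-up in equity values; the paper's point is that in a production economy the same shifts carry auxiliary implications--notably a sharp rise in investment intensity--that the data do not support.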

FEDS 2000-04
Globalization of Financial Institutions: Evidence from Cross-Border Banking Performance

Allen N. Berger, Robert DeYoung, Hesna Genay, and Gregory F. Udell


We address the causes, consequences, and implications of the cross-border consolidation of financial institutions by reviewing several hundred studies, providing comparative international data, and estimating cross-border banking efficiency in France, Germany, Spain, the U.K., and the U.S. during the 1990s. We find that, on average, domestic banks have higher profit efficiency than foreign banks. However, banks from at least one country (the U.S.) appear to operate with relatively high efficiency both at home and abroad. If these results continue to hold, they do not preclude successful international expansion by some financial firms, but they do suggest limits to global consolidation.

Keywords: Banks, mergers, small businesses, x-efficiency, international finance

FEDS 2000-03
Should America Save for its Old Age? Population Aging, National Saving, and Fiscal Policy

Douglas Elmendorf and Louise Sheiner


While popular wisdom holds that the United States should save more now in anticipation of the aging of the baby boom generation, the optimal response to population aging from a macroeconomic perspective is not clear-cut. Indeed, Cutler, Poterba, Sheiner, and Summers ("CPSS", 1990) argued that the optimal response to the coming demographic transition was more likely to be a reduction in national saving than an increase. In this paper we reexamine this question. In particular, we ask how the optimal saving response depends on the openness of our economy, on how we view the consumption of children, and on the existence of pay-as-you-go transfer programs like Social Security and Medicare. We find that, if the United States were a small open economy and world interest rates were fixed at their current level, the desire to smooth consumption as our population aged would lead us to increase saving today. But the optimal response in a closed economy is much less clear-cut, as slower growth of the labor force will push down the rate of return on capital and diminish desired saving. For reasonable parameters, the optimal response to our aging population in a closed economy is likely to be small--either a small decline in national saving or a small increase. We also explore the role of the government in population aging. Government programs can influence consumption if they affect the capital-labor ratio or the relative weight that society places on the consumption of the elderly.

Keywords: Aging, saving

FEDS 2000-02
Real Wage Dynamics and the Phillips Curve

Karl Whelan


Since Friedman (1968), the traditional derivation of the accelerationist Phillips curve has related expected real wage inflation to the unemployment rate and then invoked markup pricing and adaptive expectations to generate the accelerationist price inflation equation. Blanchflower and Oswald (1994) have argued that microeconomic evidence of a low autoregression coefficient in real wage regressions invalidates this approach, a conclusion that has been disputed widely on the grounds that the true autoregression coefficient is close to one. This paper shows that the accelerationist relationship between the change in price inflation and the unemployment rate is consistent with any type of microeconomic real wage dynamics. However, these dynamics will determine how supply shocks affect inflation. Evidence on supply shocks and inflation points against the traditional real wage formulation. Implications for the recent behavior of the NAIRU are explored.

Keywords: Phillips curve, inflation, NAIRU

FEDS 2000-01
Inflation Targeting and Target Instability


Monetary policy is modeled as governed by a known rule, except for a time-varying target rate of inflation. The variable target is taken as representing either discretionary deviations from the rule, or as the outcome of a policymaking committee that is unable to arrive at a consensus. Stochastic simulations of FRB/US, the Board of Governors' large, rational-expectations model of the U.S. economy, are used to examine the benefits of reducing the variability in the target rate of inflation. We find that putting credible boundaries on target variability introduces an important non-linearity in expectations. This improves policy performance by focusing agents' expectations on policy objectives. But improvements are limited; it does not generally pay to reduce target variability to zero. The non-linearity in expectations can be used to conduct a policy with greater attention to output stabilization than otherwise. The results provide insights as to why inflation-targeting countries use bands and why the bands are narrower than studies suggest they should be. Finally, a numerical technique that approximates a non-linear process to arbitrary precision with a linear method is demonstrated. This greatly speeds the simulations and makes them more robust.

Full paper (453 KB Postscript)

Keywords: Monetary policy, inflation targeting, macroeconomic modeling, computational methods

Last Update: December 01, 2023