The authors study the implications of internal consumption habit for new Keynesian dynamic stochastic general equilibrium (NKDSGE) models. Bayesian Monte Carlo methods are employed to evaluate NKDSGE model fit. Simulation experiments show that internal consumption habit often improves the ability of NKDSGE models to match the spectra of output and consumption growth. Nonetheless, the fit of NKDSGE models with internal consumption habit is susceptible to the sources of nominal rigidity, to spectra identified by permanent productivity shocks, to the choice of monetary policy rule, and to the frequencies used for evaluation. These vulnerabilities indicate that the specification of NKDSGE models is fragile.
(449 KB, 29 pages)
Around the globe, credit bureaus restrict the length of time that negative credit information can be retained. By exploiting quasi-experimental variation in retention times of negative credit information, we find that a prolonged retention time increases the need for and access to credit and reduces the likelihood of default. In both regimes, less than 27 percent of individuals default again within two years after removal, suggesting either that only a minority is inherently high risk or that removal of credit arrears induces borrowers to exert greater effort. Either interpretation raises the possibility that forgetting defaults is welfare enhancing.
(882 KB, 33 pages)
This paper examines the interactions of macroprudential policy and monetary policy in a New Keynesian DSGE model with financial frictions. Macroprudential policy can stabilize credit cycles. However, a macroprudential instrument that aims to stabilize a specific segment of the credit market can cause regulatory arbitrage, that is, a reallocation of credit to a less regulated part of the market. Within this model, welfare-maximizing monetary policy aims to stabilize only inflation and macroprudential policy only stabilizes credit. Two aspects of the model account for this dichotomy. First, credit stabilization is welfare improving because lower volatility is compensated by higher mean equilibrium credit and capital. Second, monetary policy is sub-optimal for credit stabilization. The reason is that it operates on the decisions of borrowers and savers, while macroprudential policy operates only on the decisions of borrowers.
(495 KB, 41 pages)
The authors develop a sequential Monte Carlo (SMC) algorithm for estimating Bayesian dynamic stochastic general equilibrium (DSGE) models, wherein a particle approximation to the posterior is built iteratively through tempering the likelihood. Using three examples (an artificial state-space model, the Smets and Wouters (2007) model, and Schmitt-Grohé and Uribe’s (2012) news shock model), the authors show that the SMC algorithm is better suited for multimodal and irregular posterior distributions than the widely used random walk Metropolis-Hastings algorithm. Unlike standard Markov chain Monte Carlo (MCMC) techniques, the SMC algorithm is well suited for parallel computing.
(826 KB, 64 pages)
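The likelihood-tempering idea behind an SMC sampler of this kind can be sketched in a few lines. The sketch below is a toy illustration, not the authors' DSGE implementation: the normal prior and likelihood, the particle count, the number of stages, and the proposal scale are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: data y ~ N(theta, 1) with a N(0, 3^2) prior on theta.
y = np.array([1.8, 2.2, 2.0, 1.9, 2.1])

def log_prior(theta):
    return -0.5 * theta**2 / 9.0

def log_lik(theta):
    # Vectorized over a particle array theta of shape (P,).
    return -0.5 * np.sum((y[None, :] - theta[:, None])**2, axis=1)

def smc_tempered(n_particles=2000, n_stages=20, mh_scale=0.5):
    """Likelihood-tempering SMC: move particles from the prior toward the
    posterior through targets proportional to p(theta) * L(theta)^phi_n,
    with phi_n rising from 0 to 1."""
    phis = np.linspace(0.0, 1.0, n_stages + 1)
    theta = rng.normal(0.0, 3.0, n_particles)        # draws from the prior
    for n in range(1, n_stages + 1):
        # Correction: reweight by the likelihood raised to the phi increment.
        logw = (phis[n] - phis[n - 1]) * log_lik(theta)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Selection: multinomial resampling resets the weights to uniform.
        theta = theta[rng.choice(n_particles, n_particles, p=w)]
        # Mutation: one random walk Metropolis step per particle, targeting
        # the current tempered posterior, keeps the cloud diverse.
        prop = theta + mh_scale * rng.normal(size=n_particles)
        log_ratio = (log_prior(prop) + phis[n] * log_lik(prop)
                     - log_prior(theta) - phis[n] * log_lik(theta))
        accept = np.log(rng.uniform(size=n_particles)) < log_ratio
        theta = np.where(accept, prop, theta)
    return theta

draws = smc_tempered()
```

Each stage reweights particles by a small power of the likelihood, resamples, and rejuvenates with a Metropolis step; because reweighting and mutation act on particles independently, the inner steps parallelize naturally, which is the property the abstract highlights.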
The authors develop a new dynamic general equilibrium model to explain firm entry, exit, and relocation decisions in an urban economy with multiple locations and agglomeration externalities. They characterize the stationary distribution of firms that arises in equilibrium. They estimate the parameters of the model using a method of moments estimator. Using unique panel data collected by Dun and Bradstreet, the authors find that their model fits the moments used in estimation as well as a set of moments that the authors use for model validation. Agglomeration externalities increase the productivity of firms by about 8 percent. Economic policies that subsidize firm relocations to the central business district increase agglomeration externalities in that area. They also increase economic welfare in the urban economy.
(420 KB, 55 pages)
A tractable production-externality-based circular city model in which both firms and workers choose location as well as intensity of land use is presented. The equilibrium structure of the city has either (i) no commuting ("mixed-use" form) or (ii) a central business district (CBD) of positive radius and a surrounding residential ring. Regardless of which form prevails, the intra-city variation in all endogenous variables displays the negative exponential form x(r) = x(0)e^(−θ_x r), where r is the distance from the city center and θ_x depends only on preference and technology parameters. An application is presented wherein it is shown that population growth may lead to a smaller increase in land rents in cities that cannot expand physically because these cities are less able to exploit the external effect of greater employment density.
(514 KB, 40 pages)
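The negative exponential form in the abstract above is easy to work with numerically. A minimal sketch, using purely illustrative numbers rather than estimates from the paper, evaluates the gradient and backs out θ_x from two observed densities:

```python
import math

def density(r, x0, theta_x):
    """Negative exponential intra-city gradient: x(r) = x(0) * exp(-theta_x * r)."""
    return x0 * math.exp(-theta_x * r)

def implied_gradient(r, x0, xr):
    """Back out theta_x from the density at the center, x0, and the
    density xr observed at distance r: theta_x = ln(x0 / xr) / r."""
    return math.log(x0 / xr) / r

# Illustrative numbers only (not estimates from the paper): a center
# density of 100 falling by the factor e^(-1.5) five miles out.
xr = density(5.0, 100.0, 0.3)            # about 22.31
theta = implied_gradient(5.0, 100.0, xr)  # recovers the gradient 0.3
```

Because log x(r) is linear in r, the parameter θ_x can be read off a semi-log regression of any endogenous variable on distance from the center.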
This paper explores the hypothesis that the sources of economic and financial crises differ from non-crisis business cycle fluctuations. The authors employ Markov-switching Bayesian vector autoregressions (MS-BVARs) to gather evidence about the hypothesis on a long annual U.S. sample running from 1890 to 2010. The sample covers several episodes useful for understanding U.S. economic and financial history, which generate variation in the data that aids in identifying credit supply and demand shocks. They identify these shocks within MS-BVARs by tying credit supply and demand movements to inside money and its intertemporal price. The model space is limited to stochastic volatility (SV) in the errors of the MS-BVARs. Of the 15 MS-BVARs estimated, the data favor an MS-BVAR in which economic and financial crises and non-crisis business cycle regimes recur throughout the long annual sample. The best-fitting MS-BVAR also isolates SV regimes in which shocks to inside money dominate aggregate fluctuations.
(377 KB, 48 pages)
Because of lags in legislating and implementing fiscal policy, private agents can often anticipate future changes in tax policy and government spending before these changes actually occur, a phenomenon referred to as fiscal foresight. Econometric analysis that fails to model fiscal foresight may obtain tax and spending multipliers that are biased. One way researchers have attempted to deal with the problem of fiscal foresight is by examining the narrative history of government revenue and spending news. The Great Recession and efforts by the federal government through the American Recovery and Reinvestment Act of 2009 (ARRA) to stimulate the economy returned fiscal policy, and in particular the role of state and local governments in such policies, to the center of macroeconomic policymaking. In a companion paper, the authors use federal grants-in-aid to state and local governments to provide an evaluation of the effectiveness of the ARRA. The purpose of this paper is to develop narrative measures of the federal grants-in-aid programs beginning with the Federal Highway Act of 1956 through the ARRA of 2009. The narrative measures they develop will be used as instruments for federal grants-in-aid in their subsequent analysis of the ARRA.
(537 KB, 34 pages)
The authors study the location of more than 1,000 research and development (R&D) labs in the Northeast corridor of the U.S. Using a variety of spatial econometric techniques, they find that these labs are substantially more concentrated in space than the underlying distribution of manufacturing activity. Ripley’s K-function tests over a variety of spatial scales reveal that the strongest evidence of concentration occurs at two discrete distances: one at about one-quarter of a mile and another at about 40 miles. They also find that R&D labs in some industries (e.g., chemicals, including drugs) are substantially more spatially concentrated than are R&D labs as a whole. Tests using local K-functions reveal several concentrations of R&D labs that appear to represent research clusters. They verify this conjecture using significance-maximizing techniques (e.g., SATSCAN) that also address econometric issues related to “multiple testing” and spatial autocorrelation. The authors develop a new procedure, the multiscale core-cluster approach, to identify labs that appear to be clustered at a variety of spatial scales. Locations in these clusters are often related to basic infrastructure such as access to major roads. There is significant variation in the industrial composition of labs across these clusters. The clusters the authors identify appear related to knowledge spillovers: Citations to patents previously obtained by inventors residing in clustered areas are significantly more localized than one would predict from a (control) sample of otherwise similar patents.
(7.12 MB, 45 pages)
The authors build a New Keynesian model in which heterogeneous workers differ with regard to their employment status due to search and matching frictions in the labor market, their potential labor income, and their amount of savings. They use this laboratory to quantitatively assess who stands to win or lose from unanticipated monetary accommodation and who benefits most from systematic monetary stabilization policy. They find substantial redistribution effects of monetary policy shocks; a contractionary monetary policy shock increases income and welfare of the wealthiest 5 percent, while the remaining 95 percent experience lower income and welfare. Consequently, the negative effect of a contractionary monetary policy shock to social welfare is larger if heterogeneity is taken into account.
(399 KB, 49 pages)
The authors study empirically and theoretically the growth of U.S. manufacturing exports from 1987 to 2007. They identify the change in iceberg costs with plant-level data on the intensity of exporting by exporters. Given this change in iceberg costs, they find that a general equilibrium (GE) model with heterogeneous establishments and a sunk cost of starting to export is consistent with both aggregate U.S. export growth and the changes in the number and size of U.S. exporters. The model also captures the non-linear dynamics of U.S. export growth. A model without a sunk export cost generates substantially less trade growth and misses the timing of export growth. Contrary to the theory, employment was largely reallocated from very large establishments, those with more than 2,500 employees, toward very small manufacturing establishments, those with fewer than 100 employees. Allowing for faster productivity growth in manufacturing, changes in capital intensity, and some changes in the underlying shock process makes the theory consistent with the changes in the employment size distribution. The authors also find that the contribution of trade to the contraction in U.S. manufacturing employment is small.
Supersedes Working Paper 10-10.
(414 KB, 60 pages)
The author shows that a purely private monetary system is inherently unstable due to the role of endogenous debt limits in the creation of private money. Because people’s ability to issue notes (personal liabilities that circulate as a medium of exchange) depends on beliefs about the exchange value of their notes in future periods, there exist multiple equilibria. Some of these equilibria have undesirable properties: Self-fulfilling panics are possible outcomes. In response to this inherent instability of private money, the author formulates a government intervention that ensures the determinacy of equilibrium. In particular, the author defines an operational procedure for a monetary authority capable of ensuring the stability and efficiency of the monetary system.
(360 KB, 36 pages)
The authors develop an empirical framework for the credit risk analysis of a generic portfolio of revolving credit accounts and apply it to analyze a representative panel data set of credit card accounts from a credit bureau. These data cover the period of the most recent deep recession and provide the opportunity to analyze the performance of such a portfolio under significant economic stress conditions. They consider a traditional framework for the analysis of credit risk where the probability of default (PD), loss given default (LGD), and exposure at default (EAD) are explicitly considered. The unsecured and revolving nature of credit card lending is naturally modeled in this framework. The authors' results indicate that unemployment, and in particular the level and change in unemployment, plays a significant role in the probability of transition across delinquency states in general and the probability of default in particular. The effect is heterogeneous and proportionally has a more significant impact for high-credit-score and high-utilization accounts. Their results also indicate that unemployment and a downturn in economic conditions play a quantitatively small, or even irrelevant, role in the changes in account balance associated with changes in an account's delinquency status, and in the exposure at default specifically. The impact of a downturn in economic conditions and, in particular, changes in unemployment on the recovery rate and loss given default is found to be large. These findings are of particular relevance for the analysis of credit risk regulatory capital under the IRB approach within the Basel II capital accord.
(470 KB, 46 pages)
The Innovation Union initiative of the European Union focuses on product and process innovation for tangible goods. The authors argue that it is essential to extend the scope of the initiative to include innovation for financial sector products, processes, and regulatory approaches. They make this argument using examples of financial sector innovations in the United States following the Great Depression and on the basis of an examination of the 2008 financial crisis.
(199 KB, 36 pages)
The large, persistent fluctuations in international trade that cannot be explained in standard models by changes in expenditures and relative prices are often attributed to trade wedges. The authors show that these trade wedges can reflect the decisions of importers to change their inventory holdings. They find that a two-country model of international business cycles with an inventory management decision can generate trade flows and wedges consistent with the data. Moreover, matching trade flows alters the international transmission of business cycles. Specifically, real net exports become countercyclical and consumption is less correlated across countries than in standard models. The authors also show that ignoring inventories as a source of trade wedges substantially overstates the role of trade wedges in business cycle fluctuations.
(567 KB, 52 pages)
Participants in student loan programs must repay loans in full regardless of whether they complete college. But many students who take out a loan do not earn a degree (the dropout rate among college students is between 33 and 50 percent). The authors examine whether insurance, in the form of loan forgiveness in the event of failure to complete college, can be offered, taking into account moral hazard and adverse selection. To do so, they develop a model that accounts for college enrollment and graduation rates among recent U.S. high school graduates. In their model, students may fail to earn a degree because they either fail college or choose to leave voluntarily. The authors find that if loan forgiveness is offered only when a student fails college, average welfare increases by 2.40 percent (in consumption equivalent units) without much effect on either enrollment or graduation rates. If loan forgiveness is offered against both failure and voluntary departure, welfare increases by 2.15 percent and both enrollment and graduation are higher.
(359 KB, 30 pages)
An important source of inefficiency in long-term debt contracts is the debt dilution problem, wherein a borrower ignores the adverse impact of new borrowing on the market value of outstanding debt and, therefore, borrows too much and defaults too frequently. A commonly proposed remedy to the debt dilution problem is seniority of debt, wherein creditors who lent first are given priority in any bankruptcy or restructuring proceedings. The goal of this paper is to incorporate seniority in a quantitatively realistic, infinite horizon model of sovereign debt and default and examine, both theoretically and quantitatively, the extent to which seniority can mitigate the debt dilution problem.
(523 KB, 39 pages)
Self-regulation encouraged by market discipline constitutes a key component of Basel II's third pillar. But high-risk investment strategies may maximize the expected value of some banks. In these cases, does market discipline encourage risk-taking that undermines bank stability in economic downturns? This paper reviews the literature on corporate control in banking. It covers techniques for assessing bank performance; the interaction of regulation and the federal safety net with market discipline in shaping risk-taking incentives and stability; and sources of market discipline, including ownership structure, capital market discipline, product market competition, labor market competition, boards of directors, and compensation.
(282 KB, 28 pages)
The authors estimate the cost savings to the U.S. payment system resulting from implementing Check 21. This legislation initially permitted a paper substitute created from a digital image of a check, and later the electronic digital image itself, to be processed and presented for payment on a same-day basis. Check 21 has effectively eliminated the processing and presentment of original paper checks over multiple days. By shifting to electronic collection and presentment, the Federal Reserve reduced its per-item check processing costs by over 70 percent, reducing estimated overall payment system costs by $1.16 billion in 2010. In addition, payment collection times and associated float fell dramatically for collecting banks and payees, with consequent additional savings in firm working capital costs of perhaps $1.37 billion and consumer benefits of $0.64 billion.
(282 KB, 28 pages)
The authors establish a fundamental relationship between the return on the banking sector's assets and each banker's willingness to supply liabilities that facilitate payments and settlement (private money). In particular, they show that the regulation of lending practices is necessary for the optimal provision of private money. In an environment in which bankers cannot commit to their promises, an unregulated banking sector fails to implement an efficient allocation. The authors show that an intervention that raises the value of the bankers' assets (e.g., by regulating lending practices) will make them willing to offer a higher return on their liabilities. In particular, if the return on their assets is made sufficiently large, then it is possible to implement an efficient allocation with private money.
(396 KB, 41 pages)
Motivated by the recent experience of the U.S. and the Eurozone, the authors describe the quantitative properties of a New Keynesian model with a zero lower bound (ZLB) on nominal interest rates, explicitly accounting for the nonlinearities that the bound brings. Besides showing how such a model can be efficiently computed, the authors find that the behavior of the economy is substantially affected by the presence of the ZLB. In particular, the authors document 1) the unconditional and conditional probabilities of hitting the ZLB; 2) the unconditional and conditional probability distributions of the duration of a spell at the ZLB; 3) the responses of output to government expenditure shocks at the ZLB; 4) the distribution of shocks that send the economy to the ZLB; and 5) the distribution of shocks that keep the economy at the ZLB.
(470 KB, 47 pages)
Economists have tried to uncover stylized facts about people's expectations, testing whether such expectations are rational. Tests in the early 1980s suggested that expectations were biased, and some economists took irrational expectations as a stylized fact. But, over time, the results of tests that led to such a conclusion were reversed. In this paper, the author examines how tests for bias in expectations, measured using the Survey of Professional Forecasters, have changed over time. In addition, key macroeconomic variables that are the subject of forecasts are revised over time, causing problems in determining how to measure the accuracy of forecasts. The results of bias tests are found to depend on the subsample in question, as well as what concept is used to measure the actual value of a macroeconomic variable. Thus, the author's analysis takes place in two dimensions: across subsamples and with alternative measures of realized values of variables.
(1.11 MB, 32 pages)
Superseded by Working Paper 13-14 (517 KB, 52 pages)
This paper is the first to document the presence of a private premium in public bonds. The authors find that spreads are 31 basis points higher for public bonds of private companies than for bonds of public companies, even after controlling for observable differences, including rating, financial performance, industry, bond characteristics, and issuance timing. The estimated private premium increases to 40 to 50 basis points when a propensity matching methodology is used or when they control for fixed issuer effects. Despite the premium pricing, bonds of private companies are no more likely to default or be downgraded than are public bonds. They do not have worse secondary market performance or higher CDS spreads, nor are they necessarily less liquid. Bond investors appear to discount the value of privately held equity. The effect does not come only from the lack of a public market signal of asset quality, because very small public companies also pay high spreads.
(379 KB, 40 pages)
During the last three decades, the stock of government debt has increased in most developed countries. During the same period, the authors also observe a significant liberalization of international financial markets and an increase in income inequality in several industrialized countries. In this paper they propose a multicountry political economy model with incomplete markets and endogenous government borrowing and show that governments choose higher levels of public debt when financial markets become internationally integrated and inequality increases. The authors also conduct an empirical analysis using OECD data and find that the predictions of the theoretical model are supported by the empirical results.
(708 KB, 53 pages)
This paper incorporates home production into a dynamic general equilibrium model of overlapping generations with endogenous retirement to study Social Security reforms. As such, the model differentiates both consumption goods and labor effort according to their respective roles in home production and market activities. Using a calibrated model, the authors find that eliminating the current pay-as-you-go Social Security system has important implications for both labor supply and consumption decisions and that these decisions are influenced by the presence of a home production technology. Comparing their benchmark economy to one with differentiated goods but no home production, the authors find that eliminating Social Security benefits generates larger welfare gains in the presence of home production. This result is due to the self-insurance aspects generated by the presence of home production. Comparing their economy to a one-good economy without home production, the authors show that the welfare gains of eliminating Social Security are magnified even further. These policy analyses suggest the importance of modeling home production and distinguishing between both time use and consumption goods depending on whether they are involved in market or home production.
(481 KB, 38 pages)
The authors survey Bayesian methods for estimating dynamic stochastic general equilibrium (DSGE) models in this article. They focus on New Keynesian DSGE (NKDSGE) models because of the interest shown in this class of models by economists in academic and policy-making institutions. This interest stems from the ability of this class of DSGE models to transmit real, nominal, and fiscal and monetary policy shocks into endogenous fluctuations at business cycle frequencies. Intuition about these propagation mechanisms is developed by reviewing the structure of a canonical NKDSGE model. Estimation and evaluation of the NKDSGE model rest on being able to detrend its optimality and equilibrium conditions, to construct a linear approximation of the model, to solve for its linear approximate decision rules, and to map from this solution into a state-space model to generate Kalman filter projections. The likelihood of the linear approximate NKDSGE model is based on these projections. The projections and likelihood are useful inputs into the Metropolis-Hastings Markov chain Monte Carlo simulator that the authors employ to produce Bayesian estimates of the NKDSGE model. They discuss an algorithm that implements this simulator. This algorithm involves choosing priors of the NKDSGE model parameters and fixing initial conditions to start the simulator. The output of the simulator is posterior estimates of two NKDSGE models, which are summarized and compared to results in the existing literature. Given the posterior distributions, the NKDSGE models are evaluated with tools that determine which is most favored by the data. The authors also give a short history of DSGE model estimation and point to issues at the frontier of this research.
(269 KB, 30 pages)
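The random walk Metropolis-Hastings simulator that the survey above describes can be sketched on a toy scalar model. The normal prior and likelihood, the proposal scale, and the burn-in length below are illustrative assumptions, not the survey's settings:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: data y ~ N(theta, 1) with a N(0, 10^2) prior on theta.
y = np.array([0.9, 1.1, 1.0, 0.8, 1.2])

def log_post(theta):
    # Log posterior kernel: log prior plus log likelihood, constants dropped.
    return -0.5 * theta**2 / 100.0 - 0.5 * np.sum((y - theta)**2)

def rw_metropolis(n_draws=20000, scale=0.8, theta0=0.0):
    """Random walk Metropolis-Hastings: propose theta' = theta + scale * N(0, 1)
    and accept with probability min(1, p(theta' | y) / p(theta | y))."""
    draws = np.empty(n_draws)
    theta, lp = theta0, log_post(theta0)
    for i in range(n_draws):
        prop = theta + scale * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # symmetric proposal cancels
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws

draws = rw_metropolis()[5000:]   # discard burn-in before summarizing
```

In the DSGE application the survey covers, log_post would instead be evaluated by the Kalman filter applied to the linearized model's state-space form; the accept/reject mechanics are the same.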
Using an estimated dynamic stochastic general equilibrium model, the author shows that shocks to a common international stochastic trend explain on average about 10 percent of the variability of output in several small developed economies. These shocks explain roughly twice as much of the volatility of consumption growth as the volatility of output growth. Country-specific disturbances account for the bulk of the volatility in the data. Substantial heterogeneity in the estimated parameters and stochastic processes translates into a rich array of impulse responses across countries.
(494 KB, 46 pages)
This paper assesses how various approaches to modeling the separation margin affect the quantitative ability of the Mortensen-Pissarides labor matching model. The model with a constant separation rate fails to produce realistic volatility and productivity responsiveness of the separation rate and worker flows. The specification with endogenous separation succeeds along these dimensions. Allowing for on-the-job search enables the model to replicate the Beveridge curve. All specifications, however, fail to generate sufficient volatility of the job finding rate. While adopting the Hagedorn-Manovskii calibration remedies this problem, the volume of job-to-job transitions in the on-the-job search specification becomes essentially zero.
(349 KB, 29 pages)
The authors document that the U.S. non-financial corporate sector became a net lender in the 2000s, using aggregate and firm-level data. They develop a structural model with investment, debt, and equity. Debt is fiscally advantageous but subject to a no-default borrowing constraint. Equity allows the firm to suspend dividends when the cash flow is negative. Firms accumulate financial assets for precautionary reasons, yet value equity as partial insurance against shocks. The calibrated model replicates the prevalence of net savings in the period 2000-2007 and attributes the rise in corporate savings over the past 40 years to lower dividend taxes.
(534 KB, 48 pages)