Residential house price indexes (HPIs) are used for a wide variety of macroeconomic and microeconomic research and policy purposes, as well as for automated valuation models. As is well known, these indexes are subject to substantial revisions in the months following the initial release, both because transaction data can be slow to come in and because the repeat sales methodology interpolates the effect of a sale over the entire period since the house last changed hands. The authors study the properties of the revisions to the CoreLogic House Price Index. This index is used both by researchers and in the Financial Accounts of the United States to compute the value of residential real estate. The authors show that the magnitude of revisions to this index can be significant: At the national level, the standard deviation of monthly revisions to the growth rate of the index is 29% of the standard deviation of the growth rate itself, which is comparable to the corresponding ratio for other macroeconomic series. The revisions are also economically significant and affect measures used by policymakers: Revisions over the first 12 releases of the index reduce estimates of the fraction of borrowers nationwide with negative equity by 4.3%, corresponding to 423,000 households. Lastly, the authors find that revisions are ex ante predictable: Both past revisions and past house price appreciation are negatively correlated with future revisions.
(556 KB, 18 pages)
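The revision-volatility statistic described above is simply the standard deviation of revisions to the growth rate divided by the standard deviation of the growth rate itself. A minimal sketch, using invented numbers (not the CoreLogic data, so the resulting ratio will not match the paper's 29%):

```python
import statistics

# Hypothetical first-release and revised monthly growth rates (percent).
initial_growth = [0.5, 0.3, -0.2, 0.4, 0.1, -0.1, 0.6, 0.2]
final_growth   = [0.4, 0.5, -0.1, 0.3, 0.2, -0.3, 0.5, 0.4]

# Revision = final value minus initial release, month by month.
revisions = [f - i for f, i in zip(final_growth, initial_growth)]

# Ratio of revision volatility to growth-rate volatility.
ratio = statistics.stdev(revisions) / statistics.stdev(final_growth)
print(f"revision volatility ratio: {ratio:.0%}")
```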
This paper develops a dynamic theory of money and banking that explains why banks need to hold an illiquid portfolio to provide socially optimal transaction and liquidity services, opening the door to the possibility of equilibrium banking panics. Following a widespread liquidation of banking assets in the event of a panic, the banking portfolio consistent with the optimal provision of transaction and liquidity services during normal times cannot be quickly reestablished, resulting in an unusual loss of wealth for all depositors. This negative wealth effect stemming from the liquid portion of the consumers' portfolio is strong enough to produce a protracted recession. A key element of the theory is the existence of a dynamic interaction between the ability of banks to offer transaction and liquidity services and the occurrence of panics.
(370 KB, 55 pages)
The financial crisis has generated fundamental reforms in the financial regulatory system in the U.S. and internationally. Much of this reform was in direct response to the weaknesses revealed in the precrisis system. The new “macroprudential” approach to financial regulation focuses on risks arising in financial markets broadly, as well as the potential impact on the financial system that may arise from financial distress at systemically important financial institutions. Systemic risk is the key factor in financial stability, but our current understanding of systemic risk is rather limited. While the goal of using regulation to maintain financial stability is clear, it is not obvious how to design an effective regulatory framework that achieves the financial stability objective while also promoting financial innovation. This paper discusses academic research and expert opinions on this vital subject of financial stability and regulatory reforms. Specifically, among other issues, it discusses the impact of increasing public disclosure of supervisory information, the effectiveness of bank stress testing as a tool to enhance financial stability, whether the financial crisis was caused by too big to fail (TBTF), and whether the Dodd-Frank Wall Street Reform and Consumer Protection Act (DFA) resolution regime would be effective in achieving financial stability and ending TBTF.
(304 KB, 26 pages)
The authors assess the credit market impact of mortgage “strip-down” — reducing the principal of underwater residential mortgages to the current market value of the property for homeowners in Chapter 7 or Chapter 13 bankruptcy. Strip-down of mortgages in bankruptcy was proposed as a means of reducing foreclosures during the recent mortgage crisis but was blocked by lenders. The authors’ goal is to determine whether allowing bankruptcy judges to modify mortgages would have a large adverse impact on new mortgage applicants. Their identification is provided by a series of U.S. Court of Appeals decisions during the late 1980s and early 1990s that introduced mortgage strip-down under both bankruptcy chapters in parts of the U.S., followed by two Supreme Court rulings that abolished it throughout the U.S. The authors find that the Supreme Court decision to abolish mortgage strip-down under Chapter 13 led to a reduction of 3% in mortgage interest rates and an increase of 1% in mortgage approval rates, while the Supreme Court decision to abolish strip-down under Chapter 7 led to a reduction of 2% in approval rates and no change in interest rates. The authors also find that markets react less to circuit court decisions than to Supreme Court decisions. Overall, the authors’ results suggest that lenders respond to forced renegotiation of contracts in bankruptcy, but their responses are small and not always in the predicted direction. The lack of systematic patterns evident in the authors’ results suggests that introducing mortgage strip-down under either bankruptcy chapter would not have strong adverse effects on mortgage loan terms and could be a useful new policy tool to reduce foreclosures when future housing bubbles burst.
(550 KB, 40 pages)
The authors define a class of bias problems that arise when purchasers shift their expenditures among sellers charging different prices for units of precisely defined and interchangeable product items that are nevertheless regarded as different for the purposes of price measurement. For business-to-business transactions, these shifts can cause sourcing substitution bias in the Producer Price Index (PPI) and the Import Price Index (MPI), as well as potentially in the proposed new true Input Price Index (IPI). Similarly, when consumers shift their expenditures for the same products temporally to take advantage of promotional sales or among retailers charging different per unit prices, this can cause a promotions bias problem in the Consumer Price Index (CPI) or a CPI outlet substitution bias. The authors recommend alternatives to conventional price indexes that make use of unit values over precisely defined and interchangeable product items. They argue that their proposed ideal target indexes could greatly reduce these biases and make use of increasingly available electronic scanner data on prices and quantities. The authors also address the challenges national statistics agencies must surmount to produce price index measures more like the specified target ones.
(552 KB, 56 pages)
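A unit value over interchangeable items is total expenditure divided by total quantity across sellers, so a unit-value relative registers shifts toward cheaper sellers that a conventional matched-model index (which averages each seller's own price change) would miss. A minimal sketch, with all numbers hypothetical:

```python
# One precisely defined, interchangeable item sold by two sellers
# at different prices in two periods. (price, quantity) per seller.
period0 = [(10.0, 100), (8.0, 50)]
period1 = [(10.0, 40),  (8.0, 120)]   # buyers shift to the cheaper seller

def unit_value(transactions):
    """Total expenditure divided by total quantity across sellers."""
    spend = sum(p * q for p, q in transactions)
    qty = sum(q for _, q in transactions)
    return spend / qty

relative = unit_value(period1) / unit_value(period0)
# Each seller's posted price is unchanged, so a matched-model index is
# flat, yet the unit value falls because expenditure shifted to the
# lower-priced seller.
print(f"unit-value relative: {relative:.3f}")
```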
The authors build a structural model of Chapter 13 bankruptcy that captures salient features of personal bankruptcy under this chapter. The authors estimate their model using a novel data set they construct from bankruptcy court dockets recorded in Delaware between 2001 and 2002. The authors’ estimation results highlight the importance of the debtor’s choice of repayment plan length for Chapter 13 outcomes under the restrictions imposed by the bankruptcy law. The authors use the estimated model to conduct policy experiments to evaluate the impact of more stringent provisions of Chapter 13 that impose additional restrictions on the length of repayment plans. The authors find that these provisions would not materially affect creditor recovery rates and would not necessarily make discharge more likely for debtors with income above the state median income.
Supersedes Working Paper 07-31.
(967 KB, 40 pages)
The share of high-skilled workers in U.S. cities is positively correlated with city size, and this correlation strengthened between 1980 and 2010. Furthermore, during the same time period, the U.S. economy experienced a significant structural transformation with regard to industrial composition, most notably in the decline of manufacturing and the rise of high-skilled service industries. To decompose and investigate these trends, this paper develops and estimates a spatial equilibrium model with heterogeneous firms and workers that allows for both industry-specific and skill-specific technology changes across cities. The estimates imply that both supply and demand of high-skilled labor have increased over time in big cities. In addition, demand for skilled labor in large cities has increased somewhat within all industries. However, this aggregate increase in skill demand in cities is highly concentrated in a few industries. The finance, insurance, and real estate sectors alone account for 35 percent of the net change over time.
(663 KB, 40 pages)
The authors ask two questions related to how access to credit affects the nature of business cycles. First, does the standard theory of unsecured credit account for the high volatility and procyclicality of credit and the high volatility and countercyclicality of bankruptcy filings found in U.S. data? Yes, it does, but only if recessions are explicitly modeled as displaying countercyclical earnings risk (i.e., rather than having all households fare slightly worse than normal during recessions, more households than normal fare very poorly). Second, does access to credit smooth aggregate consumption or aggregate hours worked, and if so, does it matter with respect to the nature of business cycles? No, it does not; in fact, consumption is 20 percent more volatile when credit is available. Interest rate premia increase in recessions because higher bankruptcy risk discourages households from using credit. This finding contradicts the intuition that access to credit helps households smooth their consumption.
(531 KB, 44 pages)
The extent and direction of causation between micro volatility and business cycles are debated. The authors examine, empirically and theoretically, the source and effects of fluctuations in the dispersion of producer-level sales and production over the business cycle. On the theoretical side, the authors study the effect of exogenous first- and second-moment shocks to producer-level productivity in a two-country DSGE model with heterogeneous producers and an endogenous dynamic export participation decision. First-moment shocks cause endogenous fluctuations in producer-level dispersion by reallocating production internationally, while second-moment shocks lead to increases in trade relative to GDP in recessions. Empirically, using detailed product-level data in the motor vehicle industry and industry-level data of U.S. manufacturers, the authors find evidence that international reallocation is indeed important for understanding cross-industry variation in cyclical patterns of measured dispersion.
(490 KB, 43 pages)
The authors use a structural dynamic stochastic general equilibrium model to investigate how initial data releases of key macroeconomic aggregates are related to final revised versions and how identified aggregate shocks influence data revisions. The analysis sheds light on how well preliminary data approximate final data and on how policymakers might condition their view of the preliminary data when formulating policy actions. The results suggest that monetary policy shocks and multifactor productivity shocks lead to predictable revisions to the initial release data on output growth and inflation.
(590 KB, 38 pages)
This paper examines how instances of identity theft that are sufficiently severe to induce consumers to place an extended fraud alert in their credit reports affect their risk scores, delinquencies, and other credit bureau variables on impact and thereafter. We show that for many consumers these effects are relatively small and transitory. However, for a significant number of consumers, especially those with lower risk scores prior to the event, there are more persistent and generally positive effects on credit bureau variables, including risk scores. We argue that these positive changes for subprime consumers are consistent with the effect of increased salience of credit file information to the consumer at the time of the identity theft.
(625 KB, 48 pages)
Reverse mortgage loans (RMLs) allow older homeowners to borrow against housing wealth without moving. Despite growth in this market, only 2.1% of eligible homeowners had RMLs in 2011. In this paper, the authors analyze reverse mortgages in a calibrated life-cycle model of retirement. The average welfare gain from RMLs is $885 per homeowner. The authors’ model implies that low-income, low-wealth, and poor-health households benefit the most, consistent with empirical evidence. Bequest motives, nursing-home-move risk, house price risk, and loan costs all contribute to the low take-up. The Great Recession may lead to increased RML demand, by up to 30% for the lowest-income and oldest households.
Supersedes Working Paper 13-27.
(567 KB, 40 pages)
This paper reviews academic research on the connections between agglomeration and innovation. The authors first describe the conceptual distinctions between invention and innovation. They then discuss how these factors are frequently measured in the data and note some resulting empirical regularities. Innovative activity tends to be more concentrated than industrial activity, and the authors discuss important findings from the literature about why this is so. The authors highlight the traits of cities (e.g., size, industrial diversity) that theoretical and empirical work link to innovation, and they discuss factors that help sustain these features (e.g., the localization of entrepreneurial finance).
(967 KB, 62 pages)
This paper uses a unique data set to shed new light on credit availability to consumer bankruptcy filers. In particular, the authors’ data allow them to distinguish between Chapter 7 and Chapter 13 bankruptcy filings, to observe changes in credit demand and credit supply explicitly, and to differentiate existing and new credit accounts. The paper has four main findings. First, despite speedy recovery in their risk scores after bankruptcy filing, most filers have much reduced access to credit in terms of credit limits, and the impact seems to be long lasting (well beyond the discharge date). Second, the reduction in credit access stems mainly from the supply side as consumer inquiries recover significantly after the filing, while credit limits remain low. Third, new lenders do not treat Chapter 13 filers more favorably than Chapter 7 filers. In fact, Chapter 13 filers are much less likely to receive new credit cards than Chapter 7 filers even after controlling for borrower characteristics and local economic environment. Finally, the authors find that Chapter 13 filers overall end up with a slightly larger credit limit amount than Chapter 7 filers (both after the filing and after discharge) because they are able to maintain more of their old credit from before bankruptcy filing. The authors’ results cast doubt on the effectiveness of the current bankruptcy system in providing relief to bankruptcy filers and especially its recent push to get debtors into Chapter 13.
Supersedes Working Paper 13-24.
(611 KB, 49 pages)
The authors study the impact that the liquidity crunch of 2008-2009 had on the U.S. economy’s growth trend. To this end, the authors propose a model featuring endogenous productivity à la Romer and a liquidity friction à la Kiyotaki-Moore. A key finding of the study is that liquidity declined around Lehman Brothers’ demise, which led to the severe contraction in the economy. This liquidity shock was a tail event. Improving conditions in financial markets were crucial in the subsequent recovery. Had conditions remained at their worst 2008 level, output would have been 20 percent below its actual level in 2011. The authors show that a subsidy to entrepreneurs would have gone a long way toward averting the crisis.
(815 KB, 56 pages)
During the housing crisis, it came to be recognized that inflated home mortgage appraisals were widespread during the subprime boom. The New York State Attorney General’s office investigated this issue with respect to one particular lender and Fannie Mae and Freddie Mac. The investigation resulted in an agreement between the Attorney General’s office, the government-sponsored enterprises (GSEs), and the Federal Housing Finance Agency (the GSEs’ federal regulator) in 2008, in which the GSEs agreed to adopt the Home Valuation Code of Conduct (HVCC). Using unique data sets that contain both approved and nonapproved mortgage applications, this study provides an empirical examination of the impact of the HVCC on appraisal and mortgage outcomes. The results suggest that the HVCC has reduced the probability of inflated valuations and induced a significant increase in low appraisals. The HVCC also made it more difficult to obtain mortgages in the aftermath of the financial crisis.
(636 KB, 34 pages)
The surge in fiscal deficits since 2008 has put a renewed focus on our understanding of fiscal policy. The interaction of fiscal and monetary policy during this period has also been the subject of much discussion and analysis. This paper gives new insight into past fiscal policy and its influence on monetary policy by examining the U.S. Federal Reserve Board staff’s Greenbook forecasts of fiscal policy. The authors create a real-time database of the Greenbook forecasts of fiscal policy, examine the forecast performance in terms of bias and efficiency, and explore the implications for the interaction of fiscal policy and monetary policy. The authors also attempt to provide advice for fiscal policy by showing how policymakers learn over time about the trajectory of the U.S. federal government’s fiscal balance as well as the changing roles of structural and cyclical factors.
(2.71 MB, 53 pages)
Practically all industrialized economies restrict the length of time that credit bureaus can retain borrowers’ negative credit information. There is, however, a large variation in the permitted retention times across countries. By exploiting a quasi-experimental variation in this retention time, the authors investigate what happens when negative information is deleted earlier from credit files. The authors find that the loss of information led banks to tighten their lending standards significantly as the expected retention time fell from an average of three and a half years to exactly three years. Simultaneously, they find that borrowers who experience this shorter retention time default more frequently. Since borrowers nevertheless obtain more net access to credit and total defaults do not increase overall, the authors cannot rule out that this reduction in retention time is optimal.
(1.3 MB, 52 pages)
An important component of the American Recovery and Reinvestment Act’s (ARRA’s) $796 billion proposed stimulus budget was $318 billion in fiscal assistance to state and local governments, yet there are no precise estimates of the impact of such assistance on the macroeconomy. In evaluating ARRA, both the Council of Economic Advisors (CEA) and the Congressional Budget Office (CBO) used instead the impacts of direct federal spending and tax relief. These estimates miss the role of states as agents. The authors provide estimates of aid’s multiplier effects allowing explicitly for state behavior, first from an SVAR analysis separating federal aid from federal tax relief, second from a narrative analysis using the political record for unanticipated federal aid programs, and third from constructed macroeconomic estimates implied by an estimated model of state governments’ fiscal choices. The authors reach three conclusions. First, federal transfers to state and local governments are less stimulative than transfers to households and firms. Second, federal aid for welfare spending is more stimulative than is general purpose aid. Third, an estimated model of state government fiscal behavior provides a microeconomic foundation for the observed macroeconomic impacts of aid.
(1.4 MB, 46 pages)
American politics have become extremely polarized in recent decades. This deep political divide has caused significant government dysfunction. Political divisions make the timing, size, and composition of government policy less predictable. According to existing theories, an increase in the degree of economic policy uncertainty or in the volatility of fiscal shocks results in a decline in economic activity. This occurs because businesses and households may be induced to delay decisions that involve high reversibility costs. In addition, disagreement between policymakers may result in stalemate, or, in extreme cases, a government shutdown. This adversely affects the optimal implementation of policy reforms and may result in excessive debt accumulation or inefficient public-sector responses to adverse shocks. Testing these theories has been challenging given the low frequency at which existing measures of partisan conflict have been computed. In this paper, the author provides a novel high-frequency indicator of the degree of partisan conflict. The index, constructed for the period 1891 to 2013, uses a search-based approach that measures the frequency of newspaper articles that report lawmakers’ disagreement about policy. The author shows that the long-run trend of partisan conflict behaves similarly to political polarization and income inequality, especially since the Great Depression. Its short-run fluctuations are highly related to elections, but unrelated to recessions. The lower-than-average values observed during wars suggest a “rally around the flag” effect. The author uses the index to study the effect of an increase in partisan conflict, equivalent to the one observed since the Great Recession, on business cycles. Using a simple VAR, the author finds that an innovation to partisan conflict increases government deficits and significantly discourages investment, output, and employment. Moreover, these declines are persistent, which may help explain the slow recovery observed since the 2007 recession ended.
(10.4 MB, 39 pages)
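A search-based index of the kind described above counts, for each period, the share of newspaper articles matching disagreement-related terms. A minimal sketch with an invented term list and invented articles (the paper's actual construction is more involved):

```python
import re

# Hypothetical disagreement-related search terms, case-insensitive.
PATTERN = re.compile(r"\b(gridlock|standoff|veto|filibuster|stalemate)\b", re.I)

def partisan_conflict_index(articles):
    """Share of articles in a period matching the search terms, times 100."""
    hits = sum(1 for text in articles if PATTERN.search(text))
    return 100.0 * hits / len(articles)

sample = [
    "Budget talks end in stalemate as parties trade blame.",
    "Local team wins championship after dramatic final.",
    "Senators threaten filibuster over spending bill.",
    "New bridge opens ahead of schedule.",
]
print(partisan_conflict_index(sample))  # 2 of 4 articles match -> 50.0
```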
The authors are the first to show that the cost of personal bankruptcy filers traveling to their bankruptcy trustees affects bankruptcy choices. The authors use detailed balance sheet, income statement, and location data from 400,000 Canadian bankruptcies. To control for endogenous trustee selection, the authors use the location of local government offices as an instrument for the location of bankruptcy trustees (filers interact with trustees, and trustees interact with local government offices, but filers do not interact with the local government directly). The authors find that increased travel costs reduce the number of filings. Furthermore, for those individuals who do file, the authors find that their increased travel costs need to be compensated by increased financial benefits of bankruptcy. Filers without cars (higher travel costs), as well as those with jobs (higher opportunity costs), receive larger per-kilometer financial benefits from bankruptcy.
(736 KB, 48 pages)
The authors are the first to examine whether exogenous shocks cause personal bankruptcy through the balance sheet channel and/or the income statement channel. For identification, they examine the effect of exogenous, politically motivated government payments on 200,000 Canadian bankruptcy filings. The authors find support for the balance sheet channel, in that receipt of the exogenous cash increases the net balance sheet benefits of bankruptcy (unsecured debt discharged minus liquidated assets forgone) required by filers. The authors also find limited support for the income statement channel, in that exogenous payments reduce bankruptcy filings from individuals whose current expenses exceed their current income.
(582 KB, 42 pages)
There have been increasing concerns that acquisitions of community banks by larger banks, together with the declining number of community banks, could significantly reduce small business lending (SBL) and disrupt relationship lending. This paper examines the roles and characteristics of U.S. community banks in the past decade, covering the recent economic boom and downturn. The authors analyze risk characteristics (including the confidential ratings assigned by bank regulators) of acquired community banks, compare pre- and post-acquisition performance and stock market reactions to these acquisitions, and investigate how the acquisitions have affected SBL. Contrary to concerns, the authors find that the overall amount of SBL tends to increase after a large bank acquires a community bank. The ratio of SBL to assets does decline in the large acquiring banks but at a slower rate than the decline seen in surviving community banks. Further, community banks that were merged during the financial crisis were mostly in poor financial condition, had been rated as unsatisfactory by their regulators on all risk aspects, and would have been unlikely to continue lending. The authors find that community bank targets accepted smaller merger premiums (or even discounts) to be part of a large banking organization. Their results indicate that mergers involving community bank targets over the past decade have enhanced the overall safety and soundness of the banking system without adversely impacting SBL. This implies that a policy that discourages mergers between community banks and large banks is unwarranted and could potentially result in a weaker financial system and have an unintentional dampening effect on the supply of SBL.
(1.08 MB, 40 pages)
In a market in which sellers compete by posting mechanisms, the authors allow for a general meeting technology and show that its properties crucially affect the mechanism that sellers select in equilibrium. In general, it is optimal for sellers to post an auction without a reserve price but with a fee, paid by all buyers who meet with the seller. However, the authors define a novel condition on meeting technologies, which they call invariance, and show that meeting fees are equal to zero if and only if this condition is satisfied. Finally, the authors discuss how invariance is related to other properties of meeting technologies identified in the literature.
(459 KB, 21 pages)
The authors build a micro-founded two-country dynamic general equilibrium model in which trade responds more to a cut in tariffs in the long run than in the short run. The model introduces a time element to the fixed-variable cost trade-off in a heterogeneous producer trade model. Thus, the dynamics of aggregate trade adjustment arise from producer-level decisions to invest in lowering their future variable export costs. The model is calibrated to match salient features of new exporter growth and provides a new estimate of the exporting technology. At the micro level, the authors find that new exporters commonly incur substantial losses in the first three years in the export market and that export profits are backloaded. At the macro level, the slow export expansion at the producer level leads to sluggishness in the aggregate response of exports to a change in tariffs, with a long-run trade elasticity that is 2.9 times the short-run trade elasticity. The authors estimate the welfare gains from trade from a cut in tariffs, taking into account the transition period. While the intensity of trade expands slowly, consumption overshoots its new steady-state level, so the welfare gains are almost 15 times larger than the long-run change in consumption. Models without this dynamic export decision underestimate the gains to lowering tariffs, particularly when constrained to also match the gradual expansion of aggregate trade flows.
(848 KB, 48 pages)
The authors develop a model of banking industry dynamics to study the quantitative impact of capital requirements on bank risk taking, commercial bank failure, and market structure. They propose a market structure where big, dominant banks interact with small, competitive fringe banks. Banks accumulate securities like Treasury bills and undertake short-term borrowing when there are cash flow shortfalls. A nontrivial size distribution of banks arises out of endogenous entry and exit, as well as banks’ buffer stocks of securities. The authors test the model using business cycle properties and the bank lending channel across banks of different sizes studied by Kashyap and Stein (2000). They find that a rise in capital requirements from 4% to 6% leads to a substantial reduction in exit rates of small banks and a more concentrated industry. Aggregate loan supply falls and interest rates rise by 50 basis points. The lower exit rate causes the tax/output rate necessary to fund deposit insurance to drop in half. Higher interest rates, however, induce higher loan delinquencies as well as a lower level of intermediated output.
(644 KB, 58 pages)
The authors propose a theory of endogenous firm-level volatility over the business cycle based on endogenous market exposure. Firms that reach a larger number of markets diversify market-specific demand risk at a cost. The model is driven only by total factor productivity shocks and captures the business cycle properties of firm-level volatility. Using a panel of U.S. firms (Compustat), the authors empirically document the countercyclical nature of firm-level volatility. They then match this panel to Compustat's Segment data and the U.S. Census's Longitudinal Business Database (LBD) to show that, consistent with their model, measures of market reach are procyclical, and the countercyclicality of firm-level volatility is driven mostly by those firms that adjust the number of markets to which they are exposed. This finding is explained by the negative elasticity between various measures of market exposure and firm-level idiosyncratic volatility the authors uncover using Compustat, the LBD, and the Kauffman Firm Survey.
(634 KB, 58 pages)
The aim of this paper is to quantify the role of formal-sector institutions in shaping the demand for human capital and the level of informality. The authors propose a firm dynamics model where firms face capital market imperfections and costs of operating in the formal sector. Formal firms have a larger set of production opportunities and the ability to employ skilled workers, but informal firms can avoid the costs of formalization. These firm-level distortions give rise to endogenous formal and informal sectors and, more importantly, affect the demand for skilled workers. The model predicts that countries with a low degree of debt enforcement and high costs of formalization are characterized by relatively lower stocks of skilled workers, larger informal sectors, low allocative efficiency, and low measured TFP. Moreover, the authors find that the interaction between entry costs and financial frictions (as opposed to the sum of their individual effects) is the main driver of these differences. This complementarity effect derives from the introduction of skilled workers, which prevents firms from substituting labor for capital and in turn moves them closer to the financial constraint.
(735 KB, 52 pages)
Credit card portfolios represent a significant component of the balance sheets of the largest U.S. banks. The charge-off rate in this asset class increased drastically during the Great Recession. The recent economic downturn offers a unique opportunity to analyze the performance of credit risk models applied to credit card portfolios under conditions of economic stress. Specifically, the authors evaluate three potential sources of model risk: model specification, sample selection, and stress scenario selection. Their analysis indicates that model specifications that incorporate interactions between policy variables and core account characteristics generate the most accurate loss projections across risk segments. Models estimated over a time frame that includes a significant economic downturn are able to project levels of credit loss consistent with those experienced during the Great Recession. Models estimated over a time frame that does not include a significant economic downturn can severely underpredict credit loss in some cases, and the level of forecast error can be significantly affected by model specification assumptions. Higher credit-score segments of the portfolio are proportionally more severely affected by downturn economic conditions and model specification assumptions. The selection of the stress scenario can have a dramatic impact on projected loss.
(768 KB, 38 pages)
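The interaction idea in this abstract — letting a macro policy variable's effect on charge-offs differ by a core account characteristic — can be sketched in a few lines. The variables, coefficients, and data below are hypothetical illustrations, not the authors' models or data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400

# Hypothetical account-level data: a macro policy variable (unemployment
# rate, percent) and a core account characteristic (1 = subprime segment).
unemp = rng.uniform(4.0, 10.0, n)
subprime = rng.integers(0, 2, n).astype(float)

# Assumed loss process: the macro sensitivity is larger for the subprime
# segment, which is exactly what an interaction term is meant to capture.
charge_off = (0.5 + 0.2 * unemp + 1.0 * subprime
              + 0.4 * unemp * subprime + 0.3 * rng.standard_normal(n))

# Specification with a macro-by-characteristic interaction, fit by OLS.
X = np.column_stack([np.ones(n), unemp, subprime, unemp * subprime])
beta, *_ = np.linalg.lstsq(X, charge_off, rcond=None)

# Stressed projections (unemployment at 10 percent) differ across segments
# only because the interaction term allows segment-specific sensitivity.
stress_prime = beta[0] + beta[1] * 10.0
stress_subprime = stress_prime + beta[2] + beta[3] * 10.0
```

Without the interaction column, both segments would be assigned the same macro sensitivity, understating stressed losses in the more sensitive segment.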
The authors develop a model of a two-sided asset market in which trades are intermediated by dealers and are bilateral. Dealers compete to attract order flow by posting the terms at which they execute trades, which can include prices, quantities, and execution times, and investors direct their orders toward dealers that offer the most attractive terms of trade. Equilibrium outcomes have the following properties. First, investors face a trade-off between trading costs and speeds of execution. Second, the asset market is endogenously segmented in the sense that investors with different asset valuations and different asset holdings will trade at different speeds and different costs. For example, under a Leontief technology to match investors and dealers, per unit trading costs decrease with the size of the trade, in accordance with the evidence from the market for corporate bonds. Third, dealers' implicit bargaining powers are endogenous and typically vary across sub-markets. Finally, the authors obtain a rich set of comparative statics both analytically, by studying a limiting economy where trading frictions are small, and numerically. For instance, the authors find that the relationship between trading costs and dealers' bargaining power can be hump-shaped.
(1 MB, 50 pages)
The deep housing market recession from 2008 through 2010 was characterized by a steep increase in the number of foreclosures. Foreclosure timelines — the length of time between initial mortgage delinquency and completion of foreclosure — also lengthened significantly, with averages reaching three years in some states. Most individuals undergoing foreclosure are experiencing serious financial stress. However, extended foreclosure timelines enable mortgage defaulters to live in their homes without making housing payments until the completion of the foreclosure process, thus providing a liquidity benefit. This paper tests whether the resulting liquidity was used to help cure nonmortgage credit delinquency. The authors find a significant relationship between longer foreclosure timelines and household performance on nonmortgage consumer credit during and after the foreclosure process. Their results indicate that a longer period of nonpayment of housing-related expenses results in higher cure rates on delinquent nonmortgage debts and improved household balance sheets. Foreclosure delay may have mitigated the impact of the economic downturn on credit card default. However, credit card performance may deteriorate in the future as the current foreclosure backlog is cleared and the affected households once again incur housing expenses.
(553 KB, 32 pages)
In the U.S., third-party debt collection agencies employ more than 140,000 people and recover more than $50 billion each year, mostly from consumers. Informational, legal, and other factors suggest that original creditors should have an advantage in collecting debts owed to them. Then, why does the debt collection industry exist and why is it so large? Explanations based on economies of scale or specialization cannot address many of the observed stylized facts. The authors develop an application of common agency theory that better explains those facts. The model explains how reliance on an unconcentrated industry of third-party debt collection agencies can support an equilibrium with more intense collections activity than creditors would undertake on their own. The authors derive empirical implications for the nature of the debt collection market and the structure of the debt collection industry. A welfare analysis shows that, under certain conditions, an equilibrium in which creditors rely on third-party debt collectors can generate more credit supply and aggregate borrower surplus than an equilibrium where lenders collect debts owed to them on their own. There are, however, situations where the opposite is true. The model also suggests a number of policy instruments that may improve the functioning of the collections market.
(591 KB, 46 pages)
The authors prove that the standard quasi-geometric discounting model used in dynamic consumer theory and political economy does not possess continuous Markov perfect equilibria (MPE) if there is a strictly positive lower bound on wealth. The authors also show that, at points of discontinuity, the decision maker strictly prefers lotteries over the next period's assets. The authors then extend the standard model to have lotteries and establish the existence of an MPE with continuous decision rules. The models with and without lotteries are numerically compared, and it is shown that the model with lotteries behaves more in accord with economic intuition.
(637 KB, 53 pages)
The authors replicate the main results of Rudebusch and Williams (2009), who show that the use of the yield spread in a probit model can predict recessions better than the Survey of Professional Forecasters. Croushore and Marsten investigate the robustness of these results in several ways: extending the sample to include the 2007-09 recession, changing the starting date of the sample, changing the ending date of the sample, using rolling windows of data instead of just an expanding sample, and using alternative measures of the "actual" value of real output. The results show that the Rudebusch-Williams findings are robust along all of these dimensions.
(1.9 MB, 23 pages)
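The replicated setup — a probit of a recession indicator on the yield spread — can be sketched as follows. The observations below are made-up illustrative data, not the series used by Rudebusch and Williams or by Croushore and Marsten, and the hand-rolled gradient ascent is a stand-in for a standard probit routine:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def fit_probit(x, y, lr=0.05, iters=5000):
    """Fit P(y=1|x) = Phi(b0 + b1*x) by gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            z = b0 + b1 * xi
            p = min(max(norm_cdf(z), 1e-9), 1.0 - 1e-9)
            w = norm_pdf(z) * (yi - p) / (p * (1.0 - p))  # probit score weight
            g0 += w
            g1 += w * xi
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Made-up observations: term spread (percentage points) and an indicator
# for a recession occurring within the following year.
spread = [-1.0, -0.5, 0.0, 0.3, 0.4, 0.8, 1.2, 1.8, 2.5, 0.1, 0.6, 1.5]
recess = [1, 1, 1, 1, 0, 1, 0, 0, 0, 1, 0, 0]

b0, b1 = fit_probit(spread, recess)
p_inverted = norm_cdf(b0 + b1 * (-0.5))  # fitted probability, inverted curve
p_steep = norm_cdf(b0 + b1 * 2.0)        # fitted probability, steep curve
```

With data like these, the spread coefficient is negative: a flat or inverted curve pushes the fitted recession probability up, a steep curve pushes it down.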
In high-dimensional factor models, both the factor loadings and the number of factors may change over time. This paper proposes a shrinkage estimator that detects and disentangles these instabilities. The new method simultaneously and consistently estimates the number of pre- and post-break factors, which liberates researchers from sequential testing and achieves uniform control of the family-wise model selection errors over an increasing number of variables. The shrinkage estimator only requires the calculation of principal components and the solution of a convex optimization problem, which makes its computation efficient and accurate. The finite sample performance of the new method is investigated in Monte Carlo simulations. In an empirical application, the authors study the change in factor loadings and emergence of new factors during the Great Recession.
(815 KB, 85 pages)
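As a stylized illustration of the principal-components ingredient only (not the paper's shrinkage estimator), the number of factors in a simulated panel can be read off the eigenvalues of the sample second-moment matrix. The eigenvalue-ratio rule of Ahn and Horenstein (2013) is used here purely as a simple stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, r_true = 200, 100, 3

# Simulated high-dimensional panel X = F L' + noise with r_true factors.
F = rng.standard_normal((T, r_true))
L = rng.standard_normal((N, r_true))
X = F @ L.T + 0.5 * rng.standard_normal((T, N))

# Principal components: eigenvalues of the scaled sample second-moment
# matrix, sorted in descending order.
eigvals = np.linalg.eigvalsh(X.T @ X / (N * T))[::-1]

# Eigenvalue-ratio rule: the estimated number of factors is the k at
# which the ratio of consecutive eigenvalues peaks.
kmax = 10
ratios = eigvals[:kmax] / eigvals[1:kmax + 1]
k_hat = int(np.argmax(ratios)) + 1
```

Factor eigenvalues grow with N and T while noise eigenvalues stay bounded, so the ratio spikes exactly at the true number of factors; a break in the loadings would blur this separation, which is the instability the paper's estimator is built to detect.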
Using data from the Survey of Income and Program Participation (SIPP) covering 1990-2011, the authors document that a surprisingly large number of workers return to their previous employer after a jobless spell and experience more favorable labor market outcomes than job switchers. Over 40% of all workers separating into unemployment regain employment at their previous employer; over a fifth of them are permanently separated workers who did not have any expectation of recall, unlike those on temporary layoff. Recalls are associated with much shorter unemployment duration and better wage changes. Negative duration dependence of unemployment nearly disappears once recalls are excluded. The authors also find that the probability of finding a new job is more procyclical and volatile than the probability of a recall. Incorporating this fact into an empirical matching function significantly alters its estimated elasticity and the time-series behavior of matching efficiency, especially during the Great Recession. The authors develop a canonical search-and-matching model with a recall option where new matches are mediated by a matching function, while recalls are free and triggered by both aggregate and job-specific shocks. The recall option is lost when the unemployed worker accepts a new job. A quantitative version of the model captures well the authors' cross-sectional and cyclical facts through selection of recalled matches.
(683 KB, 56 pages)
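The matching-function point can be illustrated with a small simulation: if recalls are, by assumption here, insensitive to market tightness, lumping them in with new hires attenuates the estimated elasticity. Everything below — the Cobb-Douglas form, the 0.5 elasticity, the acyclical recall rate — is a hypothetical sketch, not the authors' estimation:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 120

# Hypothetical quarterly series: log labor-market tightness v/u.
log_theta = rng.uniform(-1.0, 1.0, T)

# New-hire job finding follows a Cobb-Douglas matching function with
# elasticity 0.5 with respect to tightness; recalls are assumed acyclical.
log_f_new = -1.0 + 0.5 * log_theta + 0.05 * rng.standard_normal(T)
f_total = np.exp(log_f_new) + 0.10  # total finding rate includes recalls

def tightness_elasticity(log_f):
    """OLS slope of a log job-finding rate on log tightness."""
    X = np.column_stack([np.ones(T), log_theta])
    b, *_ = np.linalg.lstsq(X, log_f, rcond=None)
    return b[1]

eta_new = tightness_elasticity(log_f_new)          # close to 0.5
eta_total = tightness_elasticity(np.log(f_total))  # attenuated by recalls
```

Because the recall component does not move with tightness, the elasticity estimated from total job finding is pulled toward zero relative to the new-hire elasticity, consistent with the abstract's claim that excluding recalls alters the estimated matching function.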
A monetary authority can be committed to pursuing an inflation, price-level, or nominal output target yet systematically fail to achieve the specified goal. Constrained by the zero lower bound on the policy rate, the monetary authority is unable to implement its objectives when private-sector expectations stray from the target in the first place. Low-inflation expectations become self-fulfilling, resulting in an additional Markov equilibrium in which both nominal and real variables are typically below target. Introducing a stabilization goal for long-term nominal rates anchors private-sector expectations on a unique Markov equilibrium without fully compromising the policy responses to shocks.
(551 KB, 37 pages)
The author constructs a life-cycle model with equilibrium default and preferences featuring temptation and self-control. On positive questions, such as the causes of the observed rise in debt and bankruptcies and the macroeconomic implications of the 2005 bankruptcy reform, the model gives answers quantitatively similar to those of the standard model without temptation. However, the temptation model yields contrasting welfare implications because of overborrowing when the borrowing constraint is relaxed. Specifically, the 2005 bankruptcy reform has an overall negative welfare effect according to the temptation model, while the effect is positive in the no-temptation model. As for the optimal default punishment, the welfare of agents without temptation is maximized when defaulting results in severe punishment, which provides a strong commitment to repaying and thus a lower default premium. On the other hand, the welfare of agents with temptation is maximized when weak punishment leads to a tight borrowing constraint, which provides a commitment against overborrowing.
(612 KB, 40 pages)