Working Papers 2019: Abstracts

Please Note: If the title of a paper is highlighted, you can access the full text of that paper by clicking on the highlighted area. Full-text files are in PDF format; to view them, you must have Adobe Reader.

19-27: Pre-event Trends in the Panel Event-Study Design by Simon Freyaldenhoven, Christian Hansen, and Jesse M. Shapiro

We consider a linear panel event-study design in which unobserved confounds may be related both to the outcome and to the policy variable of interest. We provide sufficient conditions to identify the causal effect of the policy by exploiting covariates related to the policy only through the confounds. Our model implies a set of moment equations that are linear in parameters. The effect of the policy can be estimated by 2SLS, and causal inference is valid even when endogeneity leads to pre-event trends ("pre-trends") in the outcome. Alternative approaches perform poorly in our simulations.
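The 2SLS logic the abstract invokes can be illustrated with a generic sketch. This is not the paper's estimator or its moment conditions — just the standard two-stage least squares mechanics on simulated data, with made-up coefficients, showing how a valid instrument recovers a causal effect that OLS misses when an unobserved confound is present:

```python
import numpy as np

def tsls(y, X, Z):
    """Generic two-stage least squares: instrument the columns of X with Z.
    Stage 1 projects X onto the instrument space; stage 2 regresses y on
    the fitted values."""
    Xhat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)    # first-stage fitted values
    return np.linalg.solve(Xhat.T @ X, Xhat.T @ y)  # second-stage coefficients

# Simulated example: x is endogenous (correlated with the confound u),
# z is a valid instrument (related to y only through x). True effect is 2.
rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=(n, 1))
u = rng.normal(size=(n, 1))                       # unobserved confound
x = z + u + rng.normal(size=(n, 1))
y = 2.0 * x + 3.0 * u + rng.normal(size=(n, 1))
Z = np.hstack([np.ones((n, 1)), z])               # instruments (with constant)
X = np.hstack([np.ones((n, 1)), x])               # regressors (with constant)
b = tsls(y, X, Z)                                 # b[1, 0] is close to 2
```

On the same data, plain OLS of y on x is biased upward by the confound u; the instrument restores consistency, which parallels the paper's use of covariates related to the policy only through the confounds.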
PDF (2.0 MB, 54 pages)

19-26: Should Central Banks Issue Digital Currency? by Todd Keister and Daniel Sanches

We study how the introduction of a central bank-issued digital currency affects interest rates, the level of economic activity, and welfare in an environment where both central bank money and private bank deposits are used in exchange. Banks in our model are financially constrained, and the liquidity premium on bank deposits affects the level of aggregate investment. We study the optimal design of a digital currency in this setting, including whether it should pay interest and how widely it should circulate. We highlight an important policy tradeoff: while a digital currency tends to promote efficiency in exchange, it can also crowd out bank deposits, raise banks' funding costs, and decrease investment. Despite these effects, introducing a central bank digital currency often raises welfare.


PDF (385.0 KB, 32 pages)

19-25: Financial Characteristics of Cost of Funds Indexed Loans by Patrick Greenfield and Arden Hall

Two recent articles by Hancock and Passmore (2016) and Passmore and von Hafften (2017) make several suggestions for improving the home mortgage contract to make homeownership more achievable for creditworthy borrowers. Though the proposals in the two papers differ in some respects, one common feature is an adjustable rate indexed to a cost of funds (COF) measure. Such indices are based on interest expense as a fraction of liability balance for one or a group of depository institutions. One of these, the 11th District COF Index, was in wide use in the 1980s and ’90s, but its use has fallen off since then. COF indices have the advantage of being less volatile than market-based indices such as the 1-year U.S. Treasury rate, so borrowers are not exposed to rapid payment increases in a rising rate environment. We analyze COF-indexed ARMs from the point of view of the lender. First, we develop a methodology for constructing a liability portfolio that closely tracks the specific COF index proposed by Hancock and Passmore (2016) and Passmore and von Hafften (2017). We then explore the financial characteristics of this liability portfolio. We show that the liability portfolio, and by implication the mortgages it would fund, shares a characteristic of fixed-rate mortgages: Values can vary significantly from par if rates change. This creates two problems for lenders. First, pricing of COF-indexed ARMs is difficult because it depends not only on current interest rates but also on interest rates when principal is repaid, whether through amortization or prepayment. Second, deviations from par make mortgage prepayment options valuable, so lenders offering the product must manage option risk as well as interest rate risk. We conclude that while mortgages using a COF index have clear benefits for borrowers, they are also more difficult for lenders to price accurately. Further, once they are in lenders’ portfolios, they increase the complexity of interest rate risk management. While these issues do not imply that COF indices cannot be part of innovative new mortgage designs, understanding their financial characteristics may contribute to the search for a better mortgage.
PDF (500.0 KB, 22 pages)

19-24: Institution, Major, and Firm-Specific Premia: Evidence from Administrative Data by Ben Ost, Weixiang Pan, and Douglas Webber

We examine how a student’s major and the institution attended contribute to the labor market outcomes of young graduates. Administrative panel data that combine student transcripts with matched employer-employee records allow us to provide the first decomposition of premia into individual and firm-specific components. We find that both major and institutional premia are more strongly related to the firm-specific component of wages than the individual-specific component of wages. On average, a student’s major is a more important predictor of future wages than the selectivity of the institution attended, but major premia (and their relative ranking) can differ substantially across institutions, suggesting the importance of program-level data for prospective students and their parents.
PDF (377.0 KB, 26 pages)

19-23: A Generalized Factor Model with Local Factors by Simon Freyaldenhoven

I extend the theory on factor models by incorporating local factors into the model. Local factors only affect an unknown subset of the observed variables. This implies a continuum of eigenvalues of the covariance matrix, as is commonly observed in applications. I derive which factors are pervasive enough to be economically important and which factors are pervasive enough to be estimable using the common principal component estimator. I then introduce a new class of estimators to determine the number of those relevant factors. Unlike existing estimators, my estimators use not only the eigenvalues of the covariance matrix, but also its eigenvectors. I find strong evidence of local factors in a large panel of US macroeconomic indicators.
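The abstract's premise — pervasive factors produce large eigenvalues of the covariance matrix, while weak or local factors blur into the noise — can be seen in a simulated panel. The sketch below uses a standard eigenvalue-ratio rule (in the spirit of Ahn and Horenstein, 2013) to count factors; it illustrates the existing eigenvalue-only approach that the paper improves upon, not the paper's new eigenvector-based estimators:

```python
import numpy as np

rng = np.random.default_rng(1)
T, N, r = 200, 100, 3                      # periods, variables, true factors
F = rng.normal(size=(T, r))                # latent factors
L = rng.normal(size=(N, r))                # loadings (all factors pervasive here)
X = F @ L.T + rng.normal(size=(T, N))      # observed panel

# Eigenvalues of the sample covariance matrix, in descending order.
# Pervasive factors produce eigenvalues that grow with N; idiosyncratic
# noise eigenvalues stay bounded.
eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))[::-1]

# Crude eigenvalue-ratio rule: estimate the number of factors as the
# position of the largest gap between consecutive eigenvalues.
ratios = eigvals[:-1] / eigvals[1:]
k_hat = int(np.argmax(ratios[:10])) + 1    # recovers r = 3 here
```

With local factors loading on only a subset of the N series, the corresponding eigenvalues no longer separate cleanly from the noise — which is the gap the paper's eigenvector-based estimators are designed to address.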
Appendix
PDF (3.0 MB, 47 pages)

19-22: Consumer Lending Efficiency: Commercial Banks versus a Fintech Lender by Joseph P. Hughes, Julapa Jagtiani, and Choon-Geol Moon

We compare the performance of unsecured personal installment loans made by traditional bank lenders with that of LendingClub, using a stochastic frontier estimation technique to decompose the observed nonperforming loan ratio into three components. The first is the best-practice minimum ratio that a lender could achieve if it were fully efficient at credit-risk evaluation and loan management. The second is the difference between the observed ratio (adjusted for noise) and the minimum ratio, which gauges the lender’s relative proficiency at credit analysis and loan monitoring. The third is statistical noise. In 2013 and 2016, the largest bank lenders experienced the highest ratio of nonperformance, the highest inherent credit risk, and the highest lending efficiency, indicating that their high ratio of nonperformance is driven by inherent credit risk rather than by lending inefficiency. LendingClub’s performance was similar to that of small bank lenders as of 2013. As of 2016, LendingClub’s performance resembled that of the largest bank lenders — the highest ratio of nonperforming loans, inherent credit risk, and lending efficiency — although its loan volume was smaller. Our findings are consistent with a previous study suggesting that LendingClub became more effective in risk identification and pricing starting in 2015. We caution that this conclusion may not apply to fintech lenders in general, and the results may not hold under different economic conditions, such as a downturn.
PDF (770.0 KB, 37 pages)

19-21: Demographic Aging, Industrial Policy, and Chinese Economic Growth by Michael Dotsey, Wenli Li, and Fang Yang

We examine the role of demographics and changing industrial policies in accounting for the rapid rise in household savings and in per capita output growth in China since the mid-1970s. The demographic changes come from reductions in the fertility rate and increases in life expectancy, while the industrial policies take many forms. These policies cause important structural changes: first benefiting private labor-intensive firms by incentivizing them to increase their share of employment, and later benefiting capital-intensive firms, resulting in an increasing share of capital devoted to heavy industries. We conduct our analysis in a general equilibrium economy that also features endogenous human capital investment. We calibrate the model to match key economic variables of the Chinese economy and show that demographic changes and industrial policies both contributed to increases in savings and output growth but with differing intensities and at different horizons. We further demonstrate the importance of endogenous human capital investment in accounting for the economic growth in China.
PDF (825.0 KB, 45 pages)

19-20: Capitalization as a Two-Part Tariff: The Role of Zoning by H. Spencer Banzhaf and Kyle Mangum

This paper shows that the capitalization of local amenities is effectively priced into land via a two-part pricing formula: a “ticket” price paid regardless of the amount of housing service consumed and a “slope” price paid per unit of services. We first show theoretically how tickets arise as an extensive margin price when there are binding constraints on the number of households admitted to a neighborhood. We use a large national dataset of housing transactions, property characteristics, and neighborhood attributes to measure the extent to which local amenities are capitalized in ticket prices vis-à-vis slopes. We find that in most U.S. cities, the majority of neighborhood variation in pricing occurs via tickets, although the importance of tickets rises sharply with the stringency of land development regulations, as predicted by theory. We discuss implications of two-part pricing for efficiency and equity in neighborhood sorting equilibria and for empirical estimates of willingness to pay for nonmarketed amenities, which generally assume proportional pricing only.
PDF (1.0 MB, 88 pages)

19-19: Mortgage Loss Severities: What Keeps Them So High? by Xudong An and Larry Cordell

Mortgage loss-given-default (LGD) increased significantly when house prices plummeted and delinquencies rose during the financial crisis, but it has remained over 40 percent in recent years despite a strong housing recovery. Our results indicate that the sustained high LGDs post-crisis are due to a combination of an overhang of crisis-era foreclosures and prolonged foreclosure timelines, which have offset higher sales recoveries. Simulations show that cutting foreclosure timelines by one year would cause LGD to decrease by 5–8 percentage points, depending on the trade-off between lower liquidation expenses and lower sales recoveries. Using difference-in-differences tests, we also find that recent consumer protection programs have extended foreclosure timelines and increased loss severities in spite of their benefits of increasing loan modifications and enhancing consumer protections.
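The difference-in-differences comparison behind the consumer-protection result can be sketched in its simplest 2x2 form. The numbers below are hypothetical and purely illustrative — not the paper's data or estimates:

```python
import numpy as np

def did(y, treated, post):
    """2x2 difference-in-differences: compare the treated group's change
    across periods against the control group's change."""
    y, treated, post = map(np.asarray, (y, treated, post))
    m = lambda t, p: y[(treated == t) & (post == p)].mean()
    return (m(1, 1) - m(1, 0)) - (m(0, 1) - m(0, 0))

# Hypothetical group means (e.g., foreclosure timelines in months):
# the treated group rises by 5 more than the control group.
y       = [10, 12, 20, 27]
treated = [0,  0,  1,  1]
post    = [0,  1,  0,  1]
effect = did(y, treated, post)   # (27 - 20) - (12 - 10) = 5.0
```

The control group's change nets out common trends, isolating the treatment effect — the same logic the authors apply to programs that extended foreclosure timelines.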
Supersedes Working Paper 17-08.
PDF (838.0 KB, 49 pages)

19-18: The Firm Size and Leverage Relationship and Its Implications for Entry and Concentration in a Low Interest Rate World by Satyajit Chatterjee and Burcu Eyigungor

Larger firms (by sales or employment) have higher leverage. This pattern is explained using a model in which firms produce multiple varieties and borrow, with the option to default, against their future cash flow. A variety can die with a constant probability, implying that bigger firms (those with more varieties) have a lower coefficient of variation of sales and higher leverage. A lower risk-free rate benefits bigger firms more, as they are able to lever more, and existing firms buy more of the new varieties arriving in the economy. This leads to lower startup rates and greater concentration of sales.
PDF (577.0 KB, 37 pages)

19-17: Building Credit History with Heterogeneously Informed Lenders by Natalia Kovrijnykh, Igor Livshits, and Ariel Zetlin-Jones

This paper examines a novel mechanism of credit-history building as a way of aggregating information across multiple lenders. We build a dynamic model with multiple competing lenders, who have heterogeneous private information about a consumer's creditworthiness, and extend credit over multiple stages. Acquiring a loan at an early stage serves as a positive signal: it allows the borrower to convey to other lenders the existence of a positively informed lender (advancing that early loan) — thereby convincing other lenders to extend further credit in future stages. This signaling may be costly to the least risky borrowers for two reasons. First, taking on an early loan may involve cross-subsidization from the least risky borrowers to more risky borrowers. Second, the least risky borrowers may take inefficiently large loans relative to the symmetric-information benchmark. We demonstrate that, despite these two possible costs, the least risky borrowers often prefer these equilibria to those without information aggregation. Our analysis offers an interesting and novel insight into debt dilution. Contrary to the conventional wisdom, repayment of the early loan is more likely when a borrower subsequently takes on a larger rather than a smaller additional loan. This result hinges on a selection effect: larger subsequent loans are only given to the least risky borrowers.
PDF (506.0 KB, 45 pages)

19-16: Beautiful City: Leisure Amenities and Urban Growth by Gerald A. Carlino and Albert Saiz

Modern urban economic theory and policymakers are coming to see the provision of consumer-leisure amenities as a way to attract population, especially the highly skilled and their employers. However, past studies have arguably only provided indirect evidence of the importance of leisure amenities for urban development. In this paper, we propose and validate the number of tourist trips and the number of crowdsourced picturesque locations as measures of consumer revealed preferences for local lifestyle amenities. Urban population growth in the 1990-2010 period was about 10 percentage points (about one standard deviation) higher in a metro area that was perceived as twice as picturesque. This measure ties with low taxes as the most important predictor of urban population growth. “Beautiful cities” disproportionately attracted highly educated individuals and experienced faster housing price appreciation, especially in supply-inelastic markets. In contrast to the generally declining trend of the American central city, neighborhoods that were close to central recreational districts have experienced economic growth, albeit at the cost of minority displacement.
Supersedes Working Paper 08-22.
PDF (1.0 MB, 57 pages)

19-15: Banking Regulation with Risk of Sovereign Default by Pablo D’Erasmo, Igor Livshits, and Koen Schoors

Banking regulation routinely designates some assets as safe and thus does not require banks to hold any additional capital to protect against losses from these assets. A typical such safe asset is domestic government debt. There are numerous examples of banking regulation treating domestic government bonds as “safe,” even when there is clear risk of default on these bonds. We show, in a parsimonious model, that this failure to recognize the riskiness of government debt allows (and induces) domestic banks to “gamble” with depositors’ funds by purchasing risky government bonds (and assets closely correlated with them). A sovereign default in this environment then results in a banking crisis. Critically, we show that permitting banks to gamble this way lowers the cost of borrowing for the government. Thus, if the borrower and the regulator are the same entity (the government), that entity has an incentive to ignore the riskiness of the sovereign bonds. We present empirical evidence in support of the key mechanism we are highlighting, drawing on the experience of Russia in the run-up to its 1998 default and on the recent Eurozone debt crisis.
PDF (505.0 KB, 42 pages)

19-14: We Are All Behavioral, More or Less: Measuring and Using Consumer-Level Behavioral Sufficient Statistics by Victor Stango and Jonathan Zinman

Can a behavioral sufficient statistic empirically capture cross-consumer variation in behavioral tendencies and help identify whether behavioral biases, taken together, are linked to material consumer welfare losses? Our answer is yes. We construct simple consumer-level behavioral sufficient statistics — “B-counts” — by eliciting seventeen potential sources of behavioral biases per person, in a nationally representative panel, in two separate rounds nearly three years apart. B-counts aggregate information on behavioral biases within-person. Nearly all consumers exhibit multiple biases, in patterns assumed by behavioral sufficient statistic models (à la Chetty), and with substantial variation across people. B-counts are stable within-consumer over time, and that stability helps to address measurement error when using B-counts to model the relationship between biases, decision utility, and experienced utility. Conditional on classical inputs — risk aversion and patience, life-cycle factors and other demographics, cognitive and non-cognitive skills, and financial resources — B-counts strongly negatively correlate with both objective and subjective aspects of experienced utility. The results hold in much lower-dimensional models employing “Sparsity B-counts” based on bias subsets (à la Gabaix) and/or fewer covariates, illuminating lower-cost ways to use behavioral sufficient statistics to help capture the combined influence of multiple behavioral biases for a wide range of research questions and applications.
PDF (1.0 MB, 98 pages)

19-13: A Shortage of Short Sales: Explaining the Underutilization of a Foreclosure Alternative by Calvin Zhang

The Great Recession led to widespread mortgage defaults, with borrowers resorting to both foreclosures and short sales to resolve their defaults. I first quantify the economic impact of foreclosures relative to short sales by comparing the home price implications of both. After accounting for omitted variable bias, I find that homes selling as short sales transact at 9.2% to 10.5% higher prices on average than those that sell after foreclosure. Short sales also exert smaller negative externalities than foreclosures, with one short sale decreasing nearby property values by 1 percentage point less than a foreclosure. So why weren’t short sales more prevalent? These home price benefits did not increase the prevalence of short sales because free rents during foreclosures caused more borrowers to select foreclosures, even though higher advances led servicers to prefer more short sales. In states with longer foreclosure timelines, the benefits from foreclosures increased for borrowers, so short sales were less utilized. I find that a one-standard-deviation increase in the average length of the foreclosure process decreased the short sale share by 0.35 to 0.45 standard deviations. My results suggest that policies that increase the relative attractiveness of short sales could help stabilize distressed housing markets.
PDF (1.0 MB, 64 pages)

19-12 Revised: A Dynamic Model of Intermediated Consumer Credit and Liquidity by Pedro Gomis-Porqueras and Daniel Sanches

We construct a tractable model of consumer credit with settlement frictions (i.e., a consumer credit market that relies on a secondary market for privately issued debt claims to operate) to study the role of monetary policy in the efficient functioning of the payments system. In our framework, intermediaries hold reserves across periods to take advantage of rediscounting opportunities, and monetary policy influences the equilibrium allocation through the interest rate on reserves. We characterize the conditions for the existence of an allocation in which privately issued debt claims are not discounted in equilibrium. We also discuss the role of monetary policy in the payments system across different market structures in the intermediary sector and characterize the minimum size of the intermediary sector required to attain efficiency.
PDF (360.0 KB, 40 pages)

19-11: Toward a Framework for Time Use, Welfare, and Household-Centric Economic Measurement by Diane Coyle and Leonard Nakamura

What is meant by economic progress, and how should it be measured? The conventional answer is growth in real GDP over time or compared across countries, a monetary measure adjusted for the general rate of increase in prices. However, there is increasing interest in developing an alternative understanding of economic progress, particularly in the context of digitalization of the economy and the consequent significant changes Internet use is bringing about in production and household activity. This paper discusses one alternative approach, combining an extended utility framework considering time allocation over paid work, household work, leisure, and consumption with measures of objective or subjective well-being while engaging in different activities. Developing this wider economic welfare measure would require the collection of time use statistics as well as well-being data and direct survey evidence, such as the willingness to pay for leisure time. We advocate an experimental set of time and well-being accounts, with a particular focus on the digitally driven shifts in behavior.
PDF (492.0 KB, 26 pages)

19-10: Frictional Intermediation in Over-the-Counter Markets by Julien Hugonnier, Benjamin Lester, and Pierre-Olivier Weill

We extend Duffie, Gârleanu, and Pedersen’s (2005) search-theoretic model of over-the-counter (OTC) asset markets, allowing for a decentralized inter-dealer market with arbitrary heterogeneity in dealers’ valuations or inventory costs. We develop a solution technique that makes the model fully tractable and allows us to derive, in closed form, theoretical formulas for key statistics analyzed in empirical studies of the intermediation process in OTC markets. A calibration to the market for municipal securities reveals that the model can generate trading patterns and prices that are quantitatively consistent with the data. We use the calibrated model to compare the gains from trade that are realized in this frictional market with those from a hypothetical, frictionless environment, and to distinguish between the quantitative implications of various types of heterogeneity across dealers.
Supersedes Working Paper 15-22.
PDF (549.0 KB, 80 pages)

19-09: Investigating Nonneutrality in a State-Dependent Pricing Model with Firm-Level Productivity Shocks by Michael Dotsey and Alexander L. Wolman

In recent years, there has been an abundance of empirical work examining price setting behavior at the micro level. First generation models with price setting rigidities were generally at odds with much of the micro price data. A second generation of models, with fixed costs of price adjustment and idiosyncratic shocks, have attempted to rectify this shortcoming. Using a model that matches a large set of microeconomic facts we find significant nonneutrality. We decompose the nonneutrality and find that state dependence plays an important part in the responses of output and inflation to a monetary shock. We also examine how aggregating firm behavior can generate flat hazards. Last, we find that the steady state statistic developed by Alvarez, Le Bihan, and Lippi (2016) is an imperfect guide to characterizing nonneutrality in our model.
PDF (596.0 KB, 41 pages)

19-08: From Incurred Loss to Current Expected Credit Loss (CECL): A Forensic Analysis of the Allowance for Loan Losses in Unconditionally Cancelable Credit Card Portfolios by José J. Canals-Cerdá

The Current Expected Credit Loss (CECL) framework represents a new approach for calculating the allowance for credit losses. Credit cards are the most common form of revolving consumer credit and are likely to present conceptual and modeling challenges during CECL implementation. We look back at nine years of account-level credit card data, starting with 2008, over a time period encompassing the bulk of the Great Recession as well as several years of economic recovery. We analyze the performance of the CECL framework under plausible assumptions about allocations of future payments to existing credit card loans, a key implementation element. Our analysis focuses on three major themes: defaults, balances, and credit loss. Our analysis indicates that allowances are significantly impacted by specific payment allocation assumptions as well as downturn economic conditions. We also compare projected allowances with realized credit losses and observe a significant divergence resulting from the revolving nature of credit card portfolios. We extend our analysis across segments of the portfolio with different risk profiles. Interestingly, less risky segments of the portfolio are proportionally more impacted by specific payment assumptions and downturn economic conditions. Our findings suggest that the effect of the new allowance framework on a specific credit card portfolio will depend critically on its risk profile. Thus, our findings should be interpreted qualitatively, rather than quantitatively. Finally, our goal is to gain a better understanding of the sensitivity of allowances to plausible variations in assumptions about the allocation of future payments to existing credit card loans; thus, we do not offer specific best-practice guidance.
PDF (1.0 MB, 41 pages)

19-07: Incumbency Disadvantage of Political Parties: The Role of Policy Inertia and Prospective Voting by Satyajit Chatterjee and Burcu Eyigungor

We document that postwar U.S. elections show a strong pattern of “incumbency disadvantage”: If a party has held the presidency of the country or the governorship of a state for some time, that party tends to lose popularity in the subsequent election. To explain this fact, we employ Alesina and Tabellini's (1990) model of partisan politics, extended to have elections with prospective voting. We show that inertia in policies, combined with sufficient uncertainty in election outcomes, implies incumbency disadvantage. We find that inertia can cause parties to target policies that are more extreme than the policies they would support in the absence of inertia and that such extremism can be welfare reducing.
Supersedes Working Paper 17-43.
PDF (698.0 KB, 64 pages)

19-06: How Big Is the Wealth Effect? Decomposing the Response of Consumption to House Prices by S. Borağan Aruoba, Ronel Elul, and Şebnem Kalemli-Özcan

We investigate the effect of declining house prices on household consumption behavior during 2006–2009. We use an individual-level dataset that has detailed information on borrower characteristics, mortgages and credit risk. Proxying consumption by individual-level auto loan originations, we decompose the effect of declining house prices on consumption into three main channels: wealth effect, household financial constraints, and bank health. We find a negligible wealth effect. Tightening household-level financial constraints can explain 40-45 percent of the response of consumption to declining house prices. Deteriorating bank health leads to reduced credit supply both to households and firms. Our dataset allows us to estimate the effect of this on households as 20-25 percent of the consumption response. The remaining 35 percent is a general equilibrium effect that works via a decline in employment as a result of either lower credit supply to firms or the feedback from lower consumer demand. Our estimate of a negligible wealth effect is robust to accounting for the endogeneity of house prices and unemployment. The contribution of tightening household financial constraints goes down to 35 percent, whereas declining bank credit supply to households captures about half of the overall consumption response, once we account for endogeneity.
PDF (552.0 KB, 36 pages)

19-05: Firm Wages in a Frictional Labor Market by Leena Rudanko

This paper studies a labor market with directed search, where multi-worker firms follow a firm wage policy: They pay equally productive workers the same. The policy reduces wages, due to the influence of firms’ existing workers on their wage setting problem, increasing the profitability of hiring. It also introduces a time-inconsistency into the dynamic firm problem, because firms face a less elastic labor supply in the short run. To consider outcomes when firms reoptimize each period, I study Markov perfect equilibria, proposing a tractable solution approach based on standard Euler equations. In two applications, I first show that firm wages dampen wage variation over the business cycle, amplifying the variation in unemployment, with quantitatively significant effects. Second, I show that firms following a firm wage policy may find it profitable to fix wages for a period of time, and that an equilibrium with fixed wages can be good for worker welfare, despite added volatility in the labor market.
PDF (671.0 KB, 61 pages)

19-04: Bank Size and Household Financial Sentiment: Surprising Evidence from the University of Michigan Surveys of Consumers by Allen N. Berger, Felix Irresberger, and Raluca A. Roman

We analyze comparative advantages/disadvantages of small and large banks in improving household sentiment regarding financial conditions. We match sentiment data from the University of Michigan Surveys of Consumers with local banking market data from 2000 to 2014. Surprisingly, the evidence suggests that large rather than small banks have significant comparative advantages in boosting household sentiment. Findings are robust to instrumental variables and other econometric methods. Additional analyses are consistent with both scale economies and the superior safety of large banks as channels behind the main findings. These channels appear to more than offset stronger relationships with and greater trust in small banks.
PDF (1.0 MB, 69 pages)

19-03 Revised: Elasticities of Labor Supply and Labor Force Participation Flows by Isabel Cairó, Shigeru Fujita, and Camilo Morales-Jiménez

Using a representative-household search and matching model with endogenous labor force participation, we study the interactions between extensive-margin labor supply elasticities and the cyclicality of labor force participation flows. Our model successfully replicates salient business-cycle features of all transition rates between three labor market states, the unemployment rate, and the labor force participation rate, while using values of elasticities consistent with micro evidence. Our results underscore the importance of the procyclical opportunity cost of employment, together with wage rigidity, in understanding the cyclicality of labor market flows and stocks.
PDF (6.0 MB, 71 pages)

19-02: Financial Consequences of Identity Theft: Evidence from Consumer Credit Bureau Records by Nathan Blascak, Julia Cheney, Robert Hunt, Vyacheslav Mikhed, Dubravka Ritter, and Michael Vogan

This paper examines how a negative shock to the security of personal finances due to severe identity theft changes consumer credit behavior. Using a unique data set of consumer credit records and alerts indicating identity theft and the exogenous timing of victimization, we show that the immediate effects of fraud on credit files are typically negative, small, and transitory. After those immediate effects fade, identity theft victims experience persistent, positive changes in credit characteristics, including improved Risk Scores. Consumers also exhibit caution with credit by having fewer open revolving accounts while maintaining total balances and credit limits. Our results are consistent with consumer inattention to credit reports prior to identity theft and reduced trust in credit card markets after identity theft.
Supersedes Working Paper 16-27.
PDF (1.0 MB, 46 pages)

19-01: Leaving Households Behind: Institutional Investors and the U.S. Housing Recovery by Lauren Lambie-Hanson, Wenli Li, and Michael Slonkosky

Ten years after the mortgage crisis, the U.S. housing market has rebounded significantly, with house prices now near the peak achieved during the boom. Homeownership rates, on the other hand, have continued to decline. We reconcile the two phenomena by documenting the rising presence of institutional investors in this market. Our analysis makes use of housing transaction data. By exploiting heterogeneity in zip codes' exposure to the First Look program instituted by Fannie Mae and Freddie Mac, which affected investors' access to foreclosed properties, we establish a causal relationship between the increasing presence of institutions in the housing market and the subsequent recovery in house prices and decline in homeownership rates between 2007 and 2014. We further demonstrate that institutional investors contributed to the improvement in the local labor market by reducing the overall unemployment rate and increasing total employment, particularly construction employment. Local housing rents also rose.
PDF (414.0 KB, 33 pages)