Combining satellite imagery with machine learning (SIML) has the potential to address global challenges by remotely estimating socioeconomic and environmental conditions in data-poor regions, yet the resource requirements of SIML limit its accessibility and use. Rolf, Proctor, Carleton, Bolliger, Shankar, Ishihara, Recht, and Hsiang show that a single encoding of satellite imagery can generalize across diverse prediction tasks (e.g., forest cover, house prices, road length). Their method achieves accuracy competitive with deep neural networks at orders of magnitude lower computational cost, scales globally, delivers label super-resolution predictions, and facilitates characterizations of uncertainty. Because image encodings are shared across tasks, they can be computed centrally and distributed to an unlimited number of researchers, each of whom need only fit a linear regression to their own ground-truth data to achieve state-of-the-art SIML performance.
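The workflow described above — a shared, precomputed image encoding plus a cheap task-specific linear regression — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the features and labels are synthetic stand-ins, and closed-form ridge regression is assumed as the regularized linear model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tiles, n_features = 1000, 64

# Synthetic stand-in for the shared, precomputed encoding:
# one row of task-agnostic features per satellite image tile.
X = rng.normal(size=(n_tiles, n_features))

# Hypothetical ground-truth labels for one downstream task
# (e.g., forest cover), a noisy linear function of the features.
w_true = rng.normal(size=n_features)
y = X @ w_true + rng.normal(scale=0.1, size=n_tiles)

# 80/20 train/test split.
X_tr, y_tr = X[:800], y[:800]
X_te, y_te = X[800:], y[800:]

# Task-specific step: closed-form ridge regression on the shared
# features -- no deep network training required by the end user.
alpha = 1.0
w_hat = np.linalg.solve(X_tr.T @ X_tr + alpha * np.eye(n_features),
                        X_tr.T @ y_tr)

# Held-out R^2 as a simple accuracy check.
resid = y_te - X_te @ w_hat
r2 = 1 - resid.var() / y_te.var()
print(f"held-out R^2: {r2:.3f}")
```

The point of the design is that only the last few lines are task-specific: a new task means new labels and a new regression, while the expensive encoding step is done once and shared.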
The value of time determines relative prices of goods and services, investments, productivity, economic growth, and measurements of income inequality. Beginning in the 1960s, economists focused on the value of non-work time, pioneering a deep literature on the optimal allocation and value of time. Leveraging key features of these classic time-allocation theories, Goldszmidt, List, Metcalfe, Muir, Smith, and Wang use a novel approach to estimate the value of time (VOT) via two large-scale natural field experiments with the ridesharing company Lyft. They use random variation in both wait times and prices to estimate consumers' VOT with a data set of more than 14 million observations across consumers in U.S. cities. The researchers find that the VOT is roughly $19 per hour (about 75% of the after-tax mean wage rate and 100% of the after-tax median) and varies predictably with choice circumstances correlated with the opportunity cost of wait time. Their VOT estimate is larger than the figure currently used by the U.S. Government, suggesting that society under-values time improvements and consequently under-invests public resources in time-saving infrastructure projects and technologies.
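The wage comparison above implies specific after-tax wage rates, which a reader can recover with simple arithmetic; this sketch only rearranges the percentages reported in the summary.

```python
vot = 19.0  # estimated value of time, $/hour (from the summary)

# VOT = 75% of the after-tax mean wage  =>  mean wage = VOT / 0.75
implied_mean_wage = vot / 0.75

# VOT = 100% of the after-tax median wage  =>  median wage = VOT
implied_median_wage = vot / 1.00

print(f"implied after-tax mean wage:   ${implied_mean_wage:.2f}/hour")
print(f"implied after-tax median wage: ${implied_median_wage:.2f}/hour")
```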
In addition to the conference paper, the research was distributed as NBER Working Paper w28208, which may be a more recent version.
Evaluations of energy efficiency programs reveal that realized savings consistently fall short of projections. Christensen, Francisco, Myers, and Souza decompose this 'performance wedge' using data from the Illinois Home Weatherization Assistance Program (IHWAP) and a machine learning-based event study research design. They find that bias in engineering models can account for up to 41% of the wedge, primarily from overestimated savings in wall insulation. Heterogeneity in workmanship can also account for a large fraction (43%) of the wedge, while the rebound effect can explain only 6%. The researchers find substantial heterogeneity in energy-related benefits from IHWAP projects, suggesting opportunities for better targeting of investments.
This paper estimates the Brazilian Amazon's efficient forestation level. Araujo, Costa, and Sant'Anna propose a dynamic discrete choice model of land use and estimate it using a remote-sensing panel of 5.7 billion pixels at 30-meter resolution, covering land use and carbon stock from 2008 to 2017. They estimate that a business-as-usual scenario will generate an inefficient loss of 1,075,000 km² of forest cover in the long run, an area almost twice the size of France, implying the release of 44 billion tons of CO₂. The researchers quantify the potential of carbon and cattle-production taxes to mitigate inefficient deforestation. They find that relatively small carbon taxes can mitigate a substantial part of the inefficient forest loss and emissions, while only very large taxes on cattle production would achieve a similar effect.
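The magnitudes above can be sanity-checked with back-of-the-envelope arithmetic. The only figure not taken from the summary is France's metropolitan area, roughly 544,000 km².

```python
forest_loss_km2 = 1_075_000   # projected inefficient forest loss (summary)
co2_release_tons = 44e9       # projected CO2 release (summary)
france_area_km2 = 544_000     # approx. metropolitan France (external figure)

# Area comparison: how many "Frances" is the projected loss?
frances = forest_loss_km2 / france_area_km2

# Implied average emissions density per hectare of lost forest.
hectares = forest_loss_km2 * 100  # 1 km^2 = 100 ha
co2_per_ha = co2_release_tons / hectares

print(f"forest loss ~= {frances:.2f} x France")
print(f"implied density ~= {co2_per_ha:.0f} t CO2 per hectare of lost forest")
```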
Costello and Kotchen examine the interplay between environmental policy instrument choice (i.e., prices vs. quantities) and private provision of public goods, which in this context they denote 'Coasean provision.' Coasean provision captures private provision of environmental public goods due to consumer preferences for environmentally friendly goods and services, incentives for corporate environmental management, environmental philanthropy, and even overlapping jurisdictions of policy. Costello and Kotchen show theoretically that even in a world of perfect certainty, the presence of Coasean provision distinctly affects instrument choice based on the efficiency criterion. The researchers generalize the analysis to account for uncertainty using the classic Weitzman (1974) framework, showing that Coasean provision favors prices over quantities under uncertainty about either the marginal benefits or the marginal costs of pollution. Their findings suggest that the increasing prevalence of Coasean provision motivates rethinking the design of effective and efficient environmental policy instruments in many settings.
In addition to the conference paper, the research was distributed as NBER Working Paper w28130, which may be a more recent version.
In June 2020, the Environmental Protection Agency and the Army Corps of Engineers narrowed the definition of 'Waters of the United States' (WOTUS), significantly limiting wetland protection under the Clean Water Act. Current policy debates and litigation center on uncertainty about wetland benefits, especially damages from flooding, the costliest and most frequent natural disaster. Taylor and Druckenmiller estimate the value of wetlands for flood mitigation across the entire US using detailed flood-claim and land-use data. Employing three different identification strategies, they find that a hectare of wetland provides $2,300 in annual flood-mitigation value when accounting for spatial spillovers. Their results indicate that wetland loss between 2001 and 2016 increased flood claims by $535 million (or 19%) annually. The spatial heterogeneity Taylor and Druckenmiller document in wetland benefits has implications for the WOTUS rule change, which affects the roughly 50% of wetlands classified as 'isolated.'
Politicians may target public goods to benefit their constituents at the expense of others. Mahadevan studies this phenomenon in the context of Indian electricity and estimates its distributive consequences. Using close-election regression discontinuities and new administrative billing data, she shows that billed electricity consumption is lower in constituencies of the winning party, while actual consumption, measured by nighttime lights, is higher. She thus documents a covert way in which politicians subsidize constituents by manipulating bills. These actions generate a substantial deadweight loss of over $0.5 billion, hurting utilities' ability to provide reliable electricity, with significant negative consequences for development.
Although improving agricultural productivity is vital to anti-poverty and food security goals, its ecological effects are theoretically ambiguous. Increasing the relative value of agricultural land may spur deforestation, but factor market constraints paired with improvements in existing land productivity may reduce the demand for shifting cultivation. Leveraging the discontinuity in eligibility for a large agricultural extension program, Abman, Garg, Pan, and Singhal find that the program reduced deforestation by 13%. The program increased adoption of promoted practices such as manure use and crop rotation, resulting in higher productivity but no increase in cultivated area. Suitably designed programs that improve agricultural productivity may thus also enable conservation.
Natural disaster losses can be mitigated through investments in structure hardening. When property owners do not correctly perceive risks, or when there are spatial externalities, it may be beneficial to mandate such investments through building codes. Baylis and Boomhower provide the first comprehensive evaluation of the effect of California's wildfire building codes on structure survival. They combine administrative damage data from several states, covering almost all U.S. homes destroyed by wildfire since 2007, and merge these data with the universe of assessor data for destroyed and surviving homes inside wildfire perimeters. Baylis and Boomhower find striking vintage effects in resilience for California homes built after 1995. Using differences in code requirements across jurisdictions, they show that these vintage effects are due to state and local building-code changes prompted by the deadly 1991 Oakland Firestorm. Moreover, they find that these improvements increase the survival probability of neighboring homes through reduced structure-to-structure spread. Their results imply that property losses during recent wildfire seasons would have been several billion dollars smaller had all older homes been built to current standards.