Big Data for Twenty-First-Century Economic Statistics

12/31/2021
Featured in the print edition of the NBER Reporter


Katharine G. Abraham, Ron S. Jarmin, Brian C. Moyer, and Matthew D. Shapiro, editors

The existing infrastructure for producing key economic statistics relies heavily on data collected through sample surveys and periodic censuses, together with administrative records generated by the tax system. The increasing difficulty of obtaining survey and census responses threatens the viability of these approaches.

The growing availability of new sources of Big Data — such as scanner data on purchases, credit card transaction records, payroll information, and prices of goods scraped from the websites of online sellers — has changed the data landscape. These sources hold the promise of allowing statistical agencies to produce more accurate, more disaggregated, and more timely economic data to meet the needs of policymakers and other data users.

This volume documents progress made toward that goal and the challenges to be overcome to realize the full potential of Big Data in the production of economic statistics. It describes the deployment of Big Data to solve both existing and novel challenges in economic measurement, and will be of interest to statistical agency staff, academic researchers, and other serious users of economic statistics.

https://press.uchicago.edu/ucp/books/book/chicago/B/bo136254067.html