Field Experiments in the Science of Science: Lessons from Peer Review and the Evaluation of New Knowledge
Scientific evaluation and peer review govern the allocation of resources and the certification of knowledge in science, yet they have received limited causal investigation. This chapter synthesizes randomized experiments embedded in live peer-evaluation systems at journals, conferences, and funding agencies, restricting attention to published studies. I organize this evidence using a Q–A–R–S framework that decomposes peer review into attributes of submissions (Q), authors (A), reviewers (R), and evaluation systems (S), and interpret outcomes through the lens of the core problem of scientific evaluation: assessing new knowledge using the existing stock of knowledge.
The chapter treats experimental design choices as objects of analysis, assessing what existing interventions can—and cannot—identify given their designs and settings, the institutional constraints they face, and opportunities for higher-leverage experimentation. I show that randomized experimentation embedded in peer review spans the full Q–A–R–S space, albeit sparsely, and yields uneven but informative insights across different margins.
Based on the full body of evidence, I advance several novel claims: (1) system interventions often affect participant behavior with little impact on core evaluative judgments; (2) core evaluations are most clearly shaped by who reviews and their expertise; and (3) peer review functions more reliably as a “filter” of poor submissions than as a fine-grained “ranker” of acceptable submissions. Overall, the evidence points to a functioning institution operating under binding epistemic and organizational constraints, rather than to systemic failure. I identify channels for improving the speed, cost, and reliability of scientific evaluation institutions.
Substantial scope remains to redesign embedded experiments to increase inferential power, generalizability, and cumulative insight, while reducing disruption and more tightly linking to institutional innovation and policy changes.
Kevin J. Boudreau, Economics of Science (University of Chicago Press, 2026), chap. 3, https://www.nber.org/books-and-chapters/economics-science/field-experiments-science-science-lessons-peer-review-and-evaluation-new-knowledge.