Incentives for Replication in Economics
Replication is a critical component of scientific credibility, as it increases our confidence in the reliability of the knowledge generated by original research. Yet replication remains the exception rather than the rule in economics. In this paper, we examine why replication is so rare and propose changes to the incentives to replicate. Our study focuses on software code replication, which seeks to reproduce the results of the original paper using the same data as the original study and to verify that the analysis code is correct. We analyse the effectiveness of the current model of code replication against three desirable characteristics: unbiasedness, fairness and efficiency. We find substantial evidence of “overturn bias,” which likely produces many false positives in the form of claimed mistakes in the original analysis. Overturn bias arises because replications that overturn original results are much easier to publish than those that confirm them. In a survey of editors, almost all responded that they would in principle publish a replication study that overturned the results of the original study, but only 29% responded that they would consider publishing a replication study that confirmed the original results. We also find that most replication effort is devoted to so-called important papers, and that the cost of replication is high because posted data and software are often very hard to use. We outline a new model in which journals take over replication after acceptance but before publication, which would solve the incentive problems raised in this paper.
We gratefully acknowledge funding for this research from the Berkeley Initiative for Transparency in the Social Sciences, a program of the Center for Effective Global Action (CEGA), with support from the Laura and John Arnold Foundation. The paper has also benefited from comments by Abhijit Banerjee, Annette Brown, Rob Jensen, Temina Madon, Ted Miguel, Don Moore, Emily Oster, Jennifer Sturdy, Sarah White, Benjamin Wood and participants in the 2016 BITSS annual meeting. Ada Kwan and Alexandra Wall provided excellent research assistance. The authors have no material or financial interests in the results of the paper. The views expressed herein are those of the authors and do not necessarily reflect the views of the National Bureau of Economic Research.