The replication crisis, the rise of new research practices and what it means for experimental economics
Published online by Cambridge University Press: 17 January 2025
Abstract
The replication crisis that has unfolded across several disciplines raises challenges for the behavioural sciences in general. In this report, we review the lessons these developments hold for experimental economists. We present the new research methods and practices that have been proposed to improve the replicability of scientific studies. We discuss how these methods and practices can have a positive impact in experimental economics and the extent to which they should be encouraged.
- Type: Original Paper
- Information: Journal of the Economic Science Association, Volume 7, Issue 2: Special Issue: Experiments in the time of COVID-19 – Challenges and Insights, December 2021, pp. 210–225
- Copyright: © 2021 The Author(s), under exclusive licence to Economic Science Association