
Three strong moves to improve research and replications alike

Published online by Cambridge University Press: 27 July 2018

Roger Giner-Sorolla
Affiliation:
School of Psychology – Keynes College, University of Kent, Canterbury, Kent CT2 7NP, United Kingdom. [email protected] http://www.kent.ac.uk/psychology/people/ginerr/
David M. Amodio
Affiliation:
Department of Psychology, New York University, New York, NY 10003. [email protected] http://amodiolab.org/; Department of Social Psychology, University of Amsterdam, 1018 WS Amsterdam, The Netherlands.
Gerben A. van Kleef
Affiliation:
Department of Social Psychology, University of Amsterdam, 1018 WS Amsterdam, The Netherlands. [email protected] http://www.uva.nl/profile/g.a.vankleef/

Abstract

We suggest three additional improvements to replication practices. First, original research should include concrete checks on validity, encouraged by editorial standards. Second, the reasons for replicating a particular study should be more transparent and balance systematic positive reasons with selective negative ones. Third, methodological validity should also be factored into evaluating replications, with methodologically inconclusive replications not counted as non-replications.

Type: Open Peer Commentary

Copyright: © Cambridge University Press 2018

