
Methodological checklists for improving research quality and reporting consistency

Published online by Cambridge University Press:  01 May 2020

Lillian T. Eby*, University of Georgia
Kristen M. Shockley, University of Georgia
Talya N. Bauer, Portland State University
Bryan Edwards, Oklahoma State University
Astrid C. Homan, University of Amsterdam
Russell Johnson, Michigan State University
Jonas W. B. Lang, Ghent University and University of Exeter
Scott B. Morris, Illinois Institute of Technology
Frederick L. Oswald, Rice University

*Corresponding author. Email: [email protected]

Type: Commentaries

Copyright: © Society for Industrial and Organizational Psychology, Inc. 2020


Footnotes

The third through ninth authors contributed equally and are listed alphabetically.

The ideas presented in this commentary do not reflect formal policy at any journal (e.g., Journal of Applied Psychology) or professional association (e.g., American Psychological Association).
