We commend Rotolo et al. (Reference Rotolo, Church, Adler, Smither, Colquitt, Shull and Foster2018) for introducing a new lens for viewing the well-known gap between industrial and organizational (I-O) psychology research and human resource (HR) practices in organizations. However, Rotolo et al.’s characterization of practitioner behavior as “anti I-O” suggests a particularly negative view of scientific research among some HR practitioners. The label implies that some HR practitioners are intentionally ignoring or actively resisting academic research. More likely, the behavior stems from a passive indifference to academia, which may be the appropriate attitude for some practitioners to adopt when a great deal of academic research is too slow, too theoretical, and too cryptically communicated to be useful in applied settings. We agree with Rotolo et al. when they say, “we are a discipline that is not geared for being cutting edge” (p. 182), and we appreciate their recommendations for addressing this lack of relevance. However, most recommendations in this broader discussion do not address the foundational problem within our field: a systemic mismatch between the incentives of practitioners and academics. To support this point, we briefly describe a typology of I-O psychologists as well as the varying contexts and incentives that drive their behavior. We then close with our own recommendations for how academia can improve its relevance to practitioners and close the gap. These changes are not easy, but we agree with Rotolo and colleagues that if any field can address such foundational problems, it is ours.
A Typology of Organizational Psychologists
Rotolo et al. (Reference Rotolo, Church, Adler, Smither, Colquitt, Shull and Foster2018) characterize organizational psychologists as “scientist–practitioners.” This popular term represents a specific location on a research–practice continuum at which attention to research and application are carefully balanced. This may represent a macro property of our field, but, in our experience, it does not adequately capture the diversity of our field at the micro level (i.e., the specific contexts in which individual researchers and practitioners operate). Instead, expanding upon a typology briefly described by Rupp and Beal (Reference Rupp and Beal2007), we see at least four distinct patterns in our I-O psychology colleagues: pure scientists, scientist–practitioners, practitioner–scientists, and pure practitioners.
The Pure Scientist
Pure scientists in I-O psychology are traditional academics who want to understand and explain the causes of human behavior within the workplace and organizations. They are incentivized primarily to publish in prestigious journals, earn tenure, and obtain grants to support their programs of research. They likely serve as editors or associate editors for one or more research-oriented journals. To these individuals, advancing theory and pursuing “knowledge for knowledge's sake” are legitimate endeavors. Typically, they are not focused on, nor rewarded for, researching topics addressing emerging HR trends, developing tools useful for practitioners, or expounding on the practical implications of their research.
The Scientist–Practitioner
Scientist–practitioners are also predominantly, though not exclusively, found in universities. They are researchers, first and foremost, but they care deeply about having real-world impact and often partner with HR professionals and/or I-O psychology colleagues in organizations to develop and implement evidence-based HR products and procedures. These individuals can still build successful academic careers through high-quality publications on practically oriented topics. However, because they are employed in universities, they are subject to the same incentive system as pure scientists, and those requirements can create tension with their applied goals. Scientist–practitioners located in applied research institutions typically conduct practice-oriented research, and although they may periodically publish in academic journals, they are not constrained by the same demands as university academics.
The Practitioner–Scientist
Practitioner–scientists typically live in large for-profit organizations, but they can also be found in large external consulting firms, large nonprofit organizations, and a variety of state and federal government roles. They primarily focus on adding value to organizations by advising on human resource strategies and implementing research-based practices. However, they are personally motivated to contribute expertise to the collective knowledge pool through publications in academic journals and books. Additionally, they may collaborate with academics on research projects. They tend to invest extra-role efforts to maintain familiarity with current research and strive to implement best practices consistent with research findings.
The Pure Practitioner
Pure practitioners are usually employed by the same organizations as practitioner–scientists. They are driven almost exclusively by organizational needs. They are either uninterested in or too constrained by time and resources to read academic research, but they may learn from practitioner-focused outlets such as Harvard Business Review. These individuals may feel that pure science is disconnected from the realities they face every day in real organizations. They may focus more on trending topics and on signaling their value to organizational leaders than on the evidentiary support of their work. They tend to rely on a network of vendors to provide best practices rather than recent research findings.
Systemic Pressures
As Garman (Reference Garman2011) notes regarding academia and practice, “within the two contexts, success is defined very differently” (p. 130). Pure scientists and scientist–practitioners are driven by publication standards imposed by journals and the requirements of their university tenure systems, whereas pure practitioners and practitioner–scientists must ultimately add and demonstrate value to organizations. Thus, although the personal motivations of I-O psychologists are numerous, the structural incentives are bimodal. Earnestly honoring both sides, as the scientist–practitioners and practitioner–scientists desire, often requires additional work for the same, or even less, reward. Many commentaries on the science–practice divide have been written from both sides; however, no amount of finger wagging in either direction will engender meaningful change. The gap between research and practice is the inevitable result of the disconnected incentive structures that scientists and practitioners face. Any changes that do not address this fundamental incongruity will continue to allow “anti I-O” practices to survive. For true change to occur, the incentive structure of the field must change.
Other Barriers to the Scientist–Practitioner Ideal
In addition to differing incentive systems, the scientist–practitioner gap is exacerbated by two characteristics of academia that we believe Rotolo et al. (Reference Rotolo, Church, Adler, Smither, Colquitt, Shull and Foster2018) underemphasize: (a) an overreliance on theory and theoretical contribution and (b) a mismatch between publication timelines and organizational interests. These factors often prevent scientists from doing the research on emerging and frontier topics that Rotolo et al. espouse.
Overreliance on Theory
Rotolo et al. stress the importance of theoretical alignment on emerging topics. Although this may be desirable in and of itself, academic overreliance on theory often prevents research on practical and emerging topics. The increasing valuation of “theoretical contribution” in organizational science is undeniable. For example, Cucina and Moriarty (Reference Cucina and Moriarty2015) showed how two of our top journals, Journal of Applied Psychology and Personnel Psychology, have transformed from practice-oriented to strongly theory-oriented outlets over the past few decades and in ways not seen in other prominent outlets (e.g., American Psychologist and Psychological Bulletin, as well as scientific gold standards such as Nature and Science). Hambrick (Reference Hambrick2007) speculated that this trend in management science stems from academic insecurity, which itself may have arisen from external criticism of the academic sophistication of business schools.
Theory is undeniably useful (e.g., for explaining, predicting, synthesizing, and preventing rash conclusions based on anomalous findings; Miller, Reference Miller2007), but the pendulum has, perhaps, swung too far toward deduction. Requiring that all published research be firmly grounded in existing theory and make a substantial theoretical contribution inhibits publication of important, frontier, and rigorous but atheoretical findings. Under such conditions, research on emerging topics and new technologies that practitioners want can be prohibitively risky, particularly for assistant professors seeking tenure. Allowing rigorous but atheoretical research into our established literature may allow a more rapid response to the needs of practitioners and provide the basis for the development of future theory.
Temporal Mismatch
Rotolo et al. (Reference Rotolo, Church, Adler, Smither, Colquitt, Shull and Foster2018) lament a difference in the speed of science and practice. Namely, academia progresses at a much slower rate than practice demands, and we echo this concern. It can take years for even simple research to go through theoretical conception, data collection, analysis, manuscript writing, and potentially multiple rounds of revisions. Often, reviewers try to remake papers in their preferred image, requiring increasingly extensive and time-consuming changes, which can result in three or more revisions at a single journal with no guarantee of acceptance. The review process itself can significantly slow the time it takes for research to reach publication. Other work has shown that publication time across science has been consistent over the last 30 years but that social science publication times lag behind the natural sciences (Powell, Reference Powell2016). This is particularly problematic in I-O psychology because those slower publication times are not improving, while at the same time the organizations we study are moving at an ever-faster rate. This pace of change exacerbates the science–practice gap as science falls ever further behind. In the time required to research and publish a study on a new topic, an organization may try multiple approaches to the same problem, discard the approaches that do not work, and keep the ones that do (or appear to). In addition, this slow publication system only leads to results in the aggregate. Thus, we may know that a procedure does not work well for organizations in general but have little evidence on how it works in specific organizations. This lack of contextualization further incentivizes practitioners to experiment independently.
We believe that caution is warranted as well: Science should be slower than "popular" advances to ensure rigor. Science strives for cumulative theory building rather than anecdotal evidence, and the former necessarily takes more time. However, although hastening research could hypothetically compromise quality, the current system is nowhere near that danger.
Recommendations
We recommend three key changes to the academic system that can help close the gap between science and practice. We limit our recommendations to academia for two reasons: (a) Incentives in academia are more homogeneous than those in industry and are, therefore, more conducive to broad prescriptions, and (b) we are not practitioners, so constructively criticizing their idiosyncratic systems could be irresponsible. However, we encourage practitioners to take a deeper look at their side of the divide and consider similar ways to restructure their incentives.
1. Reduce Our Reliance on Theory
Although it is perfectly natural for academics to be more interested in explaining phenomena and for practitioners to be more interested in predicting, research need not stay on the theoretical end of the spectrum. Not all studies need to be directly tied to a specific organizational concern; there is room for basic psychological research in our field. However, allowing academic insecurity to drive our research questions is unlikely to yield meaningful cumulation. Our recommendation aligns with those of Hambrick (Reference Hambrick2007) and Miller (Reference Miller2007). Journals should swing the pendulum back toward the center by rewarding practically significant studies with high probabilities of stimulating meaningful future research and loosening requirements for novel theoretical contribution. Any well-executed study that moves toward better understanding of consequential phenomena in organizations deserves a chance at publication, even without a significant theoretical contribution.
2. Improve the Speed of Our Pipelines to Production
Expediting publication of quality research will help academia meet practitioners' needs more effectively. Although the average time from acceptance to publication has fallen due to technology (Powell, Reference Powell2016), the transition from physical mail to electronic correspondence has not decreased review times. Journals should seek ways to apply the same technological advancements that have shortened the time from acceptance to publication to the review process itself.
We recommend that reviewers and editors return to the predominant role they played in earlier days of our science. That is, they should function primarily as gatekeepers of academic rigor, weeding out poor science and moving quality work toward publication as quickly and with as little tampering as possible. Subjective judgments of quality and contribution are unavoidable and often desirable, but remaking articles through drawn-out revision processes is often counterproductive. This is especially true when the primary purpose of an article is data analysis and presentation. In those cases, the primary judgment made by reviewers and editors should regard the rigor of the study design and appropriateness of the analyses. If the study satisfies those criteria and has either theoretical or practical utility, the article should be moved rapidly toward publication.
Finally, although busy schedules may hinder tight turnarounds both for authors and reviewers, limiting the scope of revisions is a reasonable compromise. We recommend that editors avoid multiple rounds of revisions, which can add additional months, or even years, to the review process. In particular, action editors should strive to make clear, swift decisions on a submission's potential contribution and simply reject submissions that require too much alteration. Powell (Reference Powell2016) describes a relatively new open access journal in biomedical science called eLife that has adopted a strategy of either reviewing submissions quickly or rejecting them. The strategy includes quick initial decisions, single rounds of revision whenever possible, and limited requests for additional analyses (i.e., two months maximum). The current impact factor for eLife is 7.725, which is not particularly high for biomedical sciences but sizeable by our standards.
Allowing any well-executed study into public knowledge quickly could have several positive effects on our science. For example, it would likely reduce the so-called "file-drawer" effect, whereby valid data collections never see publication, either because journals reject them or because they are never submitted. Expediting parts of the publication process without compromising on quality will allow for the greater and faster cumulation of knowledge in our field. It will also allow academia to more nimbly respond to the needs of real organizations by providing them with sound science in a relatively timely manner.
3. Change Incentive Structures to Reward a Wider Range of Work
Academic incentives must change to encourage the cutting-edge research that Rotolo et al. (Reference Rotolo, Church, Adler, Smither, Colquitt, Shull and Foster2018) say is lacking. A recent symposium published in Perspectives on Psychological Science on measuring merit in academia speaks directly to this point. Specifically, Lubart and Mouchiroud (Reference Lubart and Mouchiroud2017) propose that psychology more generally should move beyond the common measures of merit, such as the popular h-index (Hirsch, Reference Hirsch2005), to include measures of transmission, originality, usefulness, and generativity. For example, they view usefulness as developing "valuable new tools or practices in an applied setting" (Lubart & Mouchiroud, Reference Lubart and Mouchiroud2017, p. 1160). Their examples include publishing in practitioner magazines, recommendation reports for practitioners, citations in the media, and invited talks at practitioner conferences. Under our field's usual incentive structure, these outlets are not explicitly encouraged or rewarded.
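For readers unfamiliar with the h-index mentioned above, a minimal sketch of its computation follows. The definition is Hirsch's (2005): a scholar has index h if h of their papers have at least h citations each. The function name and example citation counts are illustrative, not drawn from the commentary.

```python
def h_index(citations):
    """Compute the h-index (Hirsch, 2005): the largest h such that
    the author has h papers with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break  # sorted descending, so no later paper can qualify
    return h

# Hypothetical scholar with five papers:
print(h_index([10, 8, 5, 4, 3]))  # 4 papers have at least 4 citations each
```

The example illustrates one of Lubart and Mouchiroud's points: the metric counts only citations, so practitioner-facing contributions such as tools, reports, or talks contribute nothing to it.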
Beyond individual metrics of success, expanding and altering the calculation of journal impact factors could facilitate our first two recommendations. Right now, journals are only incentivized to publish articles with high probabilities of citation. This exacerbates the file-drawer effect by limiting the range of article types. We cannot expect meaningful changes in publication without changes to macro incentives as well.
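To make the macro-incentive point concrete, here is a sketch of the standard two-year journal impact factor calculation (citations received this year to items from the prior two years, divided by the number of citable items in those years). The figures are hypothetical; this is an illustration of why journals favor highly citable article types, not data from any real journal.

```python
def impact_factor(citations_this_year, citable_items_prior_two_years):
    """Two-year journal impact factor: citations received in the current
    year to items published in the previous two years, divided by the
    number of citable items published in those two years."""
    return citations_this_year / citable_items_prior_two_years

# Hypothetical journal: 500 citations in the current year to articles
# from the previous two years, which contained 80 citable items.
print(impact_factor(500, 80))  # 6.25
```

Because every published article enters the denominator, an editor maximizing this ratio has a direct incentive to reject sound but low-citation work, which is precisely the dynamic that feeds the file-drawer effect described above.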
Expanding how our field defines meritorious contributions is the only way to bridge the science–practice gap in a meaningful way. We are under no illusion that this represents an easy change to make. There are overriding pressures on I-O psychology departments (and our partners in management, HR, and other related fields) from the larger university systems within which they exist. However, as evidenced by the broader discussion regarding merit in the psychological sciences, growing awareness may help bring positive change to the field of which I-O psychology is a part (e.g., Lubart & Mouchiroud, Reference Lubart and Mouchiroud2017). Although it will be difficult, this systemic change could free scientist–practitioners to move back toward the center of the science–practice continuum and do the kind of work that would help our field bridge the gap (e.g., designing practical tools and translating science in practitioner-friendly outlets).
Conclusion
As technology accelerates and forces organizations to adapt, the gap between science and practice will likely widen without systemic changes and allow “anti I-O” practices to proliferate. We are cautious not to paint a picture of gloom for the future of our field, as we do believe that our science can address the problems at hand and remain relevant for organizations far into the future. However, remaining relevant requires that we, as a field, be willing to take a realistic and hard look at ourselves and be proactive about making the changes that such self-reflection suggests. The time to start making those changes is now. We hope that our perspective and recommendations can contribute to a much larger conversation about practical ways to ensure a bright future for I-O psychology.
We commend Rotolo et al. (Reference Rotolo, Church, Adler, Smither, Colquitt, Shull and Foster2018) for introducing a new lens for viewing the well-known gap between industrial and organizational (I-O) psychology research and human resource (HR) practices in organizations. However, Rotolo et al.’s characterization of practitioner behavior as “anti I-O” suggests a particularly negative view of scientific research among some HR practitioners. The label implies that some HR practitioners are intentionally ignoring or actively resisting academic research. More likely, the behavior stems from a passive indifference to academia, which may be the appropriate attitude for some practitioners to adopt when a great deal of academic research is too slow, too theoretical, and too cryptically communicated to be useful in applied settings. We agree with Rotolo et al. when they say, “we are a discipline that is not geared for being cutting edge” (p. 182), and we appreciate their recommendations for addressing this lack of relevance. However, most recommendations in this broader discussion do not address the foundational problem within our field: a systemic mismatch between the incentives of practitioners and academics. To support this point, we briefly describe a typology of I-O psychologists as well as the varying contexts and incentives that drive their behavior. We then close with our own recommendations for how academia can improve its relevance to practitioners and close the gap. These changes are not easy, but we agree with Rotolo and colleagues that if any field can address such foundational problems, it is ours.
A Typology of Organizational Psychologists
Rotolo et al. (Reference Rotolo, Church, Adler, Smither, Colquitt, Shull and Foster2018) characterize organizational psychologists as “scientist–practitioners.” This popular term represents a specific location on a research–practice continuum at which attention to research and application are carefully balanced. This may represent a macro property of our field, but, in our experience, it does not adequately capture the diversity of our field at the micro level (i.e., the specific contexts in which individual researchers and practitioners operate). Instead, expanding upon a typology briefly described by Rupp and Beal (Reference Rupp and Beal2007), we see at least four distinct patterns in our I-O psychology colleagues: pure scientists, scientist–practitioners, practitioner–scientists, and pure practitioners.
The Pure Scientist
Pure scientists in I-O psychology are traditional academics who want to understand and explain the causes of human behavior within the workplace and organizations. They are incentivized primarily to publish in prestigious journals, earn tenure, and obtain grants to support their programs of research. They likely serve as editors or associate editors for one or more research-oriented journals. To these individuals, advancing theory and pursuing “knowledge for knowledge's sake” are legitimate endeavors. Typically, they are not focused on, nor rewarded for, researching topics addressing emerging HR trends, developing tools useful for practitioners, or expounding on the practical implications of their research.
The Scientist–Practitioner
Scientist–practitioners are also predominantly, though not exclusively, found in universities. They are researchers, first and foremost, but they care deeply about having real-world impact and often partner with HR professionals and/or I-O psychology colleagues in organizations to develop and implement evidence-based HR products and procedures. These individuals can still build successful academic careers through high-quality publications on practically oriented topics. However, because they are employed in universities, they are subject to the same incentive system as pure scientists, and those requirements can create tension with their applied goals. Scientist–practitioners located in applied research institutions typically conduct practice-oriented research, and although they may periodically publish in academic journals, they are not constrained by the same demands as university academics.
The Practitioner–Scientist
Practitioner–scientists typically live in large for-profit organizations, but they can also be found in large external consulting firms, large nonprofit organizations, and a variety of state and federal government roles. They primarily focus on adding value to organizations by advising on human resource strategies and implementing research-based practices. However, they are personally motivated to contribute expertise to the collective knowledge pool through publications, either in academic journals and books. Additionally, they may collaborate with academics on research projects. They tend to invest extra-role efforts to maintain familiarity with current research and strive to implement best practices consistent with research findings.
The Pure Practitioner
Pure practitioners are usually employed by the same organizations as practitioner–scientists. They are driven almost exclusively by organizational needs. They are either uninterested in or too constrained by time and resources to read academic research, but they may learn from practitioner-focused outlets such as Harvard Business Review. These individuals may feel that pure science is disconnected from the realities they face every day in real organizations. They may focus more on trending topics and on signaling their value to organizational leaders than on the evidentiary support of their work. They tend to rely on a network of vendors to provide best practices rather than recent research findings.
Systemic Pressures
As Garman (Reference Garman2011) notes regarding academia and practice, “within the two contexts, success is defined very differently” (p. 130). Pure scientists and scientist–practitioners are driven by publication standards imposed by journals and the requirements of their university tenure systems, whereas pure practitioners and practitioner–scientists must ultimately add and demonstrate value to organizations. Thus, although the personal motivations of I-O psychologists are numerous, the structural incentives are bimodal. Earnestly honoring both sides, as the scientist–practitioners and practitioner–scientists desire, often requires additional work for the same, or even less, reward. Many commentaries on the science–practice divide have been written from both sides; however, no amount of finger wagging in either direction will engender meaningful change. The gap between research and practice is the inevitable result of the disconnected incentive structures that scientists and practitioners face. Any changes that do not address this fundamental incongruity will continue to allow “anti I-O” practices to survive. For true change to occur, the incentive structure of the field must change.
Other Barriers to the Scientist–Practitioner Ideal
In addition to differing incentive systems, the scientist–practitioner gap is exacerbated by two characteristics of academia that we believe Rotolo et al. (Reference Rotolo, Church, Adler, Smither, Colquitt, Shull and Foster2018) underemphasize: (a) an overreliance on theory and theoretical contribution and (b) a mismatch between publication timelines and organizational interests. These factors often prevent scientists from doing the research on emerging and frontier topics that Rotolo et al. espouse.
Overreliance on Theory
Rotolo et al. stress the importance of theoretical alignment on emerging topics. Although this may be desirable in and of itself, academic overreliance on theory often prevents research on practical and emerging topics. The increasing valuation of “theoretical contribution” in organizational science is undeniable. For example, Cucina and Moriarty (Reference Cucina and Moriarty2015) showed how two of our top journals, Journal of Applied Psychology and Personnel Psychology, have transformed from practice-oriented to strongly theory-oriented outlets over the past few decades and in ways not seen in other prominent outlets (e.g., American Psychologist and Psychological Bulletin, as well as scientific gold standards such as Nature and Science). Hambrick (Reference Hambrick2007) speculated that this trend in management science stems from academic insecurity, which itself may have arisen from external criticism of the academic sophistication of business schools.
Theory is undeniably useful (e.g., for explaining, predicting, synthesizing, and preventing rash conclusions based on anomalous findings; Miller, Reference Miller2007) but the pendulum has, perhaps, swung too far toward deduction. Requiring that all published research be firmly grounded in existing theory and make a substantial theoretical contribution inhibits publication of important, frontier, and rigorous but atheoretical findings. Under such conditions, research on emerging topics and new technologies that practitioners want can be prohibitively risky, particularly for assistant professors seeking tenure. Allowing rigorous but atheoretical research into our established literature may allow a more rapid response to the needs of practitioners and provide the basis for the development of future theory.
Temporal Mismatch
Rotolo et al. (Reference Rotolo, Church, Adler, Smither, Colquitt, Shull and Foster2018) lament a difference in the speed of science and practice. Namely, academia progresses at a much slower rate than practice demands, and we echo this concern. It can take years for even simple research to go through theoretical conception, data collection, analysis, manuscript writing, and potentially multiple rounds of revisions. Often, reviewers try to remake papers in their preferred image, requiring increasingly extensive and time-consuming changes, which can result in three or more revisions at a single journal with no guarantee of acceptance. The review process itself can significantly slow the time it takes for research to reach publication. Other work has shown that publication time across science has been consistent over the last 30 years but that social science publication times lag behind the natural sciences (Powell, Reference Powell2016). This is particularly problematic in I-O psychology because those slower publication times are not improving, while at the same time the organizations we study are moving at an ever-faster rate. This pace of change exacerbates the science–practice gap as science falls ever further behind. In the time required to research and publish a study on a new topic, an organization may try multiple approaches to the same problem, discard the approaches that do not work, and keep the ones that do (or appear to). In addition, this slow publication system only leads to results in the aggregate. Thus, we may know that a procedure does not work well for organizations in general but have little evidence on how it works in specific organizations. This lack of contextualization further incentivizes practitioners to experiment independently.
We believe that caution is warranted as well: Science should be slower than “popular” advances to ensure rigor. Science strives for cumulative theory building rather than anecdotal evidence, and the former necessarily takes more time. However, whereas hastening research could hypothetically compromise quality, the current system is far from this danger.
Recommendations
We recommend three key changes to the academic system that can help close the gap between science and practice. We limit our recommendations to academia for two reasons: (a) Incentives in academia are more homogenous than those in industry and are, therefore, more conducive to broad prescriptions, and (b) we are not practitioners, so constructively criticizing their idiosyncratic systems could be irresponsible. However, we encourage practitioners to take a deeper look at their side of the divide and consider similar ways to restructure their incentives.
1. Reduce Our Reliance on Theory
Although it is perfectly natural for academics to be more interested in explaining phenomena and for practitioners to be more interested in predicting, research need not stay on the theoretical end of the spectrum. Not all studies need to be directly tied to a specific organizational concern; there is room for basic psychological research in our field. However, allowing academic insecurity to drive our research questions is unlikely to yield meaningful cumulation. Our recommendation aligns with those of Hambrick (Reference Hambrick2007) and Miller (Reference Miller2007). Journals should swing the pendulum back toward the center by rewarding practically significant studies with high probabilities of stimulating meaningful future research and loosening requirements for novel theoretical contribution. Any well-executed study that moves toward better understanding of consequential phenomena in organizations deserves a chance at publication, even without a significant theoretical contribution.
2. Improve the Speed of Our Pipelines to Production
Expediting publication of quality research will help academia meet practitioners’ needs more effectively. Although technology has reduced the average time from acceptance to publication (Powell, 2016), the transition from physical mail to electronic correspondence has not shortened review times. Journals should explore how the same technological advances that sped up production could also streamline the review process itself.
We recommend that reviewers and editors return to the predominant role they played in earlier days of our science. That is, they should function primarily as gatekeepers of academic rigor, weeding out poor science and moving quality work toward publication as quickly and with as little tampering as possible. Subjective judgments of quality and contribution are unavoidable and often desirable, but remaking articles through drawn-out revision processes is often counterproductive. This is especially true when the primary purpose of an article is data analysis and presentation. In those cases, the primary judgment made by reviewers and editors should concern the rigor of the study design and the appropriateness of the analyses. If the study satisfies those criteria and has either theoretical or practical utility, it should be moved rapidly toward publication.
Finally, although busy schedules may hinder tight turnarounds for both authors and reviewers, limiting the scope of revisions is a reasonable compromise. We recommend that editors avoid multiple rounds of revision, which can add months, or even years, to the review process. In particular, action editors should strive to make clear, swift decisions about a submission's potential contribution and simply reject submissions that would require too much alteration. Powell (2016) describes a relatively new open-access journal in biomedical science, eLife, that has adopted a strategy of either reviewing submissions quickly or rejecting them. The strategy includes quick initial decisions, single rounds of revision whenever possible, and limited requests for additional analyses (i.e., two months maximum). The current impact factor for eLife is 7.725, which is not particularly high for the biomedical sciences but sizeable by our standards.
Allowing any well-executed study into public knowledge quickly could have several positive effects on our science. For example, it would likely reduce the so-called “file-drawer” effect, in which valid data collections never see publication, either because journals reject them or because they are never submitted. Expediting parts of the publication process without compromising quality will allow for greater and faster cumulation of knowledge in our field. It will also allow academia to respond more nimbly to the needs of real organizations by providing them with sound science in a relatively timely manner.
3. Change Incentive Structures to Reward a Wider Range of Work
Academic incentives must change to encourage the cutting-edge research that Rotolo et al. (2018) say is lacking. A recent symposium on measuring merit in academia, published in Perspectives on Psychological Science, speaks directly to this point. Specifically, Lubart and Mouchiroud (2017) propose that psychology more generally should move beyond the common measures of merit, such as the popular h-index (Hirsch, 2005), to include measures of transmission, originality, usefulness, and generativity. For example, they define usefulness as developing “valuable new tools or practices in an applied setting” (Lubart & Mouchiroud, 2017, p. 1160). Their examples include publishing in practitioner magazines, writing recommendation reports for practitioners, being cited in the media, and giving invited talks at practitioner conferences. Under our field's usual incentive structure, these outlets are neither explicitly encouraged nor rewarded.
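For readers unfamiliar with the h-index mentioned above, Hirsch's (2005) definition is simple: an author's h-index is the largest h such that h of their papers have at least h citations each. A minimal sketch of that computation (the function name and sample citation counts are our own illustrative choices, not drawn from any of the cited works):

```python
def h_index(citation_counts):
    """Return the largest h such that the author has h papers
    with at least h citations each (Hirsch, 2005)."""
    # Rank papers from most- to least-cited.
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # the rank-th paper still has >= rank citations
            h = rank
        else:
            break
    return h

# e.g., five papers cited [10, 8, 5, 4, 3] times yield an h-index of 4
```

Because the metric collapses an entire career into one citation-based number, it captures none of the dimensions Lubart and Mouchiroud highlight, such as usefulness to practitioners, which is precisely their point.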
Beyond individual metrics of success, expanding and altering the calculation of journal impact factors could facilitate our first two recommendations. As it stands, journals are incentivized to publish only articles with high probabilities of citation, which exacerbates the file-drawer effect by limiting the range of article types. We cannot expect meaningful changes in publication practices without changes to these macro incentives as well.
Expanding how our field defines meritorious contributions is the only way to bridge the science–practice gap in a meaningful way. We are under no illusion that this represents an easy change to make. There are overriding pressures on I-O psychology departments (and our partners in management, HR, and other related fields) from the larger university systems within which they exist. However, as evidenced by the broader discussion regarding merit in the psychological sciences, growing awareness may help bring positive change to the field of which I-O psychology is a part (e.g., Lubart & Mouchiroud, 2017). Although it will be difficult, this systemic change could free scientist–practitioners to move back toward the center of the science–practice continuum and do the kind of work that would help our field bridge the gap (e.g., designing practical tools and translating science in practitioner-friendly outlets).
Conclusion
As technology accelerates and forces organizations to adapt, the gap between science and practice will likely widen without systemic change, allowing “anti I-O” practices to proliferate. We are careful not to paint a picture of gloom for the future of our field; we do believe that our science can address the problems at hand and remain relevant to organizations far into the future. However, remaining relevant requires that we, as a field, take a realistic, hard look at ourselves and be proactive about making the changes that such self-reflection suggests. The time to start making those changes is now. We hope that our perspective and recommendations contribute to a much larger conversation about practical ways to ensure a bright future for I-O psychology.