
What Curbs Frontiers Research? A Reaction to Rotolo et al.'s Article

Published online by Cambridge University Press:  19 June 2018

Edna Rabenu
Affiliation:
Schools of Behavioral Studies and Business Administration, Netanya Academic College
Aharon Tziner*
Affiliation:
Schools of Behavioral Studies and Business Administration, Netanya Academic College
Correspondence concerning this article should be addressed to Aharon Tziner, Schools of Behavioral Studies and Business Administration, Netanya Academic College, 1, University St., Netanya 42365, Israel. E-mail: [email protected]


Type: Commentary
Copyright © Society for Industrial and Organizational Psychology 2018

Rotolo et al. (2018) decry the rise in the use of trendy, simplistic human resource management (HRM) procedures and practices, such as talent management, adopted without regard to any solid scientific basis culled from relevant disciplines such as industrial and organizational (I-O) psychology. Furthermore, they observe a propagating spirit of anti-I-O psychology that has recently emerged and that should provoke our concern. What has ignited and fueled this reality? As Rotolo et al. correctly note, I-O psychology academics have, over the years, lost touch with the actual, preoccupying needs of managers in organizations. Instead of promoting novel fields of exploration and devising innovative tools and procedures, I-O scientists overly invest their time, energy, and ingenuity in methodological minutiae and theorizing.

In tandem with Rotolo et al. (2018), we believe that only by means of what has been labeled “frontiers research”—research that addresses the reality in the field—can we reduce the gap between the academic arena and the field. Hence, the obvious question: Why are there so few (10%) frontiers studies? In our humble opinion, the key reason is the publication policy of the editors of leading I-O psychology journals, which lets “conservative” articles cross the publication threshold.

Indeed, not every academic has the cognitive ability, creativity, and broadmindedness to think outside the box to get at the truth. As in any other field, some I-O psychology researchers are more talented and others less so. However, we are convinced that far more than 10% of them can produce frontiers research. Why, then, don't they? It would seem the reason is that academic promotion procedures do not call for it; frontiers research is not a criterion for the advancement of researchers in the field of I-O psychology. On the contrary, scholars’ contributions to science, and consequently their promotions, seemingly are determined by the number of articles they have published in high-impact (A+) journals. The articles’ theoretical value and their contribution to solving problems in the field, apparently, are not a significant consideration.

We all have achievement needs to some degree (McClelland, 1965). To fulfill these needs, many academics elect to take the rational path of conducting studies that are very likely to be published. For example, among other tactics, they often choose “trendy” issues or “safe” topics that are well established in the literature; use students as their research population, which facilitates data collection; and employ statistical manipulations, such as removing outliers, to substantiate their findings. Thus, the extant method of assessing research articles contributes considerably to scholars’ “impotence,” as it seems to require no cutting-edge, bold, or groundbreaking thinking.

We thus advocate the following:

  1. Research rooted in observation of behavior in organizations and an attempt to explain the phenomena before researchers rush to conceptualize them

  2. Research conducted in its natural context, despite the inherent difficulties of controlling variables

  3. Deep integration of the accumulated knowledge, which adds a significant contribution of its own

It is much more difficult to measure quality than quantity. Indeed, how to assess researchers’ contributions in terms of the quality of their articles represents a significant challenge for the scientific community. We believe that to promote frontier-type research, the following principles of quality assessment should be employed:

First, we might ask: To what degree did the research significantly help solve actual, “hot” problems associated with organizational HR? The more that pressing issues in the field (the “frontier”) are addressed—rather than merely cited from other publications—the more significant and groundbreaking the research contribution will be. Moreover, we assert that, as part of the academic promotion process, assessors would be well advised to review the academic CVs (or other materials) that attest to the extent to which candidates for promotion conducted their research in the field to solve real issues.

Furthermore, we suggest not relying solely on academics to review and evaluate articles submitted for publication; rather, the evaluation process should also include academically well-educated field practitioners (such as members of the Society for Industrial and Organizational Psychology [SIOP] community, HR experts, and senior executives) who can contribute their cumulative applied wisdom. Indeed, practitioners might well be expected to help decide how interesting, relevant, and innovative a submitted research article is from both theoretical and pragmatic perspectives. Specifically, involving practitioners in the evaluation process will contribute to the following:

  • Validation of relevance. Practitioners can testify to the potential contribution of the research article to the field. The point to be stressed is that studies produced only for the sake of publication, lacking a significant statement, belief, or assertion and/or a positive contribution to the field, will not be considered further for publication. We do not, however, propose to put a damper on basic research, but rather to use the joint judgment of researchers and practitioners that phenomena are not only intriguing to study but also might have real-world applications. Combining different sources of judgment holds good prospects for advancing an impactful organizational psychology (see Footnote 1).

  • Swifter dissemination of scientific knowledge. Academic knowledge will move more swiftly from the academic ivory tower to the field, in acknowledgment of Grand et al.’s (2017) call for “recognition that science is a public good and thus should be readily available for the benefit of everyone” (p. 15).

  • Bridge between academia and the field. Exposing academic activity to the world of practitioners may be expected to produce more of the much-needed communication and cooperation between academia and the field and thereby jumpstart research. Today, many research studies are based on questionnaires filled out by students, partly for convenience but, no less, because of the difficulty of harnessing the field for research. Exposing practitioners to research will encourage them to promote collaboration between their organizations and the academic world in such undertakings as surveys and quasi-experimentation. One example of this kind of collaboration is the Hawthorne plant, which, in conjunction with Professor Elton Mayo of Harvard University, completely changed its management approach toward its workers from a mechanistic to a humanistic mode.

  • Access to the practitioners’ prism. If we examine articles written by individuals from the field concerning the assessment of organizational functioning (such as Buckingham & Goodall, 2015; Goler, 2015; Goler, Gale, & Grant, 2016; Rowland, 2016), we note that some of these practitioners clearly grasped the core difficulties and formulated solutions that were no less constructive than those provided by academics. The practitioners’ prism is very important—if not critical—for the advancement of science.

  • Feedback for the judges. Pertinently, reviewers from the field (i.e., from tangible work organizations) will also be able to receive feedback on the quality of their evaluations: The more frequently field practitioners’ assessments promote the acceptance of submitted articles for publication—and the articles are, in fact, ultimately published—the more likely these field representatives will be to attain “compensation” in the form of prestige (if not in some more material manner, such as a free subscription to the journals in question).

  • A learning tool for researchers. In this way, researchers, as authors of journal articles, will also be able to observe their learning curve. Researchers whose work is rejected time and again due to low innovation or lack of public interest will either be inclined to stop submitting articles (and consequently their numbers will decrease) or will significantly upgrade the quality of their research and its presentation.

In summary, the current process of research article evaluation curbs frontier research and encourages conservatism and marginal progress; it is, however, a sure way to foster the publication of conventional and conformist research articles. The many articles thus published may well guarantee numerous citations and subsequent promotion, but they ultimately widen the chasm between the academic arena and the field, largely because organizations’ needs (i.e., the dilemmas that concern them) are not met. Only by changing the evaluation process of research articles and including the wisdom of practitioners in the process (namely, the “wisdom of the crowd”) will science return to its significant role of charting the path for the field. If not, academia will be left behind and lose its standing as the shining beacon of the quest for truth.

Footnotes

1 This paragraph was added in response to a reviewer's comment, for which we are grateful.

References

Buckingham, M., & Goodall, A. (2015). Reinventing performance management. Harvard Business Review, 93(4), 40–50.
Goler, L. (2015). What Facebook knows about engaging millennial employees. Harvard Business Review, 93(December). Retrieved from https://hbr.org/2015/12/what-facebook-knows-about-engaging-millennial-employees
Goler, L., Gale, J., & Grant, A. (2016). Let's not kill performance evaluations yet. Harvard Business Review, 94(11), 90–94.
Grand, J. A., Rogelberg, S. G., Allen, T. D., Landis, R. S., Reynolds, D. H., Scott, J. C., . . . Truxillo, D. M. (2017). A systems-based approach to fostering robust science in industrial-organizational psychology. Industrial and Organizational Psychology: Perspectives on Science and Practice, 11(1), 4–42.
McClelland, D. C. (1965). Toward a theory of motive acquisition. American Psychologist, 20(5), 321–333.
Rotolo, C. T., Church, A. H., Adler, S., Smither, J. W., Colquitt, A., Shull, A. C., . . . Foster, G. (2018). Putting an end to bad talent management: A call to action for the field of I-O psychology. Industrial and Organizational Psychology: Perspectives on Science and Practice, 11(2), 176–219.
Rowland, D. (2016, October 14). Why leadership development isn't developing leaders. Harvard Business Review. Retrieved from https://hbr.org/2016/10/why-leadership-development-isnt-developing-leaders?referral=03758&cm_vc=rr_item_page.top_right