
Survey Key Driver Analysis: Are We Driving Down the Right Road?

Published online by Cambridge University Press: 17 April 2017

Jeffrey M. Cucina,* U.S. Customs and Border Protection, Washington, DC
Philip T. Walmsley, U.S. Customs and Border Protection, Washington, DC
Ilene F. Gast, U.S. Customs and Border Protection, Washington, DC
Nicholas R. Martin, Aon Consulting, Washington, DC
Patrick Curtin, National Science Foundation, Arlington, Virginia

*Correspondence concerning this article should be addressed to Jeffrey M. Cucina, 1400 L Street, NW, 7th Floor, Washington, DC 20229-1145. E-mail: [email protected]

Abstract

One of the typical roles of industrial–organizational (I-O) psychologists working as practitioners is administering employee surveys that measure job satisfaction/engagement. Traditionally, this work has involved developing (or choosing) the items for the survey, administering the items to employees, analyzing the data, and providing stakeholders with summary results (e.g., percentages of positive responses, item means). In recent years, I-O psychologists have moved into uncharted territory through the use of survey key driver analysis (SKDA), which aims to identify the most critical items in a survey for action-planning purposes. Typically, this analysis involves correlating (or regressing) a self-report criterion item (e.g., "Considering everything, how satisfied are you with your job?") with (or on) each of the remaining survey items in an attempt to identify which items are "driving" job satisfaction/engagement. It is also possible to use an index score (i.e., a scale score formed from several items) as the criterion instead of a single item. The fact that the criterion measure (whether a single item or an index) is internal to the survey from which the predictors are drawn distinguishes this practice from linkage research. This methodology is not widely covered in survey methodology coursework, and there are few peer-reviewed articles on it. Yet a number of practitioners are marketing this service to their clients. In this focal article, a group of practitioners with extensive applied survey research experience uncovers several methodological issues with SKDA, using data from a large multiorganizational survey to back up claims about these issues. One issue is that SKDA ignores the psychometric reality that item standard deviations affect which items are chosen as drivers. Another is that the analysis ignores the factor structure of survey item responses. Furthermore, conducting the analysis anew each time a survey is administered conflicts with evidence for the lack of situational and temporal specificity in validities. Additionally, it is problematic to infer causal relationships from the correlational data seen in most surveys. Most surprisingly, randomly choosing items out of a hat yields validities similar to those obtained from conducting the analysis. Thus, we recommend that survey providers stop conducting SKDA until they can produce science that backs up this practice. These issues, in concert with the scarcity of literature examining the practice, make a rigorous evaluation of SKDA a timely inquiry.
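To make the procedure concrete, the sketch below implements the basic form of SKDA described in the abstract: correlate each survey item with an overall-satisfaction criterion item, rank the items by correlation, and then compare the predictive value of the top-ranked "drivers" against items drawn at random. This is a minimal illustration in Python using simulated Likert-type data; the item names, the data, and the three-item comparison are hypothetical and are not drawn from the article's dataset.

import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 500  # simulated respondents

# Hypothetical 5-point Likert items; "overall_sat" stands in for the
# criterion item ("Considering everything, how satisfied are you with
# your job?"), simulated here as a rounded average of the other items.
items = ["pay", "supervisor", "training", "communication", "workload"]
data = pd.DataFrame({item: rng.integers(1, 6, n) for item in items})
data["overall_sat"] = data[items].mean(axis=1).round().clip(1, 5)

# Step 1: correlate each candidate item with the criterion and rank.
# The top-ranked items would be reported as the survey's "key drivers."
drivers = data[items].corrwith(data["overall_sat"]).sort_values(ascending=False)
print(drivers)

# Step 2: compare the multiple correlation (R) of the top-ranked items
# against items chosen at random "out of a hat," as the article does.
def multiple_r(df, predictors, criterion):
    # Ordinary least squares with an intercept; R is the correlation
    # between the fitted values and the observed criterion scores.
    X = np.column_stack([df[predictors].to_numpy(), np.ones(len(df))])
    y = df[criterion].to_numpy()
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.corrcoef(X @ beta, y)[0, 1]

top3 = list(drivers.index[:3])
random3 = list(rng.choice(items, size=3, replace=False))
print("R for top-3 'drivers':", multiple_r(data, top3, "overall_sat"))
print("R for 3 random items:", multiple_r(data, random3, "overall_sat"))

With this toy data the two multiple correlations come out close by construction, since every item contributes equally to the simulated criterion; the article's empirical claim is that much the same thing happens with real survey data, which is part of its case against SKDA.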

Type: Focal Article
Copyright: © Society for Industrial and Organizational Psychology 2017


Footnotes

Ilene F. Gast is now retired.

The views expressed in this article are those of the authors and do not necessarily reflect the views of U.S. Customs and Border Protection, the National Science Foundation, or the U.S. federal government. Portions of this article were presented at the 2011 and 2012 meetings of the Society for Industrial and Organizational Psychology and the January 2013 meeting of the U.S. Office of Personnel Management's FedPsych Forum.
