
Strategic responses to anti-DEI legislation: The promise of culturally responsive assessments

Published online by Cambridge University Press:  19 November 2024

Emily Gallegos*, Department of Psychology, University of Texas at Arlington, Arlington, USA
Katrisha M. Smith, Department of Psychology, University of Texas at Arlington, Arlington, USA
Juveria Syed, Department of Psychology, University of Texas at Arlington, Arlington, USA
Ricardo R. Brooks, Department of Psychology, Pennsylvania State University, University Park, USA
Michelle P. Martín-Raugh, Department of Psychology, University of Texas at Arlington, Arlington, USA

*Corresponding author: Emily Gallegos; Email: [email protected]

Copyright: © The Author(s), 2024. Published by Cambridge University Press on behalf of the Society for Industrial and Organizational Psychology

Follmer et al. (2024) highlight the negative consequences of dismantling diversity, equity, and inclusion (DEI) policies and practices, including reduced employee preparedness for diversity-related issues, diminished organizational effectiveness, and a decline in the quality of employee selection processes. Across education and employment contexts, research indicates that traditional assessments can be biased (Leong et al., 2019; Pennock-Román, 1993). This bias raises barriers to career opportunities and perpetuates organizational inequities, such as hiring and staffing discrimination stemming from biased evaluations of applicants from historically marginalized groups (Hardy et al., 2022). The authors propose strategies for supporting DEI efforts despite legal challenges and urge industrial and organizational psychologists to advocate for continued investment in DEI programs through research and practice. Building on their call to action, we argue for incorporating culturally responsive assessment (CRA) across the human resources (HR) lifecycle to counteract anti-DEI legislation effectively. CRA is a relatively new assessment design approach that is gaining traction among educational scholars (Montenegro & Jankowski, 2017; Walker et al., 2023) but has yet to receive much attention from organizational researchers. The goal of CRA is to integrate test takers’ diverse cultural backgrounds into assessment content, format, or scoring, enhancing the assessment’s validity and equity for test takers (Walker et al., 2023). Thus, we recommend CRAs as a method to counteract the negative outcomes associated with anti-DEI legislation.

Within the HR lifecycle, employee selection and assessment processes may be particularly susceptible to the implications of anti-DEI legislation. Traditional methods for addressing group differences in assessment scores, such as differential item functioning or equating analyses, handle bias by adjusting item-level and scoring data after an assessment is administered. However, these methods do not address the content of assessments to reduce bias from the outset. Moreover, they require demographic data collection, which anti-DEI efforts seek to prevent. Alternatively, CRAs address inequity and bias during assessment development rather than postadministration. Key features of CRAs include considering the diverse groups served, using inclusive language for intersecting identities, taking test-taker differences into account during planning, and employing suitable assessment tools and methodologies (Montenegro & Jankowski, 2017). Therefore, CRAs should, in theory, make the assessment experience feel inclusive and fair for test takers (Bennett, 2023), an advantage not provided by statistical adjustments.
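To make this data requirement concrete, the sketch below (in Python, using simulated data; the group labels, score range, and DIF effect size are illustrative assumptions rather than findings) implements a simplified Mantel-Haenszel check for differential item functioning. The takeaway is that every test taker must carry a demographic group label before such a post hoc analysis can even be run, which is precisely the kind of demographic data collection that anti-DEI measures restrict and that CRAs aim to make less necessary by building equity into the assessment itself.

```python
# Minimal sketch of a Mantel-Haenszel DIF check on simulated data.
# All values (group labels, score range, DIF effect size) are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 2000

df = pd.DataFrame({
    "group": rng.choice(["reference", "focal"], size=n),  # demographic label is required
    "total_score": rng.integers(0, 21, size=n),           # matching criterion (0-20)
})

# Simulate one item whose difficulty depends on ability plus a small advantage
# for the reference group (i.e., the item exhibits DIF).
ability = (df["total_score"] - 10) / 3
dif_shift = 0.4 * (df["group"] == "reference")
p_correct = 1 / (1 + np.exp(-(ability + dif_shift)))
df["item_correct"] = rng.binomial(1, p_correct)

# Mantel-Haenszel common odds ratio, pooled over total-score strata.
num = den = 0.0
for _, s in df.groupby("total_score"):
    a = ((s["group"] == "reference") & (s["item_correct"] == 1)).sum()  # reference correct
    b = ((s["group"] == "reference") & (s["item_correct"] == 0)).sum()  # reference incorrect
    c = ((s["group"] == "focal") & (s["item_correct"] == 1)).sum()      # focal correct
    d = ((s["group"] == "focal") & (s["item_correct"] == 0)).sum()      # focal incorrect
    t = len(s)
    num += a * d / t
    den += b * c / t

alpha_mh = num / den if den else float("nan")  # > 1 means the item favors the reference group
delta_mh = -2.35 * np.log(alpha_mh)            # ETS delta metric; |delta| >= 1 is typically reviewed
print(f"MH odds ratio = {alpha_mh:.2f}, ETS delta = {delta_mh:.2f}")
```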

Consider the development of a teacher licensure assessment as an illustrative example of applying CRA strategies. First, test developers can gather insights from diverse stakeholders, including teachers, students, parents, and administrators. Second, surveys can broaden the range of professional viewpoints reflected in assessment content by drawing on respondents across multiple locations. Third, oversampling individuals from underrepresented identities can capture their perspectives (Martín-Raugh et al., 2016). Fourth, virtual cognitive interviews with underrepresented groups can highlight cultural differences in item interpretation, improving assessment accuracy (Zucker et al., 2004). Finally, findings from these methods can inform adjustments that align assessments with intended constructs and address biases (Kūkea Shultz et al., 2019). Incorporating CRA into various stages of the HR lifecycle (e.g., job analysis, selection, training, assessment, and performance appraisal) can support DEI values and initiatives. Below, we illustrate the potential benefits of CRA strategies by highlighting improvements in inclusiveness and representation, mitigation of biases, and potential enhancement of the validity of assessment scores.
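As a rough illustration of the oversampling step described above, the sketch below (a hypothetical sampling frame, group labels, and quota values) draws a stakeholder survey panel in which the smaller group is sampled at a higher rate than its share of the frame so that its perspectives can be estimated with reasonable precision.

```python
# Minimal oversampling sketch; the frame, groups, and quotas are hypothetical.
import pandas as pd

# Hypothetical sampling frame of 1,000 potential survey respondents.
frame = pd.DataFrame({
    "respondent_id": range(1, 1001),
    "setting": ["urban"] * 700 + ["rural"] * 300,   # rural educators are the smaller group
})

# Target panel of 200, with rural respondents oversampled to 40% of the panel
# (versus 30% of the frame). Quotas are illustrative, not prescriptive.
quotas = {"urban": 120, "rural": 80}

panel = pd.concat(
    group.sample(n=quotas[name], random_state=42)
    for name, group in frame.groupby("setting")
)
print(panel["setting"].value_counts())  # urban 120, rural 80
```

If population-level estimates are later needed, the oversampled responses would simply be reweighted back to their share of the frame.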

Improving inclusiveness and representation

A primary benefit of CRAs is their ability to enhance inclusiveness and representation through the principle of shared power: integrating diverse stakeholders across levels so that all voices and perspectives are acknowledged and valued. Shared power is crucial for achieving equity for marginalized groups and facilitates the development of assessment frameworks that enable participants to effectively demonstrate their abilities (Walker et al., 2023). By forming partnerships in which stakeholders collaborate as equals in defining standards, designing assessments, and monitoring and adapting assessments based on observed outcomes and applications, CRAs have the potential to contribute to more equitable and responsive assessment practices (Walker et al., 2023). HR professionals could integrate CRA principles, such as shared power, into the job analysis process to promote inclusivity and representation and thereby uphold DEI principles.

One approach job analysts might consider during the initial stages of data collection is to actively seek input from a diverse group of incumbents, subject matter experts (SMEs), and other stakeholders (e.g., customers, subordinates) to ensure that the job analysis data benefit from a broad range of perspectives and insights. Analysts could consider various aspects of incumbents’ identities, interests, preferences, education, and needs to leverage participants’ unique strengths effectively (Walker et al., 2023). A second approach is to have incumbents and SMEs specify contextual factors associated with the unique experiences of historically marginalized populations (Dalal et al., 2023), because the specific knowledge, skills, abilities, and other characteristics (KSAOs) that influence job performance (e.g., contact tracing in underrepresented populations) may shift across populations. This approach helps ensure that the KSAOs specified for successful performance are comprehensive across contexts and populations. However, no matter the approach taken, it is essential to account for subgroup differences, as environmental and societal factors can influence how different demographic groups experience their positions or roles (Strah & Rupp, 2022).
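The sketch below (hypothetical ratings; the KSAO, subgroups, sample sizes, and rating scale are illustrative assumptions) shows one simple way an analyst might check whether SME importance ratings for a KSAO differ across incumbent subgroups before aggregating them.

```python
# Minimal subgroup-comparison sketch for job analysis ratings; all data are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# 1-5 importance ratings for a hypothetical KSAO ("multilingual client communication")
# from two incumbent subgroups working in different service contexts.
ratings_a = rng.integers(3, 6, size=40)  # incumbents serving largely multilingual clients
ratings_b = rng.integers(1, 4, size=40)  # incumbents who rarely do

t_stat, p_value = stats.ttest_ind(ratings_a, ratings_b, equal_var=False)  # Welch's t-test
print(f"Mean A = {ratings_a.mean():.2f}, mean B = {ratings_b.mean():.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.3f}")

# A meaningful difference suggests the KSAO's importance is context dependent and
# should be documented for both populations rather than averaged into a single value.
```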

When gathering meaningful and relevant work-related information, job analysts could carefully consider the context and methods of data collection to meet the diverse needs of participants and encourage full engagement. Customizing data collection based on the distinct characteristics and abilities of incumbents and SMEs, and using varied approaches such as interviews, questionnaires, focus groups, SME workshops, and observation, fosters flexibility in participation. Flexibility and accommodations for stakeholders’ participation can encourage the involvement of underrepresented voices and lead to more diverse and inclusive representation in the job analysis process (Walker et al., 2023).

Furthermore, Strah and Rupp (2022) emphasize that group-based settings for gathering job analysis data might impede underrepresented groups from expressing their opinions effectively or diminish their willingness to share information openly. Thus, enabling stakeholders to participate in various ways could foster open, honest, and constructive feedback. Responsive job analysis practices have the potential to align performance evaluations with diverse backgrounds, adjust metrics, address counterproductive behaviors, and attract a more diverse candidate pool through inclusive job postings, ultimately advancing organizational inclusivity and representation goals.

Mitigating biases

A second benefit of CRAs is their ability to mitigate biases in employee selection by designing and administering assessments that consider cultural factors such as language, values, beliefs, norms, and differing abilities. To achieve this, insights can be gleaned from the North American higher education CRA literature, which identifies key cross-cultural assessment characteristics: sensitivity to diverse perspectives, integration of multiple viewpoints, and evaluation of cultural competence. These practices could effectively address performance disparities across organizational settings, fostering inclusivity (Mortaz Hejri et al., 2022).

The scope of CRAs extends beyond race, gender, and culture to promote inclusivity for individuals with disabilities. The 1997 Individuals with Disabilities Education Act marked a pivotal moment in alternative assessment design by advocating for college and career readiness among students with disabilities. This laid the groundwork for subsequent efforts such as the 2001 No Child Left Behind Act and the Race to the Top initiative, which broadened disability rights and advanced educational CRA principles at the state and federal levels. The academic application of CRAs is backed by some preliminary empirical evidence (Brown et al., 2022; Sinharay & Johnson, 2024) that could be translated to organizational settings. By adopting similar inclusive assessment strategies, organizations can enhance their hiring processes, ensure fair treatment of individuals with disabilities, and promote diversity and inclusion within the workplace.

Organizations can invest in inclusive assessment formats that empower individuals by offering external control, ensuring stakeholder involvement, and fostering opportunities for choice and partnership (Walker et al., 2023). Research indicates that involving historically marginalized groups in the development process challenges stereotypes and provides culturally valid information that benefits broader audiences (Kūkea Shultz et al., 2019). Moreover, incorporating diverse representation among SMEs mitigates biases and enhances confidence in the reliability of information. However, bias exists beyond SMEs and significantly affects the selection process. Cognitive ability tests are notoriously problematic for producing group differences, and their validity varies widely across settings (Moscoso, 2003), so they would benefit from the inclusion of CRA principles. CRAs can enhance these tests by leveraging candidates’ strengths and encouraging fairness where subjective judgments may arise.

Enhancing the validity of assessment scores

A third benefit of CRAs is their potential to enhance the validity of assessments by ensuring that the measures accurately reflect the diverse abilities and knowledge of all individuals. This is crucial because assessment scores inform various personnel selection and decision-making processes. Implementing the principles of flexibility and engagement from Walker et al.’s (2023) CRA framework could benefit internal selection systems such as performance appraisal and learning management systems (LMS).

In the context of performance appraisal, the flexibility principle involves assessment design elements that account for differences in culture, interest, and identity, allowing for a representative conceptualization of performance. The engagement principle encourages active participation and a sense of belonging through content-rich assessments, giving employees control over their evaluations to reflect their strengths and contributions (Walker et al., 2023). For example, a project manager valuing team leadership and innovation could choose to be evaluated on these dimensions, presenting a recent project to illustrate her skills. This approach ensures the appraisal process accurately reflects the employee’s strengths. As a result, the project manager feels more engaged and confident in her evaluations.

Regarding LMS, flexible and engaging training options allow employees to tailor their learning experiences to their individual backgrounds and experiences, providing opportunities for low-income and marginalized populations (Long, 2009). For instance, offering online courses and evening workshops enables single working mothers to participate in training sessions after work hours, balancing their work and parental duties. Integrating CRAs into performance appraisals and training can potentially enhance validity through cultural representation. However, this approach risks introducing culture-specific content that may not apply to all subgroups, resulting in construct-irrelevant variance.

Conclusion

In recent years, legislative initiatives that push an identity-blind perspective on diversity have established a precedent for dismantling DEI initiatives across organizational settings. This precedent has been associated with negative outcomes such as reduced support for DEI initiatives, decreased workplace engagement among minority employees, increased prejudice, and a reduction in inclusive perspectives and behaviors (Follmer et al., 2024; Yi et al., 2022). Moreover, these legislative initiatives do not reflect public opinion, which supports a range of DEI initiatives, including business-led DEI efforts (Gallup, 2022). Building on Follmer et al.’s call to continue advocating for investment in DEI programs, we suggest incorporating CRA across the HR lifecycle to improve inclusiveness and representation, mitigate biases, and potentially enhance the validity of assessment scores.

First, we recommend using diverse SMEs and varied data-gathering methods in job analysis to enhance cultural responsiveness in selection procedures. For example, in highly heterogeneous work environments, one-on-one interviews may help all employees share their perspectives, thus establishing equality and producing culturally competent job-related data (Strah & Rupp, 2022). Second, we suggest incorporating culturally responsive design elements into assessment tools to mitigate bias and increase inclusivity. Assessment designers should consider varying languages, values, beliefs, norms, and differing abilities to build more comprehensive assessments. These practices also address potential performance disparities across organizational settings (Mortaz Hejri et al., 2022). Finally, we advocate for integrating the CRA framework principles of flexibility and engagement (Walker et al., 2023) across internal selection systems such as performance appraisal systems and LMS. Using design elements that offer employees choices in appraisal criteria and learning content formats can increase perceived cultural representation and engagement among diverse employees (Long, 2009).

Although CRA practices hold great promise for promoting positive educational, workplace, and broader societal outcomes by fostering inclusivity and other benefits for both organizations and test takers, empirical evidence supporting CRA is thus far lacking (Walker et al., 2023). However, given recent calls for more personalized assessment content, formats, and scoring approaches (e.g., Bennett, 2023; Walker et al., 2023), one viable way of addressing Follmer et al.’s call to action is to implement CRA in organizational practices and procedures and to conduct the much-needed empirical research to support its potential efficacy.

Competing interests

We have no known conflict of interest to disclose.

References

Bennett, R. E. (2023). Toward a theory of socioculturally responsive assessment. Educational Assessment, 28(2), 83–104.
Brown, M., Burns, D., McNamara, G., & O’Hara, J. (2022). Culturally responsive classroom-based assessment: A case study of secondary schools in Ireland. Revista de Investigación Educativa, 40(1), 15–32. https://doi.org/10.6018/rie.496681
Dalal, D. K., Randall, J., Danna, G. C., & Ash, J. (2023). Identifying critical psychological characteristics related to successful performance as a contact tracer: A job analysis. Personnel Assessment and Decisions, 9(1), 63–81.
Follmer, K. B., Sabat, I., Jones, K. P., & King, E. (2024). Under attack: Why and how I-O psychologists should counteract threats to DEI in education and organizations. Industrial and Organizational Psychology, 17.
Hardy, J. H. III, Tey, K. S., Cyrus-Lai, W., Martell, R. F., Olstad, A., & Uhlmann, E. L. (2022). Bias in context: Small biases in hiring evaluations have big consequences. Journal of Management, 48(3), 657–692. https://doi.org/10.1177/0149206320982654
Kūkea Shultz, P., Englert, K., Krug, K., Ruth, K., Ching, L., & Franco, L. (2019). Context matters: The promise of cultural and community validity in assessment [Conference session]. Third Annual NCME Special Conference on Classroom Assessment, Boulder, CO, United States. https://drive.google.com/file/d/1Q0JXb7cYwnCuItoY9961yp-F3jFylwWT/view
Leong, C. W., Roohr, K., Ramanarayanan, V., Martin-Raugh, M. P., Kell, H., Ubale, R., & McCulla, L. (2019). To trust, or not to trust? A study of human bias in automated video interview assessments. arXiv. https://doi.org/10.48550/arXiv.1911.13248
Long, D. A. (2009, November 7). The promise of customized training: Evidence from the United States [Paper presentation]. Employment, Social Affairs and Equal Opportunity and the University of Maryland School of Public Policy, Washington, DC, United States.
Martín-Raugh, M. P., Reese, C. M., Tannenbaum, R. J., Steinberg, J. H., & Xu, J. (2016). Investigating the relevance and importance of high-leverage practices for beginning elementary school teachers (Research Memorandum No. RM-16-11). Educational Testing Service.
Montenegro, E., & Jankowski, N. A. (2017, January). Equity and assessment: Moving towards culturally responsive assessment (Occasional Paper No. 29). University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). https://learningoutcomesassessment.org/wp-content/uploads/2019/02/OccasionalPaper29.pdf
Mortaz Hejri, S., Ivan, R., & Jama, N. (2022). Assessment through a cross-cultural lens in North American higher education. Frontiers in Education, 7, 1–6. https://doi.org/10.3389/feduc.2022.1012722
Moscoso, S. (2003). The within-setting variability of validity in cognitive ability tests. International Journal of Selection & Assessment, 11(4), 352–355. https://doi.org/10.1111/j.0965-075X.2003.00258.x
Pennock-Román, M. (1993). The status of research on the Scholastic Aptitude Test (SAT) and Hispanic students in postsecondary education. In R. B. Gifford (Ed.), Policy perspectives on educational testing (pp. 75–115). Springer Netherlands.
Sinharay, S., & Johnson, M. S. (2024). Computation and accuracy evaluation of comparable scores on culturally responsive assessments. Journal of Educational Measurement, 61(1), 5–46.
Strah, N., & Rupp, D. E. (2022). Are there cracks in our foundation? An integrative review of diversity issues in job analysis. Journal of Applied Psychology, 107(7), 1031–1051.
Walker, M. E., Olivera-Aguilar, M., Lehman, B., Laitusis, C., Guzman-Orth, D., & Gholson, M. (2023). Culturally responsive assessment: Provisional principles. ETS Research Report Series, 2023(1), 1–24. https://doi.org/10.1002/ets2.12374
Yi, J., Neville, H. A., Todd, N. R., & Mekawi, Y. (2022). Ignoring race and denying racism: A meta-analysis of the associations between colorblind racial ideology, anti-Blackness, and other variables antithetical to racial justice. Journal of Counseling Psychology, 70(3), 258–275.
Zucker, S., Sassman, C., & Case, B. J. (2004). Cognitive labs. Harcourt Assessment. https://www.academia.edu/download/90433840/CognitiveLabs.pdf