Wilcox et al. (2022) indicate that cybervetting may be biased and discriminatory and opine that “greater federal guidance is needed to clarify the conditions under which cybervetting is an acceptable tool” (p. XX) and that “professional associations could publish standards that clarify whether cybervetting is permissible when hiring individuals who engage in their specific type of work” (p. XX). Although additional guidance specific to cybervetting may be desired, Wilcox et al. did not sufficiently discuss the abundance of general guidance that already exists related to the use of any selection procedure, including cybervetting. I will summarize existing guidance and explain the legal risk (and folly) of using cybervetting from a practitioner perspective. I then argue that cybervetting is a symptom of a deeper problem.
Broad federal laws prohibit discrimination related to several protected classes, including race, color, national origin, sex, and religion (under Title VII of the Civil Rights Act of 1964); age (under the Age Discrimination in Employment Act of 1967); and disability (under the Americans with Disabilities Act of 1990). These employment discrimination laws apply to all selection procedures used for making selection decisions. Therefore, if cybervetting is used to make selection decisions (such as who does or does not advance to the next step in the selection process, or who is or is not extended a job offer), then it is subject to these laws. Furthermore, additional protected classes (e.g., weight, height, political affiliation, and marital status) are covered by federal, state, and local laws that apply to a narrower scope of organizations.
Cybervetting is a risky selection procedure because cues related to almost any protected class can often be discovered via basic online snooping. If an applicant is eliminated because of their protected class, then the employer could be liable for disparate treatment discrimination (intentional discrimination). If cybervetting disproportionately favors one subgroup and disfavors another (e.g., favors men over women or Caucasians over Hispanics), then the employer could be liable for disparate impact discrimination (unintentional discrimination) regardless of its intent.
Disparate impact discrimination complaints proceed through three phases:
Phase 1: The plaintiff has the burden of showing adverse impact (i.e., a substantially different rate of selection that works to the disadvantage of members of a protected class); a worked example follows this list.
Phase 2: The defendant has the burden of showing that the procedure having adverse impact is job related (or valid) for the position in question and consistent with business necessity. If the plaintiff meets the burden in Phase 1 but the defendant fails to meet the burden in Phase 2, then the plaintiff’s disparate impact discrimination claim is supported.
Phase 3: If the defendant meets its burden as outlined in Phase 2, the plaintiff can still prevail by showing that an alternative measure (or employment practice) with substantially equal validity but less adverse impact was available.
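To make Phase 1 concrete, the Uniform Guidelines operationalize a “substantially different rate of selection” with the four-fifths (80%) rule of thumb: adverse impact is generally inferred when one group’s selection rate is less than four fifths of the rate of the group with the highest selection rate. The following minimal Python sketch illustrates the arithmetic; the applicant-flow numbers are hypothetical, and real analyses typically supplement this ratio with statistical significance tests.

```python
# Minimal sketch (hypothetical data): the four-fifths rule of thumb from
# the Uniform Guidelines on Employee Selection Procedures (1978).

def selection_rate(selected: int, applicants: int) -> float:
    """Proportion of a group's applicants who were selected."""
    return selected / applicants

# Hypothetical applicant flow after a cybervetting screen.
rate_men = selection_rate(selected=48, applicants=80)    # 0.60
rate_women = selection_rate(selected=21, applicants=60)  # 0.35

# Adverse-impact ratio: focal group's rate over the highest group's rate.
ratio = rate_women / rate_men  # 0.35 / 0.60 = 0.58

if ratio < 0.80:
    print(f"Ratio {ratio:.2f} < 0.80: evidence of adverse impact "
          "under the four-fifths rule.")
```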
Regarding Phase 1, empirical evidence directly addressing the extent to which cybervetting disproportionately screens out members of protected classes is still lacking; this is a topic ripe for future research. However, Wilcox et al. (2022) provide a compelling argument for why adverse impact would be expected with the typical unreliable, casual, unsystematic, “fuzzy” cybervetting selection process. Thus, it would be prudent for organizations to assume that, if the cybervetting selection procedure is challenged, plaintiffs would prevail in showing adverse impact. Therefore, in anticipation of the potential need for a Phase 2 defense, organizations should ensure that the use of cybervetting is valid and job related.
Regarding Phase 2 and Wilcox et al.’s (2022) call for more federal and professional association guidance, many guidelines related to validation have already been published by federal agencies and professional associations. For example, in 1978 the Uniform Guidelines on Employee Selection Procedures were authored jointly by the Equal Employment Opportunity Commission, Civil Service Commission, Department of Labor, and Department of Justice. Similarly, in 1979 the Uniform Employee Selection Guidelines Interpretation and Clarification were authored jointly by the Equal Employment Opportunity Commission, Office of Personnel Management, Department of Justice, Department of Labor, and Department of Treasury. In 1999, the U.S. Department of Labor also published a helpful guide titled Testing and Assessment: An Employer’s Guide to Good Practices. Our professional association, SIOP, first published the Principles for the Validation and Use of Personnel Selection Procedures in 1980, and it is now in its fifth edition (2018). The Standards for Educational and Psychological Testing were authored jointly by the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education; first published in 1966, the Standards are now in their fifth edition (2014). Indeed, an abundance of guidance has been in place for more than four decades.
Given the expected adverse impact, it is surprising that hiring agents rarely make any legitimate attempt to consider the validity or job relatedness of cybervetting as prescribed by existing federal and professional association guidance. As explained by Wilcox et al. (2022), at best, hiring agents are using cybervetting to manage risk. More likely, hiring agents use cybervetting to execute similar-to-me biases under the guise of fit. Instead of appropriately considering person–job or person–organization fit, hiring agents eliminate candidates due to a lack of person–me fit. Indeed, an organization will have a very difficult time defending the job relatedness of cybervetting, especially as cybervetting is further removed from the job and moves further into the personal and private lives of the applicants being vetted. This raises the question: Why do hiring agents fail to follow the extant federal and professional association guidance?
Regarding Phase 3, even if a defendant somehow defends the job relatedness of cybervetting (in Phase 2), plaintiffs could still prevail by identifying (a) alternative selection procedures or (b) alternative methods of administering or implementing the cybervetting selection procedure. Given that cybervetting is typically used to manage risk (rather than as a bona fide method of predicting job performance), its purpose is similar to that of other selection procedures used as part of a background check. Alternatives such as reference checks, criminal history checks, credit checks, and drug testing might be used instead. Although these alternatives are subject to the same guidelines and standards and must be used appropriately, they are more established, formal, and standardized than the typical cybervetting process, and they would be expected to be more reliable and less biased. Ironically, as employers reduce their reliance on credit and criminal history checks (mainly due to legal and social justice concerns), hiring agents are seemingly increasing their reliance on cybervetting.
In addition to background checks, applicant dependability and reliability could be assessed via a measure of conscientiousness, which is a valid predictor of job performance (Schmidt & Hunter, 1998) and would be expected to result in little to no adverse impact (Hough et al., 2001). Furthermore, integrity tests are good predictors of counterproductive work behaviors (e.g., theft, illegal activities, absenteeism, tardiness, drug abuse, dismissals for theft, and violence on the job) and job performance (Ones et al., 1993) and would be expected to result in little to no adverse impact (Ones & Viswesvaran, 1998).
Plaintiffs could also prevail by identifying more structured, standardized, and reliable methods of administering the selection procedure. Therefore, if organizations insist on using cybervetting as part of the selection process, it behooves them to make the process as fair and reliable as possible. Although I do not advocate the widespread use of cybervetting, employers who insist on using it may be able to salvage it by following several best practices. Some of the best guidance I am aware of is offered by Heneman et al. (2019, pp. 411–412). The following list paraphrases and expands upon their suggestions.
1. Make sure you are following the law. Although no federal laws explicitly address or ban cybervetting, the aforementioned laws do provide guidance that applies to all selection methods, including cybervetting. In addition, laws at the state and local levels related to background checks and applicant privacy are evolving.
2. Assuming cybervetting is permissible by law, the organization should formally decide, through policy, whether it is permissible for employees to conduct such searches and to use the information obtained through them. The hiring agent is an agent of the organization, and the organization is the entity that is liable for discrimination, so organizations should establish and enforce policy related to cybervetting.
If the organization, by policy, allows cybervetting or similar searches, all of the following practices should be put in place.
3. Obtain the applicant’s consent or, at the least, inform them that such searches will be performed as part of the selection process.
4. There is no one-size-fits-all selection procedure. Decide which jobs cybervetting will be used for, and justify (based on job relatedness and business necessity) why it will be used for those jobs. Do not hold applicants to cybervetting standards to which you do not hold existing employees.
5. Only gather job-related information. Ignore and disregard any information that is not job related.
6. Focus only on primary information posted by the applicant, not secondary information posted by others about the applicant. Secondary information is hearsay from third parties, and it should be disregarded.
7. Perform cybervetting late in the selection process as a formal part of a background check; consider using it as a contingent decision method implemented after an offer is made. Also, only cybervet finalists for the job. The vast majority of information found through cybervetting will not be job related, yet the information cannot be unseen, and it may influence your impression of the candidate and lead to bias. Therefore, if it is used, limit its use to finalists who have already been determined to be fully qualified based on relevant job-related factors.
8. Be consistent and standardized. This is critical to every selection procedure. Follow the same cybervetting process and gather the same information at the same stage in the selection process for all candidates or finalists (see the sketch following this list).
9. Have a trained and dedicated staff member, such as a human resources (HR) professional, perform all cybervetting activities including the collecting and evaluating of information. Given the sensitive nature of the information and the fact that the information cannot be unseen, it is best to avoid (and prohibit) an informal or decentralized approach.
10. Document selection decisions that are made based on cybervetting, especially if someone is rejected based on information discovered in the search. There needs to be a job-related or business-necessity reason for gathering and using the information. Transparency fosters accountability and reduces the likelihood of hiring agents making decisions based on fuzzy similar-to-me “fit” that is likely related to protected classes.
11. If a candidate is rejected, inform them and provide them with an opportunity to explain or defend themselves. This reduces the likelihood of making bad selection decisions as a result of mistaken identity or fake, false, or irrelevant information.
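To illustrate practices 5 and 8–10, the following minimal Python sketch shows one way a trained HR professional might record cybervetting results in a fixed, pre-approved rubric so that every finalist is evaluated on the same job-related dimensions and every finding is documented. This is a hypothetical illustration, not a prescribed implementation; the dimension names, fields, and class are invented for the example.

```python
# Hypothetical sketch of practices 5 and 8-10: a fixed, pre-approved rubric
# so every finalist is cybervetted on the same job-related dimensions,
# by a trained reviewer, with findings documented for accountability.
from dataclasses import dataclass, field
from datetime import date
from typing import Dict

# Pre-approved, job-related dimensions (illustrative only); anything
# outside this list must be ignored and disregarded (practice 5).
JOB_RELATED_DIMENSIONS = (
    "misrepresented credentials or licensure",
    "threats or harassment directed at coworkers",
    "disclosure of a prior employer's confidential information",
)

@dataclass
class CybervettingRecord:
    candidate_id: str   # an ID, not demographic or protected-class cues
    job_title: str
    reviewed_on: date
    reviewer: str       # the trained, dedicated HR reviewer (practice 9)
    findings: Dict[str, str] = field(default_factory=dict)

    def record_finding(self, dimension: str, note: str) -> None:
        """Document a finding, but only on a pre-approved dimension."""
        if dimension not in JOB_RELATED_DIMENSIONS:
            raise ValueError(
                f"'{dimension}' is not job related; disregard it (practice 5)."
            )
        self.findings[dimension] = note  # documented for audit (practice 10)

# Example: the same rubric, applied at the same stage, for every finalist.
record = CybervettingRecord("F-017", "Account Manager", date.today(), "HR-1")
record.record_finding(
    "misrepresented credentials or licensure",
    "Public profile claims a CPA license; state registry shows none.",
)
```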
The deeper problem: Incessant use of haphazard selection procedures
Although Wilcox et al. (2022) provide a thought-provoking discussion of cybervetting and the flaws of using such a selection procedure, it is important to understand that the use of cybervetting is the latest symptom of an age-old problem: the rampant misuse of selection procedures (or rampant use of ineffective selection procedures). Thus far, I have discussed methods for treating the cybervetting symptoms in an effort to avoid suboptimal selection decisions and discrimination and to minimize an organization’s legal risk. However, more research and policy related to cybervetting will not fix the deeper problem of hiring agents’ incessant use of haphazard selection procedures.
Most employment selection practices that are used by organizations can be categorized into one of three general selection strategies: valid (or job-related) selection, random selection, or haphazard selection. The intent of valid selection is to make selection decisions using reliable and valid selection procedures. Researchers and practitioners have been developing and improving procedures for optimizing this strategy for several decades, and this is the strategy advocated by some industrial-organizational (I-O) and human resources management (HRM) practitioners who are aware that such procedures exist. However, it is important to recognize that many hiring agents (including I-O psychologists, HRM professionals, third-party recruiters, and hiring managers) who do not specialize in employment selection may lack awareness of these selection procedures or, even if they are aware, they may not have the technical expertise to properly implement this strategy. The deeper problem of which cybervetting is only a symptom is that hiring agents seem largely unable or unwilling to use valid selection procedures.
Instead, many organizations, or agents thereof, manufacture selection procedures that can be collectively referred to as haphazard selection. These procedures might be developed or chosen based on hunches, intuition, anecdotal experience, the latest trends, internet searches, stereotypes, or “face validity” (i.e., they appear valid to the layperson but are not necessarily valid). Rosse and Levin (1997) describe these selection practices quite extensively, referring to them as “hiring by luck, chance, intuition, or gut feelings” (p. 1); crapshoot, unsystematic, low-impact, warm-body, or ritual hiring (pp. 1, 6, 9, 11); the blind faith, seat-of-the-pants, or trial-and-error approach (pp. 11–12); or relying on pet methods or on methods simply because others are using them (p. 12).
In a 2008 issue of Industrial and Organizational Psychology, Highhouse condemned the stubborn reliance on intuition and subjectivity in employee selection (the title of his focal article). He cites research showing that
HR professionals agreed, by a factor of more than 3 to 1, that using tests was an effective way to evaluate a candidate’s suitability and that tests that assess specific traits are effective for hiring employees. At the same time, however, these same professionals agreed, by more than 3 to 1, that you can learn more from an informal discussion with job candidates and that you can “read between the lines” to detect whether someone is suitable to hire. (pp. 333–334)
Highhouse also cites research indicating that HR executives “considered the traditional unstructured interview more effective than any of the paper-and-pencil assessment procedures” (p. 333) despite research showing that all of the other procedures outperform the unstructured interview. Highhouse argues that “there is considerable evidence that employers simply do not believe that the research is relevant to their own situation” (p. 333). Highhouse further explains that
most people believe in the myth of selection expertise. By this I mean the belief that one can become skilled in making intuitive judgments about a candidate’s likelihood of success. This is reflected in the survey responses of the HR professionals who believed in “reading between the lines” to size up job candidates…. Despite this widespread belief in intuitive expertise, the data suggest that it is a myth. (p. 337)
People trust that the complex characteristics of applicants can be best assessed by a sensitive, equally complex human being. This does not stand up to scientific scrutiny. (p. 340)
Jacobs et al. (2011) discuss a similar assumption or myth in the context of evaluating performance:
Many systems carry with them an implicit assumption when rater training is minimal or nonexistent (as is often the case) that since a supervisor … was once a [subordinate], he or she knows about performance and can appropriately observe and rate the performance of others. Everything we have discussed up to this point shows that, in many cases, this assumption is simply not true. (p. 174)
We make a faulty assumption when it comes to performance management. We often assume that because someone has become a supervisor, they can and will be competent in the evaluation of others. (p. 184)
Fourteen years after Highhouse’s (2008) stirring article, we are discussing, in the same venue, a new symptom (cybervetting) of an old problem (the stubborn reliance on haphazard selection). Although alleviating symptoms is important, researchers and practitioners may gain more in the long run by focusing more attention on solving the deeper problem. How will we make more meaningful progress in getting organizations and hiring agents to cease the use of haphazard selection procedures and to increase the use of valid, job-related selection procedures?