EEOC's Guidance on Artificial Intelligence: Hiring and Employment-Related Actions Taken Using Artificial Intelligence May Be Investigated for Employment Discrimination Violations

FordHarrison

Executive Summary: The Equal Employment Opportunity Commission (EEOC) recently published a technical assistance document providing guidance on when the use of artificial intelligence or algorithms in employee selection procedures may have a disparate impact in violation of Title VII.

Artificial Intelligence in the Workplace

Artificial intelligence (“AI”), which includes the use of software algorithms, is a valuable tool for a growing list of business processes, including workplace decisions pertaining to hiring, monitoring performance, and determining pay or promotions. In the employment context, this may include resume scanners that prioritize job applications using certain keywords, employee monitoring software that rates employees based on their keystrokes, virtual assistants that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements, and testing software that provides “job fit” scores for applicants or employees regarding their aptitudes, cognitive skills, or perceived “cultural fit” based on their performance on a game or a test.

The Use of AI in the Selection Process and Application of the Four-Fifths Rule

Automated processes used for hiring, monitoring performance, or determining pay or promotion in the workplace must comply with Title VII. Recently, the Equal Employment Opportunity Commission (EEOC) issued a technical assistance document entitled Select Issues: Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures under Title VII of the Civil Rights Act of 1964. This document is intended to provide guidance on algorithmic fairness and the use of artificial intelligence in employment selection decisions. The Technical Assistance document is part of an Initiative the EEOC announced in 2021, signaling that the agency intends to pursue employment law violations that may result from the use of AI in the workplace.

The scope of the EEOC’s Technical Assistance document is focused on “disparate impact” situations. In addition to prohibiting intentional discrimination, Title VII generally prohibits employers from using neutral tests or selection procedures that have the effect of disproportionately excluding people based on race, color, religion, sex, or national origin, if the tests or selection procedures are not “job related for the position in question and consistent with business necessity” (i.e., disparate impact discrimination). The Technical Assistance document focuses on evaluating whether a test has a disproportionate impact on a protected group, not whether the procedures are job related and consistent with business necessity. The document states that the 1978 Uniform Guidelines on Employee Selection Procedures apply to algorithmic decision-making tools used to make or inform decisions about hiring, promotion, termination, and similar actions.

The EEOC has developed the “Four-Fifths Rule,” which it describes as a “rule of thumb” to help determine whether a selection process results in a substantially different rate between two protected groups and has stated that this rule applies to selection processes using AI technology. The rule states that one rate is substantially different than another if their ratio is less than four-fifths (or 80%). The application of the Four-Fifths rule is explained in the EEOC’s Technical Assistance document by use of the following example:

  • Employer administers a personality test to 80 White applicants and 40 Black applicants.
  • 48 of the White applicants and 12 Black applicants advance to the next step in the application process.
  • The selection rate for White applicants is 48/80 = 60%, and the selection rate for Black applicants is 12/40 = 30%.
  • The ratio of the two rates is 30/60 = 50%.
  • Because 50% is lower than 4/5 (80%), the selection rate for Black applicants is substantially different from the selection rate for White applicants under the Four-Fifths Rule. This substantially different rate could be evidence of discrimination against Black applicants.
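The arithmetic in the EEOC’s example can be sketched as a short Python check. This is an illustrative calculation only; the function names are hypothetical and not part of any EEOC tool or guidance.

```python
def selection_rate(selected, total):
    """Fraction of applicants in a group who advance to the next step."""
    return selected / total

def four_fifths_ratio(rate_a, rate_b):
    """Ratio of the lower selection rate to the higher one."""
    low, high = sorted((rate_a, rate_b))
    return low / high

# EEOC example: 48 of 80 White applicants and 12 of 40 Black applicants advance.
white_rate = selection_rate(48, 80)   # 60%
black_rate = selection_rate(12, 40)   # 30%
ratio = four_fifths_ratio(white_rate, black_rate)  # 30% / 60% = 50%

# A ratio below 0.8 (four-fifths) indicates a substantially different
# selection rate that may warrant further inquiry.
flagged = ratio < 0.8
```

Here `flagged` is `True`, mirroring the document’s conclusion that a 50% ratio falls below the four-fifths threshold.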

The EEOC notes that the Four-Fifths Rule is a general guideline, and its use may not be appropriate in all circumstances. Additionally, employers should note that compliance with the Four-Fifths Rule may not be sufficient to show that an AI selection process is lawful under Title VII when the procedure is challenged. The rule is, however, a useful initial test employers can use to assess selection rates and determine whether additional inquiry is warranted.

Employers may be Responsible for Discriminatory Results of Third-Party Vendor AI Tools

The EEOC’s Technical Assistance document makes clear that “…employers may be held responsible for the actions of their agents, which may include entities such as software vendors, if the employer has given them authority to act on the employer’s behalf. This may include situations where an employer relies on the results of a selection procedure that an agent administers on its behalf.”

At a minimum, employers should ask the vendor what steps it has taken to evaluate whether the tool or test may cause a disparate impact.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© FordHarrison | Attorney Advertising
