EEOC Issues Guidance on the Use of Software, Algorithms, and Artificial Intelligence in Hiring Decisions

Miles & Stockbridge P.C.

Employers are increasingly relying on electronic systems to supplement – and, at times, supplant – the work of actual human employees in certain hiring, retention and employee-management practices. Such systems are often utilized or viewed as a measure to increase efficiency, reduce human error, decrease costs and optimize performance metrics. Of particular note in this arena are algorithmic decision-making tools, such as artificial intelligence (“AI”).

At its essence, AI leverages computers and machines to mimic the problem-solving and decision-making capabilities of the human mind.[1] With this in mind, the Equal Employment Opportunity Commission (“EEOC”) recently issued a new Technical Assistance Document for employers on how to monitor newer algorithmic decision-making tools, such as AI, in order to prevent employment discrimination against job seekers and employees.

Here are some key takeaways from the Technical Assistance Document. The overarching goal is to ensure employers comply with Title VII of the Civil Rights Act of 1964, which prohibits employment discrimination based on race, color, religion, sex (including pregnancy, sexual orientation, and gender identity) or national origin. The statute also prohibits employers from using tests and selection procedures that disproportionately exclude individuals with a protected characteristic if the tests or selection procedures are not job-related and consistent with business necessity pursuant to Title VII. This type of discrimination is also known as “disparate impact” or “adverse impact” discrimination.

Algorithmic Decision-Making Tools Constitute ‘Selection Procedures’ Under Title VII

According to the EEOC, a “selection procedure” is defined as any “measure, combination of measures, or procedure used as a basis for any employment decision.” This includes a full range of different assessment techniques, from traditional paper-and-pencil tests to performance tests, training programs, probationary periods, physical, educational, and work experience requirements and the like. As a result, the EEOC has found that AI-based and algorithmic decision-making tools qualify as selection procedures to the extent they are used to make or inform decisions about whether to hire, promote, terminate or take similar actions toward applicants or current employees. Thus, the use of these tools, like traditional selection procedures, is evaluated under the EEOC’s “Uniform Guidelines on Employee Selection Procedures under Title VII,” which provides guidance for employers to determine if their tests and selection procedures are lawful under Title VII.

It is important to note that the employment decision does not need to be made exclusively by the tool. As long as the tool informs the employment decision in some way, it could constitute a selection procedure that must be monitored for purposes of preventing disparate impact discrimination.

Selection Rates Based on AI and Algorithmic Selection Procedures Are Assessed in the Same Fashion as Traditional Selection Procedures

In monitoring algorithmic decision-making tools to prevent disparate impact discrimination, the EEOC guidance makes clear that the same methods used to assess disparate impact in traditional selection procedures are equally applicable to evaluating their emerging counterparts. Specifically, the guidance notes that EEOC will typically assess whether the use of an algorithmic-based decision-making tool results in disparate impact discrimination by determining the “selection rate” for individuals with a protected characteristic.

According to the guidance, “selection rate” refers to the proportion of applicants or candidates who are hired, promoted or otherwise selected for a position out of a pool of candidates. Using this process, disparate impact discrimination is usually found to exist where the selection rate for individuals with a protected characteristic is “substantially less” than the selection rate for individuals outside of a protected class. Thus, if the use of an algorithmic-based decision-making tool results in a substantially lower selection rate for individuals with a protected characteristic, then use of the tool will violate Title VII unless the employer can show that such use is job related and consistent with business necessity, which may be a difficult burden for employers to meet in some circumstances.
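To make the arithmetic concrete, the selection-rate comparison and the four-fifths rule of thumb can be sketched as follows. The applicant counts below are hypothetical illustrations, not figures drawn from the EEOC guidance:

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Proportion of a group's applicants who were selected."""
    return selected / applicants

# Hypothetical pool: 80 of 200 applicants from Group A are selected,
# versus 12 of 60 applicants from Group B.
rate_a = selection_rate(80, 200)   # 0.40
rate_b = selection_rate(12, 60)    # 0.20

# Four-fifths rule of thumb: if the ratio of the lower rate to the
# higher rate falls below 4/5 (80%), one rate is considered
# "substantially" lower than the other.
impact_ratio = rate_b / rate_a     # 0.50
adverse_impact_indicated = impact_ratio < 0.80  # True
```

As the guidance and footnote 2 note, the four-fifths rule is only a rule of thumb; courts often apply statistical-significance tests instead, so a ratio above 0.80 does not by itself establish compliance.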

Employers May Be Held Liable under Title VII Even if the Algorithmic Decision-Making Tool is Designed or Administered by Another Entity

One of the most striking takeaways from the guidance is the EEOC’s position regarding employer liability where the employer utilizes algorithmic-based decision-making tools administered or created by a third party. The EEOC has taken the position that employers using algorithmic-based decision-making tools may be liable for discrimination under Title VII even if the tool was developed by an outside vendor or is being administered by a third party on the employer’s behalf. Further, and perhaps more concerning, an employer may even be held liable when a third-party vendor provides incorrect or false statements about its tool’s Title VII compliance to the employer, and the use of the tool, unbeknownst to the employer, results in disparate impact discrimination.

Going Forward

Employers should be mindful of the effect their algorithmic-based or other electronic tools may have on their compliance with Title VII and state and local anti-discrimination laws, if applicable. In this regard, the EEOC encourages employers to undertake the following efforts to maintain compliance with Title VII:

  • Conduct self-analyses on an ongoing basis to determine whether employment practices have a disproportionately negative effect on members of a protected class or treat protected groups differently.
  • In deciding whether to rely on a software vendor to develop or administer an algorithmic decision-making tool, ask the vendor if steps have been taken to evaluate whether use of the tool causes a substantially lower selection rate for individuals with a characteristic protected by Title VII. Employers may also consider asking the vendor what standard it utilizes for determining whether its tool causes a substantially lower selection rate, such as whether it relies on the four-fifths rule of thumb[2] or the statistical significance standard often used by courts.
  • If developing a selection tool and it appears the tool would have an adverse impact on individuals within a protected class, take steps to reduce the negative impact or select a different method or tool to avoid engaging in a practice that violates Title VII.

Opinions and conclusions in this post are solely those of the author unless otherwise indicated. The information contained in this blog is general in nature and is not offered and cannot be considered as legal advice for any particular situation. The author has provided the links referenced above for information purposes only and by doing so, does not adopt or incorporate the contents. Any federal tax advice provided in this communication is not intended or written by the author to be used, and cannot be used by the recipient, for the purpose of avoiding penalties which may be imposed on the recipient by the IRS. Please contact the author if you would like to receive written advice in a format which complies with IRS rules and may be relied upon to avoid penalties.


[1] https://www.ibm.com/topics/artificial-intelligence#:~:text=Artificial%20intelligence%20leverages%20computers%20and,capabilities%20of%20the%20human%20mind

[2] The four-fifths rule is a general rule of thumb for determining whether the selection rate for one group is “substantially” different from the selection rate of another group. Under the rule, one rate is substantially different from another if their ratio is less than four-fifths, or 80%.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Miles & Stockbridge P.C. | Attorney Advertising
