EEOC Issues New Guidance on the Use of Software, Algorithms, and Artificial Intelligence in Employment Selection Procedures

Saiber LLC

The new technical assistance document focuses on preventing discrimination against job applicants and employees in employment decisions made with the assistance of automated systems

By now, most people have probably heard of OpenAI’s ChatGPT system, which is capable of answering questions, summarizing research papers, and even writing stories and code.  It is perhaps also notorious for generating misleading or false information at times.  In fact, the United States District Court for the Southern District of New York ordered one attorney to show cause why he should not be sanctioned after he submitted to the Court “bogus judicial decisions with bogus quotes and bogus internal citations” generated by ChatGPT.  When the attorney asked the chatbot whether one of the cases was “real,” it simply lied and said “Yes . . . [it] is a real case.”

As the above example demonstrates, AI systems are not infallible and can produce serious consequences if not used responsibly.  As employers increasingly rely on automated systems, including those powered by AI, to assist with recruitment, performance monitoring, promotions, and other employment decisions, questions invariably arise as to whether such systems may adversely and disparately impact individuals in certain protected classes and, thus, violate existing civil rights laws, such as Title VII.

To assist employers, the EEOC published a new technical assistance document on May 18, 2023, which focuses on preventing discrimination against job applicants and employees when automated systems are used to assist with employment decisions.  The guidance specifically addresses how to avoid a “disparate impact” through the use of automated systems, that is, a disproportionately large negative effect on a basis prohibited by Title VII.

The guidance is presented in a question-and-answer format and answers questions employers might have concerning the use of algorithmic decision-making tools. Below is a sampling of questions and answers included in the technical assistance document:

  • Is an employer responsible under Title VII for its use of algorithmic decision-making tools even if the tools are designed or administered by another entity, such as a software vendor? In many cases, yes. For example, if an employer administers a selection procedure, it may be responsible under Title VII if the procedure discriminates on a basis prohibited by Title VII, even if the test was developed by an outside vendor. In addition, employers may be held responsible for the actions of their agents, which may include entities such as software vendors, if the employer has given them authority to act on the employer’s behalf. This may include situations where an employer relies on the results of a selection procedure that an agent administers on its behalf.
  • What is a “selection rate”? “Selection rate” refers to the proportion of applicants or candidates who are hired, promoted, or otherwise selected. The selection rate for a group of applicants or candidates is calculated by dividing the number of persons hired, promoted, or otherwise selected from the group by the total number of candidates in that group. For example, suppose that 80 White individuals and 40 Black individuals take a personality test that is scored using an algorithm as part of a job application, and 48 of the White applicants and 12 of the Black applicants advance to the next round of the selection process. Based on these results, the selection rate for Whites is 48/80 (equivalent to 60%), and the selection rate for Blacks is 12/40 (equivalent to 30%).
  • What is the “four-fifths rule”? The four-fifths rule, referenced in the Guidelines, is a general rule of thumb for determining whether the selection rate for one group is “substantially” different than the selection rate of another group. The rule states that one rate is substantially different than another if their ratio is less than four-fifths (or 80%). In the example above involving a personality test scored by an algorithm, the selection rate for Black applicants was 30% and the selection rate for White applicants was 60%. The ratio of the two rates is thus 30/60 (or 50%). Because 30/60 (or 50%) is lower than 4/5 (or 80%), the four-fifths rule says that the selection rate for Black applicants is substantially different than the selection rate for White applicants in this example, which could be evidence of discrimination against Black applicants. (The arithmetic is illustrated in the short sketch following this list.)
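
For readers who want to verify the arithmetic, below is a minimal sketch in Python that reproduces the EEOC’s example. The function names (selection_rate, passes_four_fifths_rule) are illustrative only; they are not taken from the guidance or from any EEOC tool.

    def selection_rate(selected: int, total: int) -> float:
        """Proportion of a group's candidates who were selected."""
        return selected / total

    def passes_four_fifths_rule(rate_a: float, rate_b: float) -> bool:
        """True if the lower selection rate is at least four-fifths (80%)
        of the higher rate, i.e., the rates are not "substantially"
        different under the rule of thumb."""
        low, high = sorted((rate_a, rate_b))
        return low / high >= 4 / 5

    # Figures from the personality-test example in the guidance
    white_rate = selection_rate(48, 80)  # 0.60, i.e., 60%
    black_rate = selection_rate(12, 40)  # 0.30, i.e., 30%

    print(f"Ratio of rates: {black_rate / white_rate:.0%}")  # 50%
    print(passes_four_fifths_rule(white_rate, black_rate))   # False

Because the ratio of the two rates (50%) falls below the 80% threshold, the check fails, which matches the guidance’s conclusion that the selection rates are substantially different.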

The full list of questions and answers can be accessed here.  Although the guidance does not establish new policy and does not have the force and effect of law, it does provide some additional clarity regarding existing requirements under the law.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Saiber LLC | Attorney Advertising
