New EEOC Initiative Targeting AI Technology Used in Hiring Process Presents Increased Risk of Class Litigation

Fox Rothschild LLP
The Equal Employment Opportunity Commission (EEOC) and Department of Justice (DOJ) have recently announced plans to monitor employers’ use of artificial intelligence (AI) tools in hiring decisions that may discriminate against applicants with disabilities. On May 12, 2022, both agencies issued guidance outlining areas of concern regarding the use of AI in hiring decisions, which provides some insight into their enforcement intentions.

Employers are increasingly using AI and other technologies to assist with employment decisions, including screening and hiring employees, monitoring their performance, and assessing pay or promotions for current employees. For example, many companies use computer-based tests or software to screen or score applicants for positions. In their recent announcements, the agencies warn that screening/assessment technologies may violate the Americans with Disabilities Act (ADA) if they do not provide an avenue for individuals with disabilities to request reasonable accommodations. For example, employers using AI decision-making tools may not have systems in place to offer reasonable accommodations to workers. The agencies raise concerns that, without proper safeguards, workers with disabilities may be screened out of jobs or promotions even if they could do the job with reasonable accommodation. Without proper care, the use of AI may also elicit information about disabilities, medical conditions or medical restrictions from individuals that could result in prohibited disability-related inquiries or medical exams.

The agencies have flagged the following examples of AI and related technologies that may be discriminatory:

  • Facial and voice analysis technologies that evaluate applicants’ skills and abilities may adversely affect people with autism or speech impairments
  • Computer-based tests that require applicants to watch a computer screen may adversely affect people with vision impairments who would still be able to do the job
  • A timed math test where applicants submit their answers by typing on a keyboard could adversely affect individuals with arthritis who cannot type quickly but would otherwise score well on the test
  • AI features that automatically screen out candidates who say they cannot stand for three hours straight would adversely affect people who use wheelchairs and who could perform the job if allowed to remain seated
  • Chatbots that automatically screen out resumes with significant gaps in employment history, without giving the applicant a chance to explain that a gap may have been due to a disability requiring extensive medical treatment

In addition to its news release, the EEOC released a technical assistance document outlining steps employers can take to minimize the risk of discrimination when using these types of AI tools, along with some additional tips for applicants and employees.

Among the recommendations:

  • Tell applicants before they apply about the type of technology being used in the application and screening process and how they will be evaluated
  • Provide enough information to applicants about what assessments or screening procedures will take place so that they may decide whether to seek a reasonable accommodation
  • Clearly advertise that applicants may request accommodations if they choose
  • Implement procedures for requesting reasonable accommodations and make sure that asking for one does not hurt an applicant’s chances of getting hired
  • Train staff members to recognize and process requests for reasonable accommodations promptly
  • If using a vendor’s services, ask the vendor if the software was developed and/or tested with individuals with disabilities in mind
  • Ask third-party vendors to forward all requests for reasonable accommodations promptly to be processed by the employer
  • Ensure that technologies that screen out applicants are only screening out applicants who would be unable to perform the job even with reasonable accommodations
  • Avoid using AI tools that make decisions based on abilities and qualifications that are not necessary to perform the job at issue

This new strategic initiative paves the way for large-scale litigation by the EEOC and DOJ that could have major impacts on employers who use AI to screen applicants and employees. In particular, the EEOC may pursue AI-based discrimination claims against employers under a disparate impact theory. In this type of claim, the EEOC would not need to show that an employer intended to discriminate against individuals with disabilities. Instead, the EEOC could argue that the employer’s AI tests or criteria have the effect of screening out individuals with disabilities. To defend against such a claim, an employer would have to show that any AI tests or criteria are job-related and consistent with business necessity.

Employers should consider auditing their Human Resources processes to get ahead of potential claims by the EEOC. Employers also cannot simply rely on their vendors to either shoulder the risks or bring their technology into compliance: The ultimate burden rests with the employer to comply with the ADA.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Fox Rothschild LLP | Attorney Advertising