New York City delays enforcement of its artificial intelligence bias audit in employment law as rule-making continues

Eversheds Sutherland (US) LLP

New York City (NYC) has delayed to April 15, 2023 the enforcement of its first-of-its-kind law on bias in artificial intelligence (AI) tools used in employment. Local Law 144 of 2021 prohibits employers in NYC from using AI (specifically, “automated employment decision tools,” or AEDTs) to screen candidates for hiring or promotion unless they first conduct an audit to determine whether the tool exhibits bias. The audit must be conducted by an independent auditor with no prior connection to the AEDT, the employer, or the vendor. Employers must notify candidates that they use an AEDT, disclose which qualifications the AEDT assesses and the types and sources of data the business collects for it, publish their data retention policy, and provide candidates with an opportunity to request an alternative selection process or accommodation, if available. The employer must also publish the results of the bias audit on its website.

On September 19, 2022, the New York City Department of Consumer and Worker Protection (DCWP) issued proposed rules aimed at clarifying and expanding on the law. On December 23, 2022, DCWP released Revised Proposed Rules in response to the high volume of comments it received on the initial proposal. The Revised Proposed Rules made some significant changes to the initial rule proposal. As of the date of this alert, the rules have not been finalized.

Under the Revised Proposed Rules, the AEDT audit must calculate the selection rate, based on gender and race/ethnicity, for each category in the Equal Employment Opportunity Commission Employer Information Report (EEO-1), including all possible intersections of gender, race, and ethnicity. The audit must then compare each category’s selection rate to that of the most selected category to determine an impact ratio. The impact ratio is essentially a score intended to show whether the AEDT selects individuals of one or more races/ethnicities and/or genders at significantly different rates, which could indicate bias based on a protected class. The calculations provided in the Revised Proposed Rules generally follow the EEOC’s widely accepted Uniform Guidelines on Employee Selection Procedures. However, unlike the EEO-1 reports, where employers can deduce gender, race, or ethnicity based on visual observation when an employee fails to provide the data, it is unclear how employers are expected to handle missing data for purposes of the AEDT audit. Further, it is unclear how employers should address small data sets that could lead to skewed statistical analyses.
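The selection-rate and impact-ratio arithmetic described above can be sketched in a few lines of code. The sketch below is purely illustrative: the category labels and counts are hypothetical examples, not real audit data, and the function names are ours rather than anything prescribed by the rules.

```python
# Illustrative sketch of the selection-rate and impact-ratio
# calculation described in the Revised Proposed Rules.
# Category labels and counts are hypothetical, not real audit data.

def selection_rate(selected, applicants):
    """Share of applicants in a category that the AEDT selected."""
    return selected / applicants

def impact_ratios(counts):
    """counts maps each EEO-1 category to (selected, total applicants).

    Each category's selection rate is divided by the highest
    selection rate across categories, yielding its impact ratio
    (1.0 for the most selected category, lower for the rest).
    """
    rates = {cat: selection_rate(s, n) for cat, (s, n) in counts.items()}
    top = max(rates.values())
    return {cat: rate / top for cat, rate in rates.items()}

# Hypothetical audit data: (candidates selected, total candidates)
counts = {
    "White, male": (40, 100),                        # rate 0.40 (highest)
    "Hispanic or Latino, female": (20, 100),         # rate 0.20
    "Black or African American, female": (15, 100),  # rate 0.15
}

for cat, ratio in impact_ratios(counts).items():
    print(f"{cat}: impact ratio {ratio:.2f}")
```

In this hypothetical data set, the most selected category has an impact ratio of 1.0 and the others fall well below it; under the EEOC’s Uniform Guidelines, ratios under 0.80 (the “four-fifths rule”) are commonly treated as evidence of adverse impact.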

An independent auditor must conduct the bias audit, but multiple organizations can use the same bias audit, so long as each employer provides historical data (as defined in the rules) to the independent auditor. The AEDT vendor can hire an independent auditor to review its AEDT, and it can provide the audit to organizations that wish to use the tool.

The law was originally intended to go into effect on January 1, 2023, but enforcement has been delayed until April 15, 2023 as rule-making around the law continues. The Revised Proposed Rules, for instance, clarify some key terms, such as expounding on the definition of AEDT, which is currently defined as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.” The Revised Proposed Rules explain that “to substantially assist or replace discretionary decision making” means

  1. to rely solely on a simplified output (score, tag, classification, ranking, etc.), with no other factors considered;
  2. to use a simplified output as one of a set of criteria where the simplified output is weighted more than any other criterion in the set; or
  3. to use a simplified output to overrule conclusions derived from other factors, including human decision-making.

Whether additional changes and clarifications will be made to the Revised Proposed Rules remains to be seen.

Enforcement

The NYC DCWP can enforce the law and issue fines between $500 and $1,500 per violation, per day. The law also provides a private right of action for employees and candidates.

Jurisdictional Reach

There are still open questions as to the jurisdictional reach of the law. Employers located within New York City must conduct bias audits of AEDTs and post the results of those audits to their websites. However, employers are only obligated to notify employees or candidates who reside in the city that the tool will be used in connection with their application. Employers must also inform NYC resident candidates as to the job qualifications and characteristics that the tool will be used to screen for, as well as the type and source of data they collect for use with the AEDT. Thus, it appears that the law would establish different obligations for employers depending on whether their potential candidates reside within or outside the city limits. It is also unclear whether the law applies to non-NYC employers who target NYC residents for hiring or promotion.

AI in Hiring

AI in hiring has progressed far past automated résumé keyword checks. While AI is widely used to sort and rank candidates based on the contents of their résumés, advanced AI interview software can assess intangible features like speech patterns, body language, and word choice. It can be used to judge candidates’ responses to emotional cues to determine whether they have the personality traits the employer views as valuable. Game-based AI tools test how a person plays a game and compare their behavior with how successful people at the company played the same game.

The idea is that the AI will learn which traits successful employees possess so it can identify similar traits in candidates for jobs and promotion. The concern is that it may identify the wrong traits, which can result in bias as to protected classes and perpetuate existing inequities. For instance, if AI used facial recognition to identify successful employees, and they all had blond hair, it could identify blond hair as a positive trait and filter on that basis.

Regulatory Trends

The Equal Employment Opportunity Commission launched an initiative in 2021 to examine how to prevent discrimination in AI, and held a hearing in January 2023 to examine how to prevent unlawful bias in the use of AEDTs. Further, the US Congress introduced the Algorithmic Accountability Act of 2022, which would direct the Federal Trade Commission (FTC) to create regulations requiring companies to conduct impact assessments for systems that make automated decisions. NYC’s AEDT law comes on the heels of a clear increase in state regulation surrounding the use of AI in hiring. Illinois has enacted a law requiring employers to obtain informed consent from applicants whose video recordings are analyzed by AI for the purposes of hiring. The law attempts to guard against bias by requiring employers to collect and submit: “(1) the race and ethnicity of applicants who are and are not afforded the opportunity for an in-person interview after the use of artificial intelligence analysis; and (2) the race and ethnicity of applicants who are hired.” Maryland enacted a similar law in 2020. And the list continues to grow. For example, the California Privacy Rights Act, which took effect on January 1, 2023, and is still going through the rulemaking process, is expected to address AI and notice requirements. And New Jersey has introduced a bill that is similar to the initial draft of NYC’s AEDT law.

Implications

The value of AI is that it can analyze massive datasets much faster than humans can, which allows it to consider many more factors, all at the same time, when making a decision. While AI tools for hiring could produce significant cost and time savings for employers, the technology can also be a target for litigation based on several factors, including the potential for discrimination on the basis of race or gender. On the other hand, some claim that AI may actually reduce discrimination by removing unconscious human biases.

Organizations have several options to limit their potential exposure when using AI tools in the hiring process: independently test and monitor the AI tools on an ongoing basis and document that there is no material bias; require an AI vendor to conduct similar independent bias testing and to provide indemnity if their testing is wrong; shift liability to an insurer; or, in jurisdictions that do not regulate bias, accept the potential legal and reputational risks in light of the potential benefits.

NYC employers should carefully assess whether they use tools that could meet the NYC law’s definition of an AEDT, as its scope may be broader than one would expect. Under the definition above, widely used software that filters out resumes based on keywords could qualify as an AEDT. 

Employers should stay attuned to this rapidly developing regulatory area, working with their trusted advisors to organize their data, notice, and documentation processes; to inventory and audit their use of AEDTs; and to incorporate counsel in the bias audit and governance processes to ensure attorney-client privilege protection where possible.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Eversheds Sutherland (US) LLP | Attorney Advertising
