Is Artificial Intelligence Sexist and Racist?

Foley & Lardner LLP

Last year, Amazon scrapped an experimental machine-learning recruiting tool after discovering it had a major problem: the artificial intelligence did not like women. The tool was designed to analyze resumes and compare potential applicants to Amazon’s current workforce, taking in 100 resumes and surfacing the top five candidates.

The problem was a pre-existing gender gap in software development and other technical roles. When the artificial intelligence tool analyzed the patterns in Amazon’s hiring practices over the prior 10-year period, it taught itself to favor men over women. Amazon ultimately abandoned the tool.

Amazon’s experience highlights an important limitation of machine-learning tools: they are only as good as the information they are given. While artificial intelligence can screen potential job candidates more quickly and efficiently than human reviewers, such algorithms can inadvertently reinforce discrimination in hiring practices. In Amazon’s case, applicants for technical jobs were more likely to be male than female. The algorithm mistakenly interpreted this gender gap as a hiring preference on Amazon’s part. Thus, instead of surfacing qualified women, the algorithm screened them out.
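To make that failure mode concrete, here is a minimal, hypothetical sketch (in Python with scikit-learn, on entirely fabricated data; it is not Amazon’s actual system) of how a model trained on historically biased hiring decisions can learn to penalize a resume feature that merely correlates with gender:

```python
# Illustrative sketch only: a toy model trained on historically biased
# hiring labels learns to penalize a gender-correlated resume feature.
# Assumes numpy and scikit-learn are installed; all data is fabricated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Job-relevant feature: years of experience.
experience = rng.normal(5, 2, n)

# Job-irrelevant feature correlated with gender, e.g. a resume keyword
# such as a women's club membership (1 = present, 0 = absent).
is_woman = rng.random(n) < 0.5
proxy_keyword = (is_woman & (rng.random(n) < 0.7)).astype(float)

# Historical "hired" labels reflect past bias: equally qualified women
# were hired less often than men.
skill_score = experience + rng.normal(0, 1, n)
hired = (skill_score - 2.0 * is_woman) > 4.5  # biased historical outcome

model = LogisticRegression().fit(
    np.column_stack([experience, proxy_keyword]), hired)

print("experience coefficient:", model.coef_[0][0])     # positive, as expected
print("proxy-keyword coefficient:", model.coef_[0][1])  # negative: learned bias
```

Even though the keyword says nothing about job performance, the model assigns it a negative weight, because the historical decisions it learned from were themselves biased.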

Employers these days have a panoply of tech-based tools at their disposal. Websites like Monster.com and Indeed.com advertise job openings and generate large numbers of applicants, and employers are turning to tech-based tools to reduce the time and cost of hiring. Such tech-based tools, however, are designed to mimic human decision-making. When a tool relies on data that is inaccurate or biased, it can inadvertently discriminate against women or minorities. Studies have also found that tech-based tools can discriminate in more subtle ways. For example, an employer attempting to maximize employee tenure found that applicants who lived closer to the worksite tended to stay longer. Screening applicants based on commute distance, however, disproportionately screened out certain minority candidates, because where people live often correlates with race.
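The commute-distance example is a case of proxy discrimination: a facially neutral criterion that correlates with a protected class. The short, hypothetical Python sketch below (with invented numbers) shows how simply comparing pass rates across groups can surface such a proxy effect:

```python
# Illustrative sketch of the commute-distance proxy effect described above,
# using fabricated data. In this toy example, one group tends to live
# farther from the worksite, so a facially neutral distance cutoff
# disproportionately screens that group out.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

minority = rng.random(n) < 0.3
# Assumed residential pattern: minority applicants live farther away on average.
commute_miles = np.where(minority,
                         rng.normal(18, 5, n),
                         rng.normal(10, 5, n))

passes_screen = commute_miles < 15  # "neutral" screening rule

for name, group in [("minority", minority), ("non-minority", ~minority)]:
    print(f"{name} pass rate: {passes_screen[group].mean():.0%}")
# A large gap between the two pass rates signals a proxy effect, even
# though race never appears in the screening rule itself.
```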

Under Title VII of the Civil Rights Act of 1964 and analogous state and local laws, the employer is responsible for ensuring that it screens job applicants in a nondiscriminatory manner. Therefore, if you are using or considering a tech-based tool to help you screen job applicants, you should take steps to ensure that the tool is not disproportionately screening out candidates based on gender, race, or other protected characteristics. Simply telling a tech-based tool not to discriminate against minorities or women may be insufficient, because the tool will attempt to identify candidates who reflect your existing hiring practices, and it can often infer protected characteristics from correlated proxies such as names, schools, or zip codes. Some helpful tips to consider when using tech-based hiring tools are:

  1. Do not rely exclusively on tech-based hiring tools. Most tools will rank candidates. Employers should review lower-ranked candidates and make independent assessments based on non-discriminatory criteria.
  2. Consistently review and update the data provided to your hiring tool, and make sure that data does not reflect discriminatory hiring practices.
  3. Independently audit the results and rankings generated by the hiring tool and make adjustments as necessary (a minimal example of such an audit is sketched after this list).
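As one concrete way to perform the audit described in tip 3, the following hypothetical Python sketch applies the “four-fifths rule” from the EEOC’s Uniform Guidelines on Employee Selection Procedures, under which a selection rate for any group that is less than 80% of the rate for the highest-selected group is generally regarded as evidence of adverse impact. The group names and counts below are invented for illustration:

```python
# Minimal sketch of an adverse-impact audit using the "four-fifths rule"
# from the EEOC's Uniform Guidelines on Employee Selection Procedures.
# All group labels and counts here are hypothetical.

def four_fifths_check(groups: dict[str, tuple[int, int]]) -> None:
    """groups maps group name -> (number selected, number of applicants)."""
    rates = {name: sel / total for name, (sel, total) in groups.items()}
    highest = max(rates.values())
    for name, rate in rates.items():
        ratio = rate / highest
        flag = "ADVERSE IMPACT?" if ratio < 0.8 else "ok"
        print(f"{name}: selection rate {rate:.0%}, "
              f"impact ratio {ratio:.2f} ({flag})")

# Hypothetical output of a resume-screening tool:
four_fifths_check({
    "men":   (48, 300),   # 48 of 300 male applicants advanced
    "women": (30, 300),   # 30 of 300 female applicants advanced
})
```

In this invented example, the women’s selection rate is 62.5% of the men’s rate, which would warrant a closer look at the screening criteria.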

Over time, these tech-based hiring tools will likely improve and, hopefully, screen applicants free of discriminatory bias. Until the technology matures, however, employers should take steps to make sure that members of protected classes are not disproportionately screened out by tech-based hiring algorithms.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Foley & Lardner LLP

