Dodging Disparate Impact Claims

Fisher Phillips

There has been significant buzz lately regarding the risk of discrimination in the sharing economy. Not only has the Equal Employment Opportunity Commission (EEOC) announced its intent to prioritize protections in the on-demand economy in its recently published Strategic Enforcement Plan, but sharing economy businesses have also faced scrutiny over differences in response times to customers of different races.

As the industry grows, academics and the government alike are likely to keep examining how sharing businesses operate. For example, a recent study out of Northeastern University suggests that sharing economy workers’ rating scores correlate with their gender and perceived race. The researchers did not find evidence of deliberate bias by the companies, but did observe bias in customer feedback.

“Disparate impact” discrimination arises when an employment practice that is neutral on its face – for example, requiring that potential employees have a college degree – results in a discriminatory impact on particular protected groups. For example, suppose that the job in question is low-skilled enough that a college degree is not rationally necessary to perform the work, and the otherwise-neutral screening requirement eliminates a high percentage of applicants from a certain race or national origin. To make a case for disparate impact discrimination, the employee must prove (1) the existence of a policy or practice; (2) that the policy or practice has a statistical adverse impact; and (3) that a particular protected class is negatively impacted. The employer then has the burden to prove that the neutral policy is job related for the position and consistent with business necessity. Because they attack a neutral policy potentially affecting many applicants or other individuals, disparate impact claims are often brought as class actions.
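To put a number on “statistical adverse impact,” enforcement agencies often start with the EEOC’s “four-fifths” rule of thumb: if one group’s selection rate is less than 80% of the most-favored group’s rate, that disparity is generally treated as evidence of adverse impact. The sketch below (in Python, with invented applicant counts and group labels) shows what that arithmetic looks like; it is an illustration of the rule of thumb, not a substitute for a proper statistical analysis.

    # Hypothetical screening numbers -- the counts and group labels are
    # invented purely to illustrate the EEOC's four-fifths rule of thumb.
    applicants = {"Group A": 400, "Group B": 250}
    selected = {"Group A": 200, "Group B": 75}

    # Selection rate for each group.
    rates = {g: selected[g] / applicants[g] for g in applicants}

    # Compare each group's rate to the most-favored group's rate;
    # a ratio below 0.80 is generally treated as evidence of adverse impact.
    best_rate = max(rates.values())
    for group, rate in rates.items():
        ratio = rate / best_rate
        flag = "possible adverse impact" if ratio < 0.80 else "no red flag"
        print(f"{group}: selection rate {rate:.2f}, ratio {ratio:.2f} -> {flag}")

In this made-up example, Group B’s selection rate (30%) is only 60% of Group A’s (50%), well below the four-fifths threshold, so the screening criterion would warrant a closer look.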

As the EEOC explores how to protect against discrimination in the sharing economy, it is entirely possible that it will use disparate impact class litigation to do so. To guard against a disparate impact claim, sharing platforms should evaluate the impact of their algorithms on workers and consider the following:

  • Are there any statistically significant trends impacting different protected categories (e.g., specific races, national origins, or sexes)? (A sketch of this kind of first-pass check appears after this list.)
  • Which factors impact those results?
  • How do those metrics relate to the business?
  • Are there other business-related metrics that can be measured that do not have an adverse impact?
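On the first question, a platform’s first pass might be a simple significance test on rating or selection data across groups. The Python sketch below uses invented rating samples and a two-sample t-test from SciPy purely to show the shape of the analysis; any real adverse-impact review should be designed with counsel and a qualified statistician.

    # Hypothetical first-pass check: do average ratings differ between two
    # groups of workers more than chance would explain? All data invented.
    from scipy import stats

    group_a_ratings = [4.8, 4.6, 4.9, 4.7, 4.5, 4.8, 4.9, 4.6]
    group_b_ratings = [4.4, 4.2, 4.6, 4.3, 4.1, 4.5, 4.4, 4.2]

    # Two-sample t-test on the mean rating of each group.
    t_stat, p_value = stats.ttest_ind(group_a_ratings, group_b_ratings)

    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Statistically significant gap -- dig into which factors drive it.")
    else:
        print("No significant gap detected in this sample.")

If a gap turns up, the remaining questions on the list come into play: which inputs drive it, whether those inputs are genuinely tied to the business, and whether alternative business-related metrics would avoid the disparity.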

Just to make the task extra-challenging, sharing platforms should be careful not to simply implement quick fixes when correcting for disparate impact. In a highly publicized U.S. Supreme Court case, the Court found that an employer acted in a discriminatory manner when it threw out promotion test results that disproportionately benefited Caucasian employees. Ricci v. DeStefano, 557 U.S. 557 (2009). So, instead of artificially boosting the ratings of a particular underperforming class (which would be direct discrimination), examine what factors you are or are not taking into consideration that are leading to the disparate impact. Once you are confident you are using only business- and task-related criteria, that is when you should aim to eliminate any disparate impact.
