NYC’s Task Force to Tackle Algorithmic Bias: A Study in Inertia

In December 2017, the New York City Council passed Local Law 49, the first law in the country designed to address bias and discrimination resulting from algorithms used by City agencies. Local Law 49 created an Automated Decision Systems Task Force (“Task Force”) to monitor algorithms used by municipal agencies and directed the Task Force to provide recommendations by the fall of 2019 on how to make the City’s algorithms fairer and more transparent. Despite the Task Force’s daunting charge of balancing the need for greater transparency around the City’s use of algorithms against companies’ right to protect their intellectual property, many were hopeful that Local Law 49 would encourage other cities and states to acknowledge the problem of algorithmic bias.

The legislation arose after then-Councilman James Vacca read a ProPublica investigation of a computer algorithm used to score a criminal defendant’s risk of recidivism. The investigation found that the most widely used risk assessment tool, COMPAS, erroneously identified black defendants as presenting a high risk of recidivism at almost twice the rate of white defendants (43 percent vs. 23 percent). ProPublica’s research also revealed that COMPAS erroneously labeled white defendants as low risk 48 percent of the time, compared to 28 percent for black defendants. Black defendants were also 45 percent more likely than white defendants to receive a higher risk score, even after controlling for variables such as prior crimes, age and gender.

The original bill proposed by Vacca, inspired by the ProPublica investigation, was far more ambitious than the final legislation: it would have required city agencies to publish the source code of certain algorithms that imposed penalties on individuals. Nevertheless, many in the industry saw Local Law 49 as a promising first step toward mitigating algorithmic bias and an opportunity to bring fairness and accountability to the use of automated decision systems.

Unfortunately, more than a year later, the Task Force has failed to make meaningful progress toward fulfilling its mission. First, although the law defines “automated decision system” (“ADS”) and the Task Force has held approximately 18 meetings, its members have been unable to reach a consensus on which technologies even meet that definition. Without agreement on which systems fall under its purview, the Task Force is unlikely to deliver specific recommendations by this fall.

In addition, Local Law 49 did not require City agencies to provide the Task Force with requested information; compliance with such requests is voluntary. In joint written testimony, Task Force members Julia Stoyanovich and Solon Barocas stated that, despite numerous requests, as of April 4, 2019 the City had not identified a single ADS or provided any information about the ADS it uses.

Task Force members have stressed that they will be unable to provide meaningful or credible recommendations on addressing algorithmic discrimination if the City fails to provide the requested information, and noted that they have repeatedly raised their need for examples of ADS used by the City. Members also reported that they had developed a promising methodology for obtaining relevant information about ADS from developers and operators, but the City inexplicably directed the Task Force to abandon it.

At the Task Force’s first public hearing on April 30, 2019, Task Force members declined to commit to providing recommendations specific to ADS currently used by the City and noted that the City had not been cooperative with the Task Force’s inquiries. The Task Force will hold its second public hearing on May 30, 2019.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Patrick Law Group, LLC | Attorney Advertising
