Dechert Cyber Bits - Key Developments in Privacy & Cybersecurity - Issue 47

Dechert LLP

Articles in this issue

  • FTC Settles with Rite Aid on its Use of AI and Processing of Biometric Information
  • FTC Settlement with Data Broker Addresses Sensitive Location Information
  • The FTC Proposes Changes to COPPA’s Protections for Children
  • AI Cannot Be the Inventor of a UK Patent
  • Dechert TidBits

FTC Settles with Rite Aid on its Use of AI and Processing of Biometric Information

The Federal Trade Commission (“FTC”), on December 19, 2023, announced that it had reached a settlement with Rite Aid Corporation (“Rite Aid”) to resolve allegations that, among other things, Rite Aid had violated Section 5 of the FTC Act by failing to take reasonable measures to prevent harm to consumers through its use of artificial intelligence-based facial recognition technology (“FRT”). According to the Complaint for Permanent Injunction and Other Relief (“Complaint”), from 2012 to 2020, Rite Aid implemented FRT across hundreds of its stores to identify customers suspected of having previously engaged in shoplifting or other criminal behavior at its stores. The FRT would, without providing notice to visitors or obtaining their consent, scan store visitors’ biometric information as they entered the retail locations to assess whether it matched “persons of interest” identified in the FRT’s “watchlist database.” If a match occurred, Rite Aid’s employees would be notified and were required to take action, which included confronting the individual, calling the police, or removing the customer from the store. The FTC alleged that the FRT generated thousands of false-positive matches, especially for Black, Asian, Latino, and women patrons, causing them substantial injury and distress. The safeguards the FTC alleged Rite Aid failed to implement to protect consumers included: (i) assessing, considering, or taking reasonable steps to mitigate risks related to misidentification, particularly the heightened risk to consumers due to their race or gender; (ii) testing the accuracy of its facial recognition system before deploying it; (iii) training employees; and (iv) preventing the use of low-quality images.

The proposed Stipulated Order for Permanent Injunction and Other Relief (“Proposed Order”), if entered, would prohibit Rite Aid from using a facial recognition or analysis system for surveillance purposes for five years, and would require, among other things, that Rite Aid delete stored biometric information that was gathered through its FRT and any algorithms derived from it. Rite Aid would also be required to ensure that any third parties with which it shared such data do the same. Rite Aid would also need to meet prescriptive requirements to be able to lawfully implement a biometric information system or surveillance system in the future, which would include risk assessments, specific notice and consent requirements, and being able to demonstrate that any program it deploys is backed by scientific evidence. Rite Aid did not admit any wrongdoing in connection with the FTC’s allegations, noting that the “allegations relate[d] to a facial recognition technology pilot program the company deployed in a limited number of stores,” which Rite Aid asserts it stopped using three years prior to the FTC’s investigation. Its press release is here.

Takeaway: After the FTC’s May 2023 warning about the misuse of biometric information and multiple blog posts on AI, it was only a matter of time before the FTC brought an enforcement action combining two of its key areas of focus. As in the 2021 Everalbum case, the FTC has again shown that one of its preferred remedies in cases involving AI and biometrics is algorithmic disgorgement. The nature of the penalties in the Rite Aid action, and Commissioner Bedoya’s description of the Proposed Order as “a baseline for what a comprehensive algorithmic fairness program should look like,” mean that, to stay out of the soup, companies will want to look to the Order requirements as a guide for how to proceed when deploying biometric surveillance programs and training AI.

 

FTC Settlement with Data Broker Addresses Sensitive Location Information

On January 9, 2024, the Federal Trade Commission (“FTC”) announced that it had reached a settlement with location data brokers X-Mode Social, Inc. and its successor Outlogic, LLC (together, “X-Mode”) to resolve allegations that X-Mode violated the prohibition in Section 5 of the FTC Act against unfair and deceptive acts or practices in connection with its collection and sale of sensitive location data. According to the FTC’s complaint (“Complaint”), X-Mode sold raw location data associated with mobile advertising IDs (“MAIDs”), which its customers could use to match a consumer with the locations they visited, including sensitive locations. In addition, among other things, the Complaint alleges that X-Mode: (i) did not honor Android users’ opt-out requests; (ii) failed to notify users of the purposes for which their location data would be used; (iii) failed to verify that third-party apps incorporating its software development kit (sometimes called an “SDK,” which is a set of platform-specific building tools for developers) had obtained informed consent from consumers to have their location data collected, used, and sold; (iv) improperly categorized consumers based on sensitive characteristics for marketing purposes; and (v) did not implement “reasonable or appropriate safeguards” against downstream use of precise location information. X-Mode has not admitted any wrongdoing in connection with the settlement of these allegations.

Under the FTC’s proposed decision and order (“Proposed Order”), X-Mode would be prohibited from using, selling, or disclosing sensitive location data, subject to certain exceptions. X-Mode would also need to comply with other requirements, which would include: (i) creating a program to ensure it develops and maintains a comprehensive list of sensitive locations and ensure it is not sharing, selling, or transferring location data about such locations; (ii) deleting or destroying all location data it previously collected and any products produced from such data, unless it obtains consumer consent or ensures the data has been deidentified or rendered non-sensitive; (iii) developing a supplier assessment program to ensure that companies that provide location data to X-Mode obtain informed consent from consumers; (iv) implementing procedures to ensure that recipients of its location data do not associate the data with locations that provide services to LGBTQ+ people or with locations of public gatherings of individuals at political or social demonstrations or protests, or use location data to determine the identity or location of a specific individual; and (v) providing a simple way for consumers to withdraw their consent for the collection and use of their location data and to request the deletion of any location data that was previously collected.

Takeaway: This enforcement action demonstrates the FTC’s priority of policing the sale and sharing of sensitive personal information and, in particular, sensitive location data, and underscores the FTC’s concern about consumers being marketed to based on sensitive characteristics. Companies that collect precise geolocation data should review their practices to determine whether they are obtaining affirmative express consent from consumers to the collection and, if relevant, the onward use of their information. Even if a company receives the data “downstream,” it may not be able to rely solely on the contractual representations of the data supplier. Companies should review their processes now to determine whether they are taking reasonable steps to verify that the consumers in question provided informed consent and how the data supplier collects appropriate opt-in consents from users.

 

The FTC Proposes Changes to COPPA’s Protections for Children

In a notice of proposed rulemaking, the Federal Trade Commission (“FTC”) is seeking comments on its proposed changes to the Children’s Online Privacy Protection Act (“COPPA”) Rule, which are aimed at addressing the evolving ways children’s personal information is being collected, used, and disclosed. The FTC has proposed several changes to the COPPA Rule, including, among other things:

  • Separate Opt-In for Targeted Advertising: Requiring COPPA-subject websites and operators to obtain separate verifiable parental consent to disclose information to third parties, including third-party advertisers;
  • Data Security Requirements: Mandating that operators establish, implement, and maintain a written information security program that contains safeguards that are appropriate to the sensitivity of children’s personal information;
  • Data Retention Limits: Limiting the retention of children’s personal information to only as long as is necessary to fulfill the specific purpose for which it was collected, and not for any secondary purpose, and requiring operators to establish, and make public, a written data retention policy for personal information collected from children; and
  • Conditional Participation: Reinforcing the prohibition on conditioning participation in an activity (e.g., game, prize offering, etc.) on the collection of personal information, making clear that there is an outright ban on collecting more personal information than is reasonably necessary for a child to participate in an activity.

The notice of proposed rulemaking includes a list of targeted questions for consideration by the public. The FTC has asked for comments to be submitted by March 11, 2024.

Takeaway: While the proposed changes to the COPPA Rule are designed to address rapid changes in technology since the last overhaul in 2013, the FTC’s proposals also represent the latest in a series of developments demonstrating a rising legislative and regulatory interest in children’s privacy. Businesses subject to COPPA will want to consider how they would comply with the proposed new requirements, if instituted. Where there are proposed compliance areas that are not practical or technologically feasible, consider submitting comments to the FTC on those issues.

 

AI Cannot Be the Inventor of a UK Patent

The UK Supreme Court held that a machine cannot be an ‘inventor’ under UK patent legislation, confirming that “[i]f patents are to be granted in respect of inventions made by machines, the 1977 [UK Patents] Act will have to be amended.”

The patent applicant, Dr. Stephen Thaler, designated his AI system (known as ‘DABUS’) as the inventor of patents filed in numerous jurisdictions as part of his ‘Artificial Inventor Project’ (see our previous OnPoint). As a result, patent offices and courts around the world have been grappling with the question of whether an AI system can be the inventor of a patent.

This issue has now been determined by the UK’s most senior court. It confirmed that under the UK’s patents regime: (a) inventors must be human, (b) only legal or natural persons can obtain a patent, and (c) the ‘doctrine of accession’, under which a new item produced by an existing item (such as a calf produced by a farmer’s cow) is owned by the owner of the existing item, does not apply where the new creation is an invention. By designating DABUS as the inventor, Dr. Thaler failed to comply with the inventor identification requirement and the UK Intellectual Property Office was correct to refuse his patent applications.

Takeaway: Importantly, Dr. Thaler maintained that DABUS devised the claimed invention autonomously. Had he claimed that he was himself the inventor and devised the invention using DABUS as a tool, the outcome of his patent applications may well have been different. As AI distances the inventive process from the humans with responsibility for the research, contracts for the use of AI for R&D should pre-empt uncertainty and include more detailed IP ownership provisions than might be expected in a traditional software licence or SaaS contract.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Dechert LLP | Attorney Advertising
