In her first major remarks as Acting Chairwoman of the Federal Trade Commission (FTC), Rebecca Kelly Slaughter outlined her enforcement priorities under the new administration in a conversation with the Future of Privacy Forum, including oversight and enforcement strategies directly impacting big data and artificial intelligence.
Algorithmic Disgorgement Is Here to Stay
Slaughter underscored last month’s digital photo album settlement—which required the company to destroy all facial recognition models and algorithms built upon improperly collected data—as an “innovative” model for future enforcement actions: “[W]here companies collect and use consumers’ data in unlawful ways: we should require violators to disgorge not only the ill-gotten data, but also the benefits—here, the algorithms—generated from that data.” Slaughter’s comments suggest algorithmic disgorgement is here to stay in the FTC’s arsenal of enforcement mechanisms.
Direct Notice of Wrongdoing Plays a Major Role
Companies facing future FTC orders can expect Slaughter to push FTC staff to include notice provisions as a matter of course. In her remarks, Slaughter explained why direct consumer notice of wrongdoing is likely to have greater impact than enforcement orders consumers never learn about: “Notice lets consumers ‘vote with their feet’ and helps them better decide whether to recommend the services to others,” but it also “accords consumers the dignity of knowing what happened.”
Big Data May Be Anticompetitive
In recent years, major players in big data digital markets have found themselves in the crosshairs of European competition authorities. Slaughter’s remarks suggest the FTC may take a more aggressive stance on antitrust analysis of big data players by “think[ing] carefully about the overlap between [the FTC’s] work in data privacy and in competition.” She notes that “[m]any of the largest players in digital markets are as powerful as they are because of the breadth of their access to and control over consumer data.”
Algorithmic Bias and Discrimination Must Be Stopped
Emphasizing the importance of racial equity, Slaughter also outlined biased and discriminatory algorithms and facial recognition technologies as two key priority areas for future privacy enforcement at the FTC. Moving forward, Slaughter noted that she has “asked staff to actively investigate biased and discriminatory algorithms,” that she is “interested in further exploring the best ways to address AI-generated consumer harms,” and that the agency will “redouble [its] efforts to identify law violations” in the area of facial recognition technologies.
Stricter Enforcement Posture Forthcoming
Slaughter highlighted the need to think creatively about how to make the FTC’s current enforcement efforts even more effective. She pointed to her previous dissents in FTC cases as examples of where she felt the FTC should have obtained stronger relief for consumers, including by pursuing litigation when the FTC is unable to negotiate sufficient relief in settlement. She further committed to ensuring the FTC identifies and pleads all applicable violations. Companies may therefore find the FTC less willing to offer flexibility in settlement negotiations under the new administration—an unwelcome development for those in the agency’s sights, particularly with enforcement mechanisms like algorithmic disgorgement and direct consumer notice on the table.
In light of these priorities, it is critical that companies managing big data or developing and deploying artificial intelligence (AI) and machine learning (ML) technologies consider implementing policies, procedures, and training designed to ensure:
- Data is properly sourced and collected in compliance with applicable notice, consent and opt-out obligations;
- Consumers are provided proper and effective notice relating to the automated processing of data;
- Requests to exercise applicable consumer rights can be processed with minimal impact on AI/ML technologies;
- AI/ML technologies perform in a fair and lawful manner, including across protected classes; and
- Impacts of large data acquisitions are clearly analyzed through both a competition and a privacy lens.