Who's Liable for Deepfakes? FTC Proposes To Target Developers of Generative AI Tools in Addition to Fraudsters

Davis Wright Tremaine LLP

Proposal expands impersonation fraud rule to cover individuals and potentially extends liability to tech companies deploying AI tools used to create deepfakes or voice cloning

Late last week, the FTC released a notice seeking comment on a proposed rule that could create potential liability for generative AI developers. Specifically, the agency is requesting comments on a new rule that would prohibit, as an unfair or deceptive practice under Section 5 of the FTC Act, the fraudulent impersonation of individuals and would extend potential liability to companies providing the "means and instrumentalities" of such fraudulent activity. The proposal would extend to individuals protections similar to those in the agency's recently adopted rule prohibiting the impersonation of government and business entities and their officials or agents.

More significantly, the proposed rule would extend potential liability for impersonation fraud to companies providing goods or services (the "means and instrumentalities" for others to engage in impersonation fraud) when the provider knows or has "reason to know" that those goods or services will be used by bad actors to "materially and falsely pose as" an individual in connection with commerce. For example, a developer of generative AI could face FTC enforcement action under this rule if the developer knew or had reason to know that a third party would use the genAI system (or its output) in a fraudulent scheme to pose as an individual, or to materially misrepresent an affiliation with, or endorsement or sponsorship by, the impersonated individual. In its release, the FTC explained it is seeking comment on whether the rule should make it unlawful for a firm, "such as an AI platform that creates images, video, or text, to provide goods or services that they know or have reason to know is being used to harm consumers through impersonation."

Examples of potentially fraudulent practices under the proposed rule

  • Calling, messaging, or otherwise contacting a person or entity while posing as an individual (or affiliate), including by identifying an individual by name or by implication;
  • Sending physical mail through any carrier using the addresses, identifying information, or insignia or likeness of an individual;
  • Creating a website or other electronic service or social media account impersonating the name, identifying information, or insignia or likeness of an individual;
  • Creating or spoofing an email address using the name of an individual;
  • Placing advertisements, including dating profiles or personal advertisements, that pose as an individual or affiliate of an individual; and
  • Using an individual's identifying information, including likeness or insignia, on a letterhead, website, email, or other physical or digital place.

"Strengthening the FTC's toolkit"

This proposal is driven, in part, by the FTC's concern that deepfakes, voice cloning, and other AI-generated content are facilitating an increase in fraudulent activity. As the FTC noted in its release accompanying the proposed rule, fraudsters are increasingly using AI tools "to impersonate individuals with eerie precision and at a much wider scale. With voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever. Our proposed expansions to the final impersonation rule would do just that, strengthening the FTC's toolkit to address AI-enabled scams impersonating individuals."

Looking Ahead

The comment deadline will be set once the proposed rulemaking is published in the Federal Register. It will take some time for the agency to receive comments, develop a record, and issue a final rule.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Davis Wright Tremaine LLP | Attorney Advertising
