AI-Generated Robocalls Illegal Under the TCPA

Cooley LLP

In a unanimous, bipartisan decision, the Federal Communications Commission (FCC) has issued a declaratory ruling to confirm that artificial intelligence-generated voices are “artificial” under the Telephone Consumer Protection Act (TCPA). The FCC’s decision comes on the heels of an FCC cease-and-desist letter issued to a company that originated robocalls to New Hampshire presidential primary voters using an AI-generated voice that sounded like President Joe Biden, and just a week after FCC Chairwoman Jessica Rosenworcel issued a press release saying she had proposed to “make AI voice-generated robocalls illegal.”

FCC’s AI inquiry

The TCPA protects consumers from unwanted calls made “using an artificial or prerecorded voice to deliver a message without the prior express consent of the called party” unless an exemption applies. Recognizing that AI could be used both to protect against illegal calls and to make them, the FCC opened an inquiry in November 2023 on how AI technologies impact the TCPA regulatory regime. The FCC did not propose specific rules, but rather asked for general comment on the benefits and risks associated with AI technologies, including voice cloning, in the TCPA space.

Numerous parties filed comments in response to the FCC’s inquiry, including on the issue of using AI technologies to simulate a human voice. Commenters urged the FCC to confirm that AI technologies such as voice cloning fall within the TCPA’s existing prohibition on artificial or prerecorded voice messages.

New Hampshire robocalls

As was widely reported in the general press, a few days before the New Hampshire presidential primary, New Hampshire voters received recorded calls telling them not to cast their ballots. The calls used a voice crafted to sound like Biden. In addition to using an imitation of Biden’s voice, the calls also used a spoofed telephone number to falsely suggest the call was sent out by a former chair of the New Hampshire Democratic Party.

The FCC, in coordination with New Hampshire’s attorney general and telecom industry trade groups, traced the calls back to Lingo Telecom and identified the party that allegedly initiated the calls. Following its usual procedures, the FCC issued a cease-and-desist letter directing Lingo to stop carrying illegal traffic and alerted other telecom providers that they could lawfully block Lingo’s voice traffic if Lingo did not effectively mitigate its illegal traffic within 48 hours. Because the spoofed telephone number was sufficient to deem the calls illegal, the FCC did not have to specifically address whether the use of an artificial voice in a call made for political purposes also was illegal, or whether it was permissible under a TCPA exemption.

Declaratory ruling

The FCC typically cannot adopt new rules without going through a notice and comment rulemaking proceeding. The FCC can, however, issue a declaratory ruling to provide guidance and clarify its existing rules. Here, the FCC’s November 2023 AI inquiry was not a notice of proposed rulemaking, so it could not be the basis for the FCC to adopt new rules. Accordingly, while the FCC cited the November AI inquiry and the comments urging the FCC to find that AI-generated voices are “artificial” under the TCPA, the declaratory ruling stated that it was “clarifying” and “confirming” that current TCPA rules apply to AI technologies – it was not formulating new rules. While the declaratory ruling itself does not reference the New Hampshire Biden robocalls, FCC Commissioner Geoffrey Starks discussed the calls in his concurring statement.

As the FCC was merely clarifying and confirming existing rules, the declaratory ruling is effective immediately.

Next steps

Most parties agree with the FCC that the declaratory ruling did not change or expand the TCPA rules, as the use of AI voice cloning is just another means of simulating a human voice. Further, because the TCPA contains many exemptions, such as for calls made for noncommercial purposes, it is not clear that the declaratory ruling will provide a basis to stop bad actors from spoofing voices in the context of a political campaign. Parties should, however, take the FCC’s actions as an indication of its concern about the use of AI to create or send robocalls. Accordingly, any use of artificial voice technology to create telephone messages, or any calls made using a prerecorded human voice, should be reviewed by counsel.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Cooley LLP | Attorney Advertising
