Judge Denies Motion to Dismiss AI Defamation Suit

Ballard Spahr LLP

Summary

Gwinnett County, Georgia, Superior Court Judge Tracie Cason denied OpenAI’s motion to dismiss a defamation lawsuit. Companies should keep in mind that AI “hallucinations” may create liability for those deploying generative AI.

In June 2023, radio host Mark Walters filed a defamation lawsuit against OpenAI in Gwinnett County Superior Court in Georgia, alleging that ChatGPT produced a response suggesting Mr. Walters was embezzling from the Second Amendment Foundation (SAF), a gun rights advocacy group. In the complaint, Mr. Walters alleged that a ChatGPT user submitted a prompt as part of research into ongoing litigation involving the SAF, and that in response ChatGPT hallucinated defamatory material falsely implicating Mr. Walters in an embezzlement scheme. Specifically, the user asked ChatGPT to summarize the SAF complaint. In response, ChatGPT produced factually inaccurate text asserting that the SAF complaint is “a legal complaint filed…against Mark Walters, who is accused of defrauding and embezzling funds from the SAF…The complaint alleges that Walters….misappropriated funds for personal expenses and without authorization or reimbursement, manipulated financial records and bank statements to conceal his activities, and failed to provide accurate and timely financial reports and disclosures to the SAF’s leadership.” When asked, ChatGPT provided the full text of the entirely fabricated legal complaint.

In fact, Mr. Walters is not a party to the SAF litigation, has never held any position at the SAF, and is not accused of defrauding or embezzling funds from the SAF. The fabricated complaint and its summary are examples of a “hallucination” — an instance in which a generative AI program fabricates facts. Hallucinations occur because ChatGPT does not function like a search engine; rather, it uses natural language processing and other techniques to predict the text a user would like to see based on the prompt.

Mr. Walters alleges that by generating the fabricated complaint, OpenAI published libelous matter. In January of this year, a Gwinnett County Superior Court judge denied OpenAI’s motion to dismiss the defamation lawsuit, allowing the case to proceed. In its motion to dismiss, OpenAI argued that it was not liable for defamation because the user who prompted ChatGPT knew the hallucinations were false. OpenAI also asserted there should be no liability because the ChatGPT terms of use alert users that ChatGPT “is not fully reliable (it ‘hallucinates’ facts and makes reasoning errors)…[and] care should be taken when using language model outputs, particularly in high-stakes contexts.”

Mr. Walters’ lawsuit is likely one of many cases that will test where liability falls when generative AI creates false and potentially damaging information. Many internet-based companies that publish content seek protection from defamation and libel suits under Section 230 of the Communications Decency Act. That 1996 federal law protects internet platforms from liability based on content created by their users. It has yet to be determined, however, whether that protection extends to generative AI.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Ballard Spahr LLP | Attorney Advertising
