Lawyers and ChatGPT — Averting a Possible Disaster

Gray Reed
[co-author: Emily Morris]*

Last month Tilting blogged about Peter LoDuca, Steven A. Schwartz and their New York law firm, whom U.S. District Judge P. Kevin Castel chastised for submitting non-existent judicial opinions with fake quotes and citations created by the artificial intelligence tool ChatGPT. Worse yet, they continued to stand by the fake opinions after Judge Castel challenged their brief. Though LoDuca and Schwartz claimed they were “unaware of the possibility that [ChatGPT’s] content could be false,” Judge Castel ordered them to appear earlier this month to evaluate whether sanctions were warranted.

Update from the Hearing

During the two-hour June 8th hearing, Judge Castel pressed Schwartz to explain why his brief was filled with fake judicial opinions and legal citations. Schwartz remorsefully stated that he “did not comprehend that ChatGPT could fabricate cases” and that he wished he had researched the information more thoroughly; he assumed the “new site . . . was, like, a super search engine” capable of pulling research that he was somehow unable to access. Judge Castel was not impressed. When asked, “[c]an we agree that is legal gibberish?” Schwartz sheepishly agreed and acknowledged he had done nothing to confirm that the cited cases actually existed.

Though Mr. Schwartz’s attorney argued that no intentional misconduct occurred, one wonders why this instance of attorney negligence should warrant a lesser punishment than other failures of professional legal conduct.

Although Judge Castel concluded that LoDuca and Schwartz acted with subjective bad faith, he credited their law firm for arranging mandatory ethical training on technological competence and artificial intelligence programs, and he weighed the significant negative publicity generated by their actions together with their sincerity in describing their embarrassment and remorse. The Court concluded that a penalty of $5,000, paid into the registry of the Court, was sufficient.

Déjà Vu in Colorado

LoDuca and Schwartz’s public misfortune may have rescued another attorney from a similar fate. Zachariah Crabill, a new Colorado Springs attorney, fell prey to the same trap, accidentally citing numerous cases fabricated by, you guessed it, ChatGPT. Luckily, Crabill realized his mistake on the day of the hearing, though the judge still threatened to file a complaint against him. Crabill apologized for the mishap in a May affidavit, stating that “[b]ased on the accuracy of prior validated responses, and the apparent accuracy of the case law citations, it never even dawned on [him] that this technology could be deceptive.”

Tilting the Scales in Your Favor

While AI in the workforce may have a bright future, current users beware! However the technology evolves, a lawyer owes a duty of candor and accountability to the tribunal, and legal professionals are duty bound to make no material misrepresentations of fact or law. An inability to locate an AI-generated case on Lexis, Westlaw, Fastcase, or even in a physical copy of the Federal Reporter should raise plenty of red flags, and the intentional choice to proceed despite that knowledge falls far short of a lawyer’s duty to the court.

*Rising 2-L at the University of Texas Law School and a Gray Reed summer associate


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Gray Reed | Attorney Advertising
