Federal Judges Revise Court Rules to Require Certification Regarding the Use of A.I.

Seyfarth Shaw LLP

[co-author: Danny Riley]

Seyfarth Synopsis: Federal judges are requiring attorneys to certify whether they have used generative artificial intelligence (AI) in court filings and, if so, how it was used. These court orders come just days after two New York attorneys filed a brief containing citations to non-existent caselaw supplied by ChatGPT.[i]

There are many ways to leverage AI tools across the legal industry, including identifying issues in clients’ data management practices, efficiently reviewing immense quantities of electronically stored information, and guiding case strategy, but according to U.S. District Judge Brantley Starr of the Northern District of Texas, “legal briefing is not one of them.” Last Tuesday, May 30, Judge Starr became the first judge to require all attorneys appearing before his court to certify whether they used generative AI to prepare filings and, if so, to confirm that a human validated any such AI-generated language for accuracy.[ii]

Judge Starr reasoned that:

These platforms in their current states are prone to hallucinations and bias. On hallucinations, they make stuff up—even quotes and citations. Another issue is reliability or bias. While attorneys swear an oath to set aside their personal prejudices, biases, and beliefs to faithfully uphold the law and represent their clients, generative artificial intelligence is the product of programming devised by humans who did not have to swear such an oath. As such, these systems hold no allegiance to any client, the rule of law, or the laws and Constitution of the United States (or, as addressed above, the truth). Unbound by any sense of duty, honor, or justice, such programs act according to computer code rather than conviction, based on programming rather than principle. Any party believing a platform has the requisite accuracy and reliability for legal briefing may move for leave and explain why.[iii]

Critically, failure to submit such a certification will result in the court striking the filing and may expose the filer to sanctions under Rule 11.

Accordingly, the Court will strike any filing from a party who fails to file a certificate on the docket attesting that they have read the Court’s judge-specific requirements and understand that they will be held responsible under Rule 11 for the contents of any filing that they sign and submit to the Court, regardless of whether generative artificial intelligence drafted any portion of that filing. A template Certificate Regarding Judge-Specific Requirements is provided here.[iv]

Shortly thereafter, on June 2, Magistrate Judge Gabriel Fuentes of the Northern District of Illinois followed suit with a revised standing order that requires all parties to disclose not only whether they used generative AI to draft filings, but also whether they used it to conduct legal research. Judge Fuentes deemed overreliance on AI tools a threat to the mission of the federal courts, stating that “[p]arties should not assume that mere reliance on an AI tool will be presumed to constitute reasonable inquiry.”[v]

Mirroring Judge Starr’s reasoning, Judge Fuentes further highlights courts’ longstanding presumption that Rule 11 certifications are representations “by filers, as living, breathing, thinking human beings that they themselves have read and analyzed all cited authorities to ensure that such authorities actually exist and that the filings comply with Rule 11(b)(2).”[vi] Both judges have made clear that attorneys must remain diligent in representing their clients, and that reliance on emerging technology, however convincing and tempting, requires validation and human involvement.

While federal courts in Texas and Illinois were first out of the gate, we don’t expect other jurisdictions to be far behind with court orders mirroring those of Judge Starr and Judge Fuentes.


[i] See Mata v. Avianca, Inc., No. 22-cv-1461 (PKC), Order to Show Cause (S.D.N.Y. May 4, 2023); see also Use of ChatGPT in Federal Litigation Holds Lessons for Lawyers and Non-Lawyers Everywhere, https://www.seyfarth.com/news-insights/use-of-chatgpt-in-federal-litigation-holds-lessons-for-lawyers-and-non-lawyers-everywhere.html.

[ii] Hon. Brantley Starr, “Mandatory Certification Regarding Generative Artificial Intelligence [Standing Order]” (N.D. Tex.).

[iii] Id.

[iv] Id.

[v] Hon. Gabriel A. Fuentes, “Standing Order for Civil Cases Before Magistrate Judge Fuentes” (N.D. Ill.).

[vi] Id.



DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Seyfarth Shaw LLP | Attorney Advertising
