Is Everything Better in Moderation? Circuit Split on Content Moderation to Be Heard by SCOTUS

Vinson & Elkins LLP

On February 26, 2024, the United States Supreme Court is set to hear oral argument in two cases, Moody v. NetChoice1 and NetChoice v. Paxton.2 At their core, these cases raise the question whether the First Amendment prohibits state laws that restrict “social media platforms” from engaging in content moderation, that is, from making editorial choices about whether and how to publish speech on their platforms. In addition to resolving a split between the Fifth and Eleventh Circuits on the issue, the Supreme Court’s decision could affect content moderation and regulation beyond social media sites, including on generative artificial intelligence (“AI”) platforms.

The Decisions to Be Reviewed by the Supreme Court

The NetChoice cases arose from state laws in Florida and Texas that restrict social media platforms’ ability to moderate content, for example by removing or limiting certain posts or users. The Texas statute, House Bill 20, prohibits large social media platforms from censoring a user, a user’s expression, or a user’s ability to receive another person’s expression based on the speaker’s viewpoint, the viewpoint represented by the speaker’s expression, or the user’s geographic location in the state.3 The Texas statute still allows censorship of unprotected speech, such as speech directly inciting criminal activity, threats of violence, and other unlawful expression.4 The Florida statute, S.B. 7072, by contrast, prohibits censorship of only certain speakers (political candidates and journalistic enterprises) but broadly prohibits any censorship of those speakers through deplatforming, post-prioritization, or shadow banning,5 the only exception being “obscene” content posted by a journalistic enterprise.6

Both laws also impose disclosure obligations and operational requirements on the platforms related to their content moderation. Under the Texas statute, platforms must (1) disclose how they moderate and promote content, (2) publish an acceptable use policy and a biannual report of content moderation statistics, and (3) maintain a complaint-and-appeal system for their users.7 Under the Florida statute, platforms must (1) publish their standards for censoring, deplatforming, and shadow banning, (2) inform users of changes to their rules before implementation, and (3) provide users with detailed notice before censoring, deplatforming, or shadow banning them.8

Social media platforms sought to enjoin both laws. The Eleventh Circuit upheld much of the district court’s preliminary injunction of the Florida law, concluding that content moderation is a form of speech protected by the First Amendment.9 Applying First Amendment scrutiny, the Eleventh Circuit found that most of Florida’s law is not substantially likely to survive even intermediate scrutiny and, thus, is substantially likely to violate the First Amendment.10 By contrast, the Fifth Circuit upheld the Texas law and vacated the district court’s preliminary injunction, finding that content moderation is not protected speech under the First Amendment but rather is itself “censorship” that states may regulate.11

Both the Eleventh and the Fifth Circuits found, with one exception, that the disclosure and operational requirements imposed by the laws were not unconstitutional.12 The exception: the Eleventh Circuit held that the Florida statute’s requirement that platforms provide notice and a detailed justification for every content moderation action was unduly burdensome and likely to chill platforms’ protected speech.13

What Will the Supreme Court’s Decision Mean for You?

The Supreme Court’s resolution of this circuit split could have significant impacts on a wide range of Internet platforms beyond familiar “social media” sites like Facebook, X (formerly Twitter), and YouTube.

For instance, AI platforms use content moderation mechanisms to prevent harmful content from being generated on their sites and to mitigate liability resulting from harmful outputs. If the Supreme Court upholds the Florida and Texas laws, those mechanisms could be limited by these or similar state laws to the extent the laws apply to AI platforms. The determining factor is whether an AI platform, or any other Internet platform that engages in content moderation, qualifies as a “social media platform” under these state laws, or whether similar laws could be enacted that extend to other Internet platforms.
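To make concrete the kind of mechanism at issue, the sketch below (in Python) shows a minimal, hypothetical pre-publication moderation gate that screens generated output before it reaches a user. Everything in it — the category names, the blocked patterns, and the moderate function — is an illustrative assumption, not any platform’s actual system; production platforms generally rely on trained classifiers rather than keyword lists.

    # Hypothetical sketch of a pre-publication moderation gate. Category names
    # and patterns are illustrative assumptions, not any platform's real policy.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ModerationResult:
        allowed: bool
        reason: Optional[str] = None  # policy category that triggered a block, if any

    # Illustrative policy categories mapped to blocked phrases (assumptions).
    BLOCKED_PATTERNS = {
        "violent_threat": ["i will hurt you"],
        "disinformation": ["miracle cure guaranteed"],
    }

    def moderate(generated_text: str) -> ModerationResult:
        """Screen model output before it is returned to the user."""
        lowered = generated_text.lower()
        for category, patterns in BLOCKED_PATTERNS.items():
            if any(p in lowered for p in patterns):
                return ModerationResult(allowed=False, reason=category)
        return ModerationResult(allowed=True)

    # A flagged output is withheld rather than shown to the user.
    print(moderate("Try this miracle cure guaranteed to work overnight!"))
    # -> ModerationResult(allowed=False, reason='disinformation')

The legal significance of a gate like this lies in its policy table: under a statute like H.B. 20, removing unlawful expression (for example, true threats) would remain permissible, while a category a court deems viewpoint-based could arguably no longer be enforced against users in the covered state.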

Both the Texas and Florida laws apply to “social media platforms.” Under Texas’s law, the definition of “social media platform” includes any “Internet website or application that is open to the public, allows a user to create an account, and enables users to communicate with other users for the primary purpose of posting information, comments, messages, or images” and that “functionally” has more than 50 million active users in the U.S. each month.14 The definition excludes “an online service, application, or website . . . that consists primarily of . . . content that is not user generated but is preselected by the provider.”15 By contrast, Florida’s law broadly defines “social media platform” as “any information service, system, Internet search engine, or access software provider that . . . [p]rovides or enables computer access by multiple users to a computer server, including an Internet platform or a social media site,” and that meets certain revenue or participant thresholds.16 These definitions may be broad enough to sweep in generative AI platforms, such as OpenAI’s ChatGPT. Indeed, the Eleventh Circuit surmised that the Florida statute’s definition of what a “social media platform” does could sweep in even the crowdsourced reference tool Wikipedia and the handmade-craft marketplace Etsy.17

Even if AI platforms are not considered “social media platforms” under the current laws, a Supreme Court decision upholding these statutory restrictions on content moderation could open the door to similar laws constraining AI platforms’ ability to guard against disinformation or other harmful content generated in response to user prompts.

Companies with Internet platforms that engage in content moderation should consider whether they fall within the scope of the Texas or Florida laws or of similar state laws regulating content moderation. If the Supreme Court upholds the state laws, these platforms should work with counsel to develop adequate content moderation policies and procedures and to ensure compliance with the laws’ disclosure and operational requirements. To the extent the laws limit content moderation, platforms should also consider how to maintain mechanisms for mitigating and eliminating harmful content on their sites while remaining in compliance.

1 NetChoice, LLC v. Att’y Gen., Fla., 34 F.4th 1196 (11th Cir. 2022), cert. granted in part sub nom. Moody v. NetChoice, LLC, 216 L. Ed. 2d 1313 (2023), and cert. denied sub nom. NetChoice, LLC v. Moody, 144 S. Ct. 69, 217 L. Ed. 2d 9 (2023).
2 NetChoice, L.L.C. v. Paxton, 49 F.4th 439 (5th Cir. 2022), cert. granted in part sub nom. NetChoice, LLC v. Paxton, 216 L. Ed. 2d 1313 (2023).
3 Tex. Civ. Prac. & Rem. Code Ann. § 143A.002.
4 Id. § 143A.006.
5 “Shadow banning” refers to a site limiting the exposure of a user or their content.
6 Fla. Stat. § 501.2041.
7 Tex. Bus. & Com. Code §§ 120.051–.053, 120.101–.104.
8 Fla. Stat. § 501.2041.
9 Moody, 34 F.4th at 1223.
10 Id. at 1227.
11 See Paxton, 49 F.4th at 455.
12 Moody, 34 F.4th at 1230; Paxton, 49 F.4th at 488.
13 Moody, 34 F.4th at 1230.
14 Tex. Bus. & Com. Code §§ 120.001–.002.
15 Id. § 120.001.
16 Fla. Stat. § 501.2041(1)(g).
17 Moody, 34 F.4th at 1205.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Vinson & Elkins LLP | Attorney Advertising
