The Law on Online Content Moderation and Where It's Headed

Proskauer Rose LLP

Online platforms that allow users to post content face a constant choice: to remove or not to remove, to police or not to police.

Shakespearean allusions aside, platforms generally want user engagement — to reach as many eyeballs as possible. But some consumers may be driven away by content that is perceived as offensive, dangerous, false or otherwise problematic.

Platforms also have their own styles or perspectives on appropriate content, and their own reputations to safeguard. Thus, decisions about content can also be decisions about competition.

In this article, we explore the legal implications of content moderation policies and practices under the Communications Decency Act, or CDA, and antitrust law. However, given that these issues are relatively new and have not been fully developed in existing cases, we consider these topics through variations on two hypotheticals.

These hypotheticals involve a fictional user named Kelbry Jones and a fictional platform called Gorilla. We first analyze unilateral content moderation decisions, followed by potential concerted conduct across platforms. Today, plaintiffs face significant legal headwinds in challenging content moderation policies, but a renewed focus on the issue may change those odds.

First Hypothetical: Unilateral Content Moderation

Kelbry Jones is a political commentator on the popular social media platform Gorilla. While her accounts are frequently reported for posting extremist content and promoting conspiracy theories, she commands a social media following of hundreds of thousands of users.

Recently, some of her posts were flagged for promoting violence against certain groups and/or for containing misleading information. After multiple such reports on Gorilla, the platform decided to temporarily suspend her account.

Unilateral content moderation enjoys broad protection, but there are calls for change.

Section 230 of the CDA provides online platforms with two broad protections from civil liability. The first applies to user-generated content, and the second to platforms' ability to moderate user-generated content:

  • With a few exceptions, platforms generally are not liable for content posted by their users; and
  • Platforms are not liable for "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable."

The latter is often called the Good Samaritan exemption. Here, the Good Samaritan exemption would likely shield Gorilla from liability. Section 230(c)(2) considers "the provider's subjective intent" by asking whether "the provider or user considers" the content to be objectionable.

Platforms can generally draw those lines where they see fit, so long as it is done in good faith.[1] Take, for example, Domen v. Vimeo.

The U.S. Court of Appeals for the Second Circuit ruled in March that a California-based church could not sue Vimeo Inc. for deleting its account promoting gay conversion, holding that under Section 230 "Vimeo is free to restrict access to material that, in good faith, it finds objectionable." Last month, the appellate court granted a rehearing request from Church United and its pastor, James Domen.

Not all motivations are necessarily safe, though — and this is where competition concerns can arise. Take Enigma Software Group v. Malwarebytes Inc., for example.[2]

Enigma and Malwarebytes both sell computer security software. They competed without legal dispute until Malwarebytes updated its software in a way that flagged Enigma's programs as "potentially unwanted programs."

Enigma sued for deceptive business practices, tortious interference with business and contractual relations, and false advertising under the Lanham Act.

Malwarebytes moved to dismiss under Section 230(c)(2). It had prevailed on the argument at least twice before.[3] But the U.S. Court of Appeals for the Ninth Circuit, citing the history and purpose of the CDA, agreed with Enigma that Section 230 "does not provide immunity for anti-competitive conduct."[4]

Malwarebytes looked to the U.S. Supreme Court, to no avail. The high court denied certiorari, but Justice Clarence Thomas issued a statement respecting the denial, expounding his view that Section 230 immunities may have gotten out of hand.

While he "agree[d] with the Court's decision not to take up this case" (the Supreme Court has never interpreted Section 230), he argued the court "should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms."

The "modest understanding" of what Section 230 is meant to do, he wrote, "is a far cry from what has prevailed in court."[5]

Though that need not concern Gorilla here, consider a twist to the hypothetical. What if Gorilla also published its own political content on its platform, competing with Jones? If Jones could plausibly allege an anti-competitive intent behind the suspension of her account, she might be able to make it past a motion to dismiss.

As for Enigma and Malwarebytes: Malwarebytes' new motion to dismiss does not mention Section 230 at all.

The future of Section 230 remains an open question. It has been subject to renewed scrutiny, in part due to recent events such as the Russian-engineered misinformation campaign during the 2020 presidential election and the Jan. 6 storming of the Capitol.

Some argue that platforms have not done enough to remove extremist content, while others' primary concern is that Section 230 gives a few platforms too much power to decide what content to allow.

Former President Donald Trump's recent lawsuits allege that large internet platforms are actually state actors subject to First Amendment restrictions, making application of Section 230 unconstitutional. That may be a stretch. Somewhat more likely to effect change at a future date are the dozen or so pending bipartisan bills that would reform or revise Section 230.

The legislative approaches are varied. Most focus on Section 230(c)(1) immunity rather than the Section 230(c)(2) Good Samaritan exemption.

One bill that does address the Good Samaritan exemption is the Platform Accountability and Consumer Transparency Act, or PACT Act, which would require platforms to maintain a public content moderation policy and would impose quarterly reporting requirements.

Large platforms would be subject to a notice-and-takedown procedure, much like the Digital Millennium Copyright Act's process for copyrighted content. That would not help Jones, though. Instead of leaving content moderation to the platform's good faith, it would make removal mandatory upon receipt of notice from a third party.

Under the Good Samaritan exemption, unilateral content moderation is a tough nut to crack. But what if platforms acted together?

Second Hypothetical: Concerted Conduct Across Platforms

Shortly after Gorilla suspended Jones' account, two other platforms did the same. All three cited the same grounds — violations of their community guidelines.

Parallel content moderation meets the new prisoner's dilemma.

Does Jones have a cause of action for antitrust conspiracy against the three platforms?

It is not an impossible hurdle. In the seminal Associated Press v. United States case, the government alleged the Associated Press "had by concerted action set up a system of by-laws which prohibited all AP members from selling news to non-members, and which granted each member powers to block its non-member competitors from membership."[6]

In its 1945 ruling on the case, the Supreme Court found that the bylaws had the effect of "block[ing] all newspaper non-members from any opportunity to buy news from AP or any of its publisher members."[7]

While the bylaws in Associated Press were sufficient evidence of conspiracy, proving one between independent platforms has so far been difficult for plaintiffs.

For example, in Parler LLC v. Amazon Web Services Inc.,[8] Parler alleged Amazon violated Section 1 of the Sherman Act when it terminated web-hosting services for Parler. Parler claimed the termination was "designed to reduce competition in the microblogging services to the benefit of Twitter."

The U.S. District Court for the Western District of Washington earlier this year denied Parler's request for a temporary restraining order, finding that Parler had "submitted no evidence that [Amazon] and Twitter acted together intentionally — or even at all — in restraint of trade."[9]

Similarly, conservative activists sued Google LLC and others, alleging the platforms worked together to intentionally and willfully suppress politically conservative content in "an illegal agreement to refuse to deal with conservative news and media outlets."[10]

Again, the U.S. District Court for the District of Columbia found that plaintiffs "fail[ed] to show how the Platforms' purportedly parallel actions stem[med] from a conspiracy" rather than from the platforms' independent choices.[11]

As the D.C. Circuit has explained, "parallel conduct alone cannot support a claim under the Sherman Act."[12] To state a claim, the "context," often called "plus factors," must "raise[] a suggestion of a preceding agreement."

Otherwise, the seemingly parallel conduct of the firms involved "could just as well be independent action."[13] Plus factors often include, for example, a motive to conspire.

Consider the motivation question. Platforms compete with each other based on the quality and nature of the content they make available. Content that may be viewed as undesirable by some may be in high demand by others.

This competition might actually benefit content that is popular but otherwise perceived as undesirable, since platforms may be less likely to restrict content that is popular with users.

What ensues is not the pre-CDA moderator's dilemma (i.e., risk liability for leaving content up, or risk liability for taking it down), but a new prisoner's dilemma (i.e., if I don't host the content, will someone else do so at my expense?).

Let's say Jones' hypothetical content was more overtly pernicious, i.e., it promoted terrorist activities. But it was popular, garnering millions of views and driving traffic to Gorilla.

If Gorilla suspends Jones' account, will a competing service welcome her content, and her followers, with open arms? In the absence of some coordination, the incentive may be to permit the content — at least as long as Section 230 immunity is there.

Still, a plausible allegation of motive to conspire may not in itself be enough. Major content moderation decisions are public affairs, particularly on the large platforms.

That visibility gives platforms the luxury of taking a wait-and-see approach, or simply following the leader. Additionally, a reduction in viewpoints may not actually be an anti-competitive reduction in output, as, arguably, platforms do not sell viewpoints and users do not actually purchase anything.

Where does that leave us? It leaves plaintiffs' attorneys looking for clients with more than parallel action — with enough plus factors, such as motive and opportunity to conspire, acts contrary to self-interest and the like.

And unless the courts start severely narrowing Section 230 immunity despite its now-long pedigree, it leaves many of these issues up to Congress.

Reproduced with permission. Originally published August 10, 2021, "The Law On Online Content Moderation And Where It's Headed," Law360.

_______________________

[1] Domen v. Vimeo Inc., 433 F. Supp. 3d 592, 603-604 (S.D.N.Y. 2020) (holding 230(c)(1) and (2) both provided immunity for claims arising from video hosting provider's decision to remove content that violated its policy against content that promotes "[s]exual [o]rientation [c]hange [e]fforts."), aff'd, Domen v. Vimeo, Inc., No. 20-616, ECF No. 133 (2d Cir. July 21, 2021). https://www.law360.com/articles/1403636/court-to-probe-vimeo-win-in-gay-conversion-censorship-suit

[2] 946 F.3d 1040 (9th Cir. 2019).

[3] PC Drivers Headquarters, LP v. Malwarebytes Inc., 371 F. Supp. 3d 652 (N.D. Cal. 2019); PC Drivers Headquarters, LP v. Malwarebytes, Inc., 2018 WL 2996897, at *1 (W.D. Tex. 2018).

[4] 946 F.3d 1040, 1050.

[5] Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13, 14–15 (2020).

[6] Associated Press v. United States, 326 U.S. 1, 4 (1945).

[7] Id. at 9.

[8] 2021 WL 210721 (W.D. Wash. Jan. 21, 2021).

[9] Id. at *3.

[10] Freedom Watch, Inc. v. Google Inc., 368 F. Supp. 3d 30, 34 (D.D.C. 2019), aff'd, 816 F. App'x 497 (D.C. Cir. 2020).

[11] Id. at 37.

[12] Freedom Watch, Inc. v. Google Inc., 816 F. App'x 497, 500 (D.C. Cir. 2020), cert. denied, No. 20-969, 2021 WL 1240927 (U.S. Apr. 5, 2021).

[13] Bell Atl. Corp. v. Twombly, 550 U.S. 544, 557 (2007).

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Proskauer Rose LLP | Attorney Advertising
