Earlier this month, in Domen v. Vimeo, Inc.,[1] a panel of the U.S. Court of Appeals for the Second Circuit held that a relatively unused subpart of Section 230 of the Communications Decency Act (CDA)—namely, 47 U.S.C. § 230(c)(2)(A)—immunized an online platform (Vimeo) from a lawsuit brought by users who complained that the platform had wrongfully deleted their content and banned them from using the platform. The result of this ruling—termination at the motion-to-dismiss stage of a lawsuit against an online platform based on its decisions, actions or inactions relating to the moderation of third-party content—is fully in line with rulings from legions of courts across the country that have applied Section 230 as a source of very broad immunities for online platforms.[2] But the ruling is groundbreaking in one respect: It is the first reported case in nearly 20 years in which an appellate court has held that § 230(c)(2)(A)—which prohibits holding online platforms liable for “any action voluntarily taken in good faith” to block or remove material that the platform “considers to be . . . objectionable”—can and should operate to bar such claims at the threshold pleading stage.[3] The ruling is also consistent with statutory and policy arguments advanced in an amicus brief by the Internet Association, which was represented in this case by WilmerHale.
Vimeo operates a website that allows users to upload, view, share and comment on videos. In 2016, James Domen and Church United created a Vimeo account and uploaded videos promoting sexual orientation change efforts (SOCE). Vimeo notified them that this content violated Vimeo’s terms of service and instructed them to remove it. When Domen and Church United failed to do so, Vimeo deleted the content and banned their account. Domen and Church United then sued Vimeo in federal court, alleging that Vimeo’s actions constituted religious discrimination in violation of New York and California law, as well as the U.S. and California constitutions. In 2020, the U.S. District Court for the Southern District of New York granted Vimeo’s motion to dismiss, ruling that the plaintiffs’ claims were independently barred by two distinct provisions of the CDA: § 230(c)(1), which grants immunity to online providers who did not “create” or “develop” the content at issue and are treated in litigation as the “publisher” of that content; and § 230(c)(2)(A), which is quoted above.
On appeal, Domen and Church United (now Appellants) argued that § 230(c)(1) does not protect against claims challenging a platform’s decision to remove content or users from its website—an argument that nearly all courts to reach that question have rejected.[4] As for § 230(c)(2)(A), they argued that the provision did not apply because their videos were not “objectionable” and because Vimeo failed to show that it acted in “good faith,” which Appellants claimed is a factual question that cannot be resolved at the pleadings stage. Additionally, Domen and Church United argued that Vimeo could not be deemed to have acted in good faith because it supposedly had failed to remove similar videos posted by other third parties and had banned their account instead of simply deleting the particular videos that Vimeo had identified as violating its rules.
The Second Circuit affirmed the district court’s dismissal, holding that all of Appellants’ claims are barred by § 230(c)(2)(A) and declining to reach the question of whether § 230(c)(1) also barred the claims. The decision is notable in several significant respects.
First, while other courts have typically relied on § 230(c)(1) in granting pleading-stage dismissals of suits brought against platforms by users who complain that the platform improperly removed their content or shuttered their account, here the Second Circuit instead looked solely to § 230(c)(2), which it held can and should be applied at the pleadings stage. The Court explained, in particular, that “‘Section 230 immunity, like other forms of immunity, is generally accorded effect at the first logical point in the litigation process . . . and immunity is an immunity from suit rather than a mere defense to liability; it is effectively lost if a case is erroneously permitted to go to trial.’” Domen, 2021 WL 922749 at *5 (quoting Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc.[5]).
Second, the Second Circuit set guideposts to assist trial courts in deciding, at the pleading stage, whether the good-faith element of § 230(c)(2) is satisfied. For example, the Court ruled that a platform’s allegedly disparate enforcement of its own content policies does not show a lack of good faith: “Given the massive amount of user-generated content available on interactive platforms, imperfect exercise of content-policing discretion does not, without more, suggest that enforcement of content policies was not done in good faith.” Id. at *4.
Third, the Second Circuit held that platform operators have substantial discretion to decide subjectively what content moderation policies to adopt. As the Court explained, § 230(c)(2) “explicitly provides protection for restricting access to content that providers ‘consider . . . objectionable,’ even if the material would otherwise be constitutionally protected, granting significant subjective discretion.” Id. (quoting 47 U.S.C. § 230(c)(2)).
Finally, the Court held that platform operators have substantial discretion to decide how to enforce their content moderation policies—whether through blocking particular posts or suspending accounts. Although Domen and Church United took issue with Vimeo’s decision to delete their “entire account as opposed to deleting only those videos promoting SOCE,” the Court reasoned that Section 230 “does not require providers to use any particular form” of content moderation, and “nothing within the statute or related case law suggests that [permanent closure of plaintiffs’ account] took Vimeo’s actions outside of the scope of subsection (c)(2) immunity.” Id.
The Second Circuit’s decision is consistent with the overwhelming majority of circuits holding that Section 230 grants online providers broad immunity to decide what third-party content to allow on their websites and how to moderate it—both of which are crucial for providers to host platforms tailored to the needs of their users. In ruling that § 230(c)(2) can and should operate at the pleadings stage to bar suits involving removals of user-provided content or bans of particular users, the Court recognized the need to immunize providers not only from ultimate liability but also from the burdens of litigation, which could otherwise render this immunity meaningless given the massive amounts of content that platforms moderate. Similarly, because uniform application of content policies is almost impossible, the Court recognized that imperfect content moderation cannot, on its own, show a lack of good faith. Ultimately, this decision should make online providers more willing to assert § 230(c)(2) immunity (in addition to § 230(c)(1) immunity) on a motion to dismiss in take-down cases where there is no particularized and non-conclusory allegation of bad faith.
[1] -- F.3d --, 2021 WL 922749 (2d Cir. Mar. 11, 2021).
[2] See, e.g., Klayman v. Zuckerberg, 753 F.3d 1354 (D.C. Cir. 2014); Green v. Am. Online (AOL), 318 F.3d 465 (3d Cir. 2003); Zeran v. Am. Online, Inc., 129 F.3d 327 (4th Cir. 1997); Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398 (6th Cir. 2014); Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003); Ben Ezra, Weinstein, & Co., Inc. v. Am. Online Inc., 206 F.3d 980 (10th Cir. 2000); Dowbenko v. Google Inc., 582 F. App’x 801 (11th Cir. 2014).
[3] The only other reported appellate decision to this effect is Green, which was issued in 2003. See 318 F.3d at 472-73.
[4] See, e.g., Fyk v. Facebook, Inc., 808 F. App’x 597 (9th Cir. 2020); Sikhs for Justice, Inc. v. Facebook, Inc., 697 F. App’x 526, 526 (9th Cir. 2017); Riggs v. MySpace, Inc., 444 F. App’x 986 (9th Cir. 2011); Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1103 (9th Cir. 2009); Fed. Agency of News LLC v. Facebook, Inc., 432 F. Supp. 3d 1107 (N.D. Cal. 2020); Murphy v. Twitter, Inc., 60 Cal. App. 5th 12 (Ct. App. 2021); Johnson v. Twitter, Inc., No. 18CECG00078 (Cal. Superior Ct. June 6, 2018); Taylor v. Twitter, Inc., No. CGC 18-564460 (Cal. Superior Ct. Mar. 8, 2019). The lone outlier is e-ventures Worldwide, LLC v. Google, Inc., 188 F. Supp. 3d 1265 (M.D. Fla. 2016).
[5] 591 F.3d 250, 254 (4th Cir. 2009).