While we are still in the infancy of the Biden Administration, it is clear that bipartisan desire to amend Section 230 of the Communications Decency Act (“Section 230”) remains active.
On February 8, 2021, Sen. Mark Warner (D-VA) introduced the Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms Act – or “SAFE TECH Act.” The bill, co-sponsored by Sens. Mazie Hirono (D-HI) and Amy Klobuchar (D-MN), seeks to chip away at the protection online platforms and websites have enjoyed under Section 230. Although President Trump thrust Section 230 into the spotlight last year – issuing an Executive Order that took aim at social media platforms in May and vetoing the 2021 National Defense Authorization Act for failing to repeal Section 230 in December – voices on the left have long been frustrated by Section 230 protections for online platforms as well.
Section 230, most notably, immunizes the owner or user of an “interactive computer service” against liability for third party-posted content – including content that may allegedly violate various privacy laws. In practice, this means that a social media website may not be held liable for the content that its users post on the site. While Section 230 was initially passed in hopes of incentivizing these interactive computer service providers to actively screen for and filter out objectionable content posted by their users, critics of the law have long argued that Section 230’s broad immunity goes too far. The SAFE TECH Act seeks to curb this broad immunity by making the following three key amendments to Section 230:
1. Limiting the Scope of Section 230 Immunity
The SAFE TECH Act would narrow the breadth of protections afforded to online platforms in two respects. First, the SAFE TECH Act would specify that an online platform may only be shielded from liability for “any speech” posted by a third party, rather than “any information” as Section 230 currently provides. Second, the SAFE TECH Act would provide that Section 230 immunity does not apply to ads or other paid content – that is, to providers or users that have “accepted payment to make the speech available or, in whole or in part, created or funded the creation of the speech.”
Further, the SAFE TECH Act would establish five additional exceptions to Section 230’s applicability. Section 230(e) currently allows online platforms to be held liable under federal criminal law, intellectual property law, state law that is consistent with Section 230, the Electronic Communications Privacy Act of 1986 or any similar state law, and state or federal sex trafficking law. The SAFE TECH Act would expand that list by providing that “[n]othing in [Section 230] shall be construed to limit, impair, or prevent” any actions arising under (a) federal or state civil rights laws; (b) federal or state antitrust laws; (c) federal or state stalking, harassment, or intimidation laws; (d) international human rights law; or (e) civil wrongful death actions.
2. Making Section 230 an Affirmative Defense
The SAFE TECH Act would also make Section 230 an affirmative defense, by establishing that any defendant raising its Section 230 immunity in response to a suit would “have the burden of persuasion, by a preponderance of the evidence, that the defendant is a provider or user of an interactive computer service and is being treated as the publisher or speaker of speech provided by another information content provider.” While current interpretations of Section 230’s immunity often lead courts to dismiss suits against online platforms at the outset of litigation, this amendment would lead to further litigation of the facts underlying plaintiffs’ claims.
3. Narrowing Section 230’s ‘Good Samaritan’ Protection
Section 230 currently allows online platforms to remove or moderate third-party content posted to their websites that they deem objectionable, without fear of facing liability for doing so. The SAFE TECH Act, however, would narrow the breadth of this “Good Samaritan” defense by allowing plaintiffs to seek injunctive relief “arising from the failure of an interactive computer service provider to remove, restrict access to or availability of, or prevent dissemination of material that is likely to cause irreparable harm.” However, the bill also clarifies that a provider’s compliance with such a request for injunctive relief will not subject the provider to any liability.
While it is unclear when Section 230 reform will take center stage on Congress’ agenda, the pressure to amend Section 230, on both sides of the aisle, is clearly rising. As such, it is vital that organizations that would ordinarily rely on Section 230’s protections monitor the status of the SAFE TECH Act, or any other proposed amendments to Section 230 that may be introduced, going forward.