One to Watch: Has the Ninth Circuit Turned on Section 230?

Pillsbury Winthrop Shaw Pittman LLP

TAKEAWAYS

  • Ninth Circuit precedent protects information service providers from liability arising from user-generated content, including when classifying user characteristics as part of platform design or providing neutral tools that users later use to engage in “unlawful or illicit conduct.”
  • In Vargas, the court concluded that defendant Facebook, Inc. could be liable as a co-developer of housing ads that advertisers wrongfully targeted based on protected user characteristics identified by Facebook’s algorithm.
  • In denying the platform Section 230 protection, the Vargas court introduced meaningful ambiguity into the Ninth Circuit’s Section 230 jurisprudence regarding platform responsibility for users’ online content, which could chill innovative use of algorithms and frustrate the purpose of Section 230 immunity altogether.

On June 21, 2023, the Ninth Circuit decided in a class action suit, Vargas et al. v. Facebook, Inc., that Section 230 of the Communications Decency Act (Section 230) did not immunize Facebook from claims arising from allegedly discriminatory conduct by housing advertisers using the defendant’s Ad Platform. Ad Platform provides advertising users with the ability to select from among thousands of user attributes, including protected characteristics like sex, disability and familial status (e.g., whether a person has children), to target ads to advertisers’ preferred audiences on Facebook. Facebook is not alleged either to have contributed actual content to the housing ads or to have directed, induced or required advertisers to select particular audience attributes, whether or not protected under federal housing law. Nevertheless, the Ninth Circuit held that Section 230 immunity did not apply and allowed the claims to proceed.

In so holding, the Vargas court determined that the underlying complaint sufficiently alleged that the defendant was not merely the publisher of the ads, but also a “co-developer of [the] content.” However, the court’s decision created two significant ambiguities in the Ninth Circuit’s Section 230 jurisprudence: First, the court concluded that Facebook had moved beyond being a “passive transmitter of information” and had “contribute[d] materially” to the alleged illegal conduct simply by including within Ad Platform user attributes Facebook had ascertained using algorithmic detection that were associated with protected characteristics (e.g., “parenting”). Second, the court held that the “neutral tools” doctrine did not extend to Ad Platform because Facebook had “promote[d] the effectiveness of its advertising tools specifically to housing advertisers.”

The Vargas decision seems at odds with long-standing Ninth Circuit precedent providing broad immunity against claims related to publishing content provided primarily by third parties. The decision thus raises the question whether this is simply a results-oriented ruling by a single three-judge panel that will not survive en banc review or whether this marks a turning point in the court’s Section 230 jurisprudence.

Should the decision stand, it would seem to create a path for future plaintiffs to plead around Section 230 immunity, materially weakening the law’s protections for hosts of user-generated content.

Platform Design Has Not Traditionally Created Liability for Wrongful User Content
In Barnes v. Yahoo!, Inc., the Ninth Circuit established its seminal three-part test for Section 230 immunity: (1) Is the defendant “a provider or user of an interactive computer service;” (2) Does the underlying cause of action seek to treat the defendant “as a publisher or speaker” of the allegedly violating content; and (3) Was the content “provided by another information content provider?” If the answer to each of these three questions is “yes,” then a defendant is immune from suit under Section 230; a single “no” and Section 230 offers no protection.

The Ninth Circuit has traditionally extended Section 230 immunity to platforms that are not primarily responsible for wrongful content created by their users, even when the platform design is implicated in the alleged wrongdoing. In Dyroff v. Ultimate Software Grp., Inc. (2019), a case cited in Vargas, the defendant’s platform algorithm recommended a user connect with a drug dealer and notified the dealer of the recommended connection. The user and dealer then used the defendant’s website to communicate with each other and arrange a drug transaction. The user later died when the drugs he purchased from the dealer were laced with fentanyl. In finding the platform immune from suit under Section 230, the court explained that while the recommendation and notification features may have facilitated the connection and enabling communications, the platform “did not materially contribute ... to the alleged unlawfulness of the content.” In other words, incorporating an algorithm that helped facilitate an illegal drug transaction did not make the platform responsible for the illegal and harm-causing conduct engaged in by the users.

Another well-settled Ninth Circuit case addresses the extent to which an algorithm designed into a platform impacts the platform’s liability for online content. In Carafano v. Metrosplash.com, Inc. (2003), the court held that just because a platform “classifies user characteristics ... does not transform [it] into a ‘developer’ of the ‘underlying information.’” In that case, the developer of Matchmaker.com was sued when someone posing as a celebrity created a fake (and offensive) dating profile, leading to significant real-world harassment of the plaintiff. As part of its website design, Matchmaker.com classified user characteristics into categories and used those classifications to match users. Although the trial court denied Section 230 protection to the creators of Matchmaker.com, the Ninth Circuit disagreed, holding that “so long as a third party willingly provides the essential published content, the interactive service provider receives full immunity regardless of the specific editing or selection process.” As recently as May 2023, the Ninth Circuit panel in Doe v. Twitter, Inc. cited Carafano, explaining that “section 230 ‘provides broad immunity’ for claims against interactive computer service providers ‘for publishing content provided primarily by third parties.’”

However, interactive computer service providers may be found liable for wrongful user content where the platform design materially contributes to the creation of the wrongful content. In Fair Housing Council of San Fernando Valley v. Roommates.com, LLC (2008), the Ninth Circuit held that a platform could be found responsible for the development of content if it “contribute[d] materially to the alleged illegality of the conduct.” In Roommates, the defendant’s platform “requir[ed] subscribers to provide [protected] information as a condition of accessing its service,” which users “[could] not refuse to answer if they wanted to use defendant’s services.” Along with the type of housing they were seeking, users of Roommates.com were compelled to reveal their gender, sexual orientation and familial status, and Roommates.com “provid[ed] a limited set of pre-populated answers” from which users could choose. The answers to these mandatory questions then became the actual content of the users’ online profile and were used by other users to identify candidates for housing.

The Roommates court held that such a platform design crossed from “passive transmitter” into “developer ... in part” and denied Section 230 immunity to the platform. Because the design of Roommates.com both dictated the information that would appear online about a user and required users to provide the information, the court held that the defendant in Roommates was responsible “in part” for the violating content on the website. Consequently, Roommates.com failed the third prong of the Barnes test and was not immune from suit under Section 230.

Nevertheless, the Ninth Circuit in Roommates also explained, “in an abundance of caution,” the limits of platform design crossing into content creation. The court, citing Barnes, stated, “providing neutral tools to carry out what may be unlawful or illicit ... does not amount to ‘development’ for [Section 230] purposes.” The court went on to explain, as a specific example, “a housing website that allows users to specify whether they will or will not receive emails by means of user-defined criteria ... would be immune [under Section 230], so long as it does not require the use of discriminatory criteria.” Key to drawing the line in this cautionary example, as the court emphasized, was the agency of users in selecting the offending criteria as opposed to the platform dictating or requiring its use.

Ultimately, Ninth Circuit jurisprudence seemed squarely set that a platform is not responsible as a developer of content where alleged wrongful content was primarily provided by a platform user or resulted from non-mandatory, user-defined criteria. After Vargas, the guidance is less clear.

Does Vargas Signal a New Direction for the Ninth Circuit?
The Vargas case relates to defendant Facebook Inc.’s advertising tool, Ad Platform, which allowed advertising users to target their ads to specific audiences. According to the plaintiffs’ complaint, Ad Platform provided “hundreds of thousands” of individual user attributes advertisers could select to target their ads, including protected characteristics like sex, disability and familial status. Most of the attributes used to target ads were not based on information expressly provided by individual users about themselves, but rather were determined by the platform’s algorithm. (The complaint alleged that the attributes available on Ad Platform were as varied as “women in the workforce,” “parenting,” “hijab fashion,” “Hispanic culture” and hundreds of thousands of others.) The specific Ad Platform targeting criteria for a given ad were selected by the advertising users and had no impact on the actual content of ads. Advertisers were also not required to select any specific attributes, or any attributes at all. Also, the available attributes were the same whether advertising housing, clothing, make-up or video games.

The Vargas class action against Facebook alleged housing discrimination based on housing advertisers’ use of Ad Platform to direct ads illegally based on users’ protected characteristics. Facebook asserted Section 230 immunity, relying on some of the precedent described above. Nevertheless, citing Roommates, the Ninth Circuit found that the design of Ad Platform “contribute[d] materially to the alleged illegality of the [advertisers’] conduct” and rejected Facebook’s claim to Section 230 immunity. The court determined that the defendant’s creation of categories on Ad Platform, use of its own algorithms to assign attributes to individual users, and offering of simple drop-down menus and toggle buttons that allowed advertisers to exclude people based on protected characteristics were sufficient to support the plaintiffs’ claims that the platform was responsible as a co-developer of content and not merely as a publisher.

In reaching its decision, the Vargas court rejected the defendant’s arguments distinguishing the damning facts in Roommates and instead emphasized that the plaintiffs’ allegations challenged “Facebook’s own actions.” The court concluded that Facebook was “more of a developer than the website in Roommates.com in one respect,” because the defendant’s algorithms ascertained characteristics of individual users based on the user’s online activities, even if the user did not intend to reveal such characteristics. The court did not explain, however, how use of algorithm-determined attributes to distribute ads crossed the line into content development.

The Vargas court also rejected the defendant’s argument that Ad Platform did not require that advertisers target ads based on protected characteristics, as the design of Roommates.com required. The court dismissed the argument as “weak,” because Ad Platform “directly and easily allowed advertisers to exclude all persons of a protected category (or several protected categories).” The court also rejected the argument that Ad Platform was a neutral tool, because the defendant specifically promoted its effectiveness to advertise housing.

The Vargas opinion, should it survive en banc review, would seemingly mark a major turning point in the Ninth Circuit’s Section 230 jurisprudence. The court’s conclusions seem quite at odds with the guidance provided out of “an abundance of caution” in Roommates itself that “providing neutral tools to carry out what may be unlawful or illicit searches does not amount to ‘development’ for [Section 230] purposes.” In Vargas, there was no allegation that the defendant encouraged housing advertisers to target ads based on protected characteristics or otherwise contributed to the advertisers’ selections in any way. Rather, Ad Platform made the same attributes available to all of its advertising users and Ad Platform left targeting selections up to those users entirely—contentions unchallenged by the Vargas court. Thus, Ad Platform seemed to be precisely the “neutral tool” the Ninth Circuit explained would not run afoul of Section 230. The court also seemed to acknowledge that it was advertising users, not the defendant, who “carr[ied] out what may be unlawful or illicit” conduct in selecting protected characteristics to use as criteria for targeting housing ads. Nevertheless, the court’s conclusion seems to hold Facebook accountable as a co-developer because it encouraged housing advertisers to use Ad Platform and designed an algorithm that made it possible for those advertisers to discriminate.

Conclusion
The Vargas court’s conclusion creates two significant ambiguities in the Ninth Circuit’s Section 230 jurisprudence: first, whether a platform moves beyond being a “passive transmitter of information” and becomes a material contributor to the development of content merely because it classifies users based on characteristics ascertained by the platform’s algorithm; and, second, whether the “neutral tools” doctrine fails to protect a platform from liability for users’ “unlawful or illicit conduct” when the platform encourages use of the tool (though not its unlawful or illicit use) by users in regulated industries.

Should the Vargas ruling stand, it may open a significant gap through which future plaintiffs can pierce Section 230 immunity. Alleging an injury arising from information ascertained about an individual user by way of an algorithm would seemingly make the platform a co-developer of the content, unprotected by Section 230. Similarly, a platform feature that could be provided without reservation to one type of user could expose a platform to litigation risk if provided to another, regulated user. Such an outcome might not only chill online platforms from innovatively using algorithms to provide a better online experience for users, but it might also frustrate a foundational principle of Section 230: that a user’s misuse or abuse of an information service provider’s platform to engage in illegal conduct does not make the information service provider legally responsible for the wrongdoing.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Pillsbury Winthrop Shaw Pittman LLP | Attorney Advertising