Intermediary liability frameworks for digital platforms in Latin America still a patchwork

Brazil leads the way with regulation, Argentina and Colombia follow court-established principles, while Mexico lags behind.

Digital platforms and other information intermediaries, such as Facebook, Instagram, Snapchat, TikTok and Twitter, have shaped and transformed the way we communicate, connect, and do business. Just before the turn of the 21st century, digital platforms began offering a new space for people to interact and widely share their opinions and ideas – expanding freedom of expression rights globally and, according to a UNESCO report, in Latin America in particular. But while the United States and the European Union (EU) continue to introduce and reform legislation to keep up with the technology, governments across Latin America have largely yet to pass similar legislation. Such inaction creates uncertainty in the region, both for the companies that provide digital platforms and for their users.


To prioritize the growth of digital services and protect free expression rights in this space, US and EU legislators enacted special legal regimes that offer digital platform providers protection from liability for the content posted by users on their platforms (i.e., user-generated content). Reflecting the constitutional commitment to free speech, the US Congress enacted Section 230 of the Communications Decency Act (CDA), which grants digital platforms broad immunity with respect to user-generated content on their platforms and their own content moderation decisions. The EU established a "safe harbor" regime under the E-Commerce Directive, with a more limited concept of immunity: once a digital platform provider becomes aware of illegal user-generated content, it must act expeditiously to remove that content, or it may face liability.

The importance of digital platforms has increased exponentially since these laws were passed. The coronavirus pandemic amplified their role in virtually every aspect of our economy and society, and our ever-growing reliance on digital platforms has become a source of new risks and challenges.

Legislators around the world are now revisiting legal regimes relevant to digital platforms – grappling with how platforms must detect and intervene against content such as hate speech, misinformation, or harassment, and how such interventions may cut across fundamental rights to free expression and access to information. For example, the US Congress has proposed reforms of CDA Section 230, with more than 20 bills introduced in 2021 alone, to narrow the largely blanket immunity it currently grants digital platforms. The European Parliament recently approved the draft text of the Digital Services Act (DSA), which establishes a harmonized, EU-wide framework for intermediary liability and imposes new obligations on digital platforms with the aim to "better protect consumers and their fundamental rights online."

Surprisingly, while other regions are reforming and updating their rules to keep up with evolving technologies, Latin American countries remain largely silent when it comes to legislating in this space, even though the number of people using the internet across Latin America surpassed 500 million in 2021. It was not until 2014 that Brazil became the first – and, to date, only – country in the region to enact a comprehensive law covering the obligations and liabilities of digital platforms. While there have been several legislative proposals on intermediary liability in other Latin American countries, these proposals have largely failed. The rest of the region has only seen a patchwork of narrow sectoral laws and judge-made rules that rely largely on general civil, criminal, intellectual property, and consumer protection laws, most of which predate the existence of digital platforms. The result is an inconsistent and piecemeal approach to intermediary liability throughout Latin America – and operational uncertainty for digital platforms providing global products and services.

Brazil: A pioneer for intermediary liability regimes in Latin America

500+ million internet users across Latin America in 2021 (Source: Market data)

30% to 50% of the overall population across major Latin American countries remains underbanked (Source: Market data)

Brazil is the only country in Latin America that has enacted a comprehensive intermediary liability regime. In 2014, it adopted the "Marco Civil da Internet" (the Brazilian Internet Bill of Rights), which established principles, rights, and obligations for internet use. The Marco Civil creates a strong "safe harbor" regime guided by judicial oversight.

In general terms, digital platforms can be held liable for user-generated content on their platforms only if they fail to comply with a court order requiring them to remove that content within the specified timeframe. The law has also been interpreted to protect digital platforms from liability when they engage in content moderation consistent with the terms and policies presented to users. Interestingly, a lower court found the safe harbor principle of the intermediary liability regime unconstitutional; an appeal is pending before the Brazilian Supreme Court, and the provision remains in effect despite the lower court's ruling.

In addition to the Marco Civil, there has been a recent increase in legislative and executive proposals focused on digital platforms' ability to moderate content. The two most prominent proposals – Bill 2630/2020 (known as the "Fake News Bill") and the Federal Executive Branch's Provisional Measure No. 1.068/2021 (the "Executive Provisional Measure") – presented diametrically opposed compliance obligations.

As first proposed, the Fake News Bill would impose, among other things, moderation obligations on digital platforms with respect to "misinformation," as well as a transparency obligation, similar to that of the DSA, requiring digital platforms to publish regular reports on content moderation. The Executive Provisional Measure, on the other hand, aimed to limit digital platforms' ability to moderate content without first obtaining a court order. The measure was quickly rejected by the Brazilian National Congress and struck down by the Federal Supreme Court as unconstitutional. The future of the Fake News Bill is also uncertain, given the political climate and the recent election.

Whatever becomes of these particular proposals, the future of intermediary liability and digital platform obligations in Brazil clearly remains in flux, given both the National Congress's and the Federal Executive's desire to introduce reforms.

Argentina & Colombia: Establishing intermediary liability principles through case law


Argentina has yet to pass comprehensive legislation regarding intermediary liability and, apart from a few early proposals for narrow sectoral laws, appears far from legislating a full-scale framework in this space. Legislative bills on content moderation have been considered over the years, but none has been enacted, and there is currently no legislative proposal under debate for a broad intermediary liability regime.

The Argentine courts have nevertheless stayed busy filling the legislative gap. In 2014, the Supreme Court of Argentina effectively created a "safe harbor" regime that borrows components from both the Brazilian and EU approaches. In Rodríguez, María Belén v. Google Inc., the court ruled that digital platforms cannot be held liable for user-generated content unless they have "actual and effective knowledge" of the content and thereafter fail to diligently block or remove it.

The Supreme Court's approach differs based on the nature of the content: for clearly unlawful content (e.g., child pornography, content facilitating crimes, and racial prejudice), a mere notice or removal demand received by a digital platform is sufficient to establish actual and effective knowledge. For other content, such as defamation and trademark violations, a takedown order from a court or administrative authority is required to establish knowledge. Although the latter requirement aims to protect digital platforms from having to undertake difficult analyses that require weighing countervailing fundamental rights under local law, it leaves substantial gray areas as to what constitutes clearly unlawful content, opening the door to protracted litigation. The Supreme Court's 2014 decision is not an outlier; the court reaffirmed this legal standard in 2017 and again in 2021. Although Supreme Court decisions in Argentina are not binding precedent in the way they are in common law countries, lower courts usually take them into consideration when adjudicating matters involving digital platforms.

The Colombian legislature has similarly yet to pass legislation addressing intermediary liability, and there are presently no relevant bills up for formal debate. However, given the implications for the fundamental rights of free expression and access to information, the Constitutional Court has considered the issue on numerous occasions and has consistently found that digital platforms are not responsible for user-generated content on their platforms.

Colombian case law has arisen in the context of constitutional actions known as tutelas – a unique and expedited form of litigation seeking to cure an alleged breach of fundamental rights – in which petitioners usually do not seek damages, but rather injunctive orders (e.g., to remove or reinstate content on a platform). In these cases, the Constitutional Court has established the following principles: first, courts (rather than digital platforms) should determine the legality of content; second, digital platforms should comply in a timely manner with a state authority's order to remove content from their platforms; and third, there is no general obligation for digital platforms to pre-screen or proactively monitor content on their platforms. While the Colombian judiciary has yet to consider these principles in the context of a claim for damages, one would expect them to apply equally in that context.

In the absence of legislation codifying intermediary liability, litigation will continue to test the bounds of prior precedents. The future of this court-based intermediary liability framework also remains less certain without codification, given that the ever-evolving nature of digital platforms, changes in domestic circumstances (as seen with misinformation during the COVID-19 pandemic), or a change in the make-up of the courts could abruptly alter the status quo.

Mexico: Far from embracing a uniform intermediary liability framework

658 million: the total population of Latin America & the Caribbean in 2021 (Source: Statista)

Despite its economic influence in the region, Mexico lags in establishing intermediary liability principles. A controversial draft bill attempting to regulate social media was proposed by a Mexican senator in 2021, but it faced significant public resistance and did not pass. No other bills on intermediary liability or online content moderation have since been proposed. Moreover, unlike Argentina and Colombia, Mexico has, somewhat surprisingly, yet to see a seminal case from its Supreme Court establishing intermediary liability rules in the absence of specific legislation.

The United States-Mexico-Canada Agreement (USMCA) may soon provide more certainty. Already, as a result of the USMCA, a notice-and-takedown system was introduced into Mexico's Federal Copyright Law in July 2020. Importantly, the USMCA also expressly restricts its signatories from "adopt[ing] or maintain[ing] measures" that conflict with the intermediary liability principles under CDA Section 230 – namely, that digital platforms are not to be held liable (with limited exceptions) for user-generated content published on their platforms, or for any action they voluntarily take in good faith to restrict access to harmful or objectionable content. This provision is set to apply to Mexico starting in July 2023 – three years after the agreement entered into force.

Whether Mexico will actually pass legislation to align with these principles remains to be seen, given that the USMCA provides that "a Party may comply with this Article through its laws, regulations, or application of existing legal doctrines as applied through judicial decisions." At a minimum, the USMCA provides greater certainty that, going forward, either legislation (should Mexico go the route of Brazil) or judicial decisions (should it follow Argentina and Colombia) will need to align with intermediary liability principles that are generally favorable to digital platforms and free expression.

Peru: Possibly following Brazil with an intermediary liability framework

At the end of 2021, the Peruvian Congress introduced a bill that, among other things, would establish an intermediary liability regime shielding digital platforms from liability for user-generated content. If approved, the bill, as currently drafted, would make clear that digital platforms are not required to pre-screen or proactively monitor user-generated content that is transmitted, stored, or referenced on their platforms. A digital platform would be obliged to remove user-generated content only if it receives a court order to do so, and could not be held liable for user-generated content unless it failed to comply with such an order. The newest version of the bill was introduced in June 2022, and it is not expected to pass before December 2022. Its success largely depends on the political will of the Peruvian Congress.

With the exception of Brazil, countries in Latin America are still far behind their North American and European counterparts, which have established, and are now working to amend, laws that provide intermediary liability frameworks, guidelines, and obligations. The absence of intermediary liability frameworks in Latin America has resulted in a lack of certainty, which can prove detrimental to both digital platforms and internet users. While courts in some Latin American countries have tackled the absence of legislation by establishing general rules and principles of intermediary liability in relation to search engines or social media, the development of new forms of technology and new interactive spaces, like the metaverse, may leave a renewed void that could take years, if not decades, to close through the judiciary.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© White & Case LLP | Attorney Advertising
