Social Links: The Internet’s Awkward Teen Phase

Morrison & Foerster LLP - Social Media

Hey Kid – Go Make Some Real Friends

By the end of November, Character.AI will stop letting anyone under 18 chat freely with its bots. No more late-night conversations with virtual friends, no more two-hour therapy sessions with an imaginary Freud. The company says the change is about safety and "emotional dependency." Maybe it is. But the timing, shortly after a lawsuit blamed the platform for a teenager's suicide, suggests something more pragmatic. Lawyers have entered the chat.

Character.AI built its reputation on intimacy. Users could “talk” to anyone from Einstein to Taylor Swift to a custom-built romantic partner who remembered everything they said. An awful lot of teens did exactly that, often when the human world felt too difficult to reach.

The legal angle is starting to catch up to the emotional one. Courts have long treated platforms as passive conduits under Section 230, shielding them from responsibility for user content. The new lawsuits argue that the design itself can cause harm. An algorithm optimized for engagement, the argument goes, can cross from communication to manipulation. That's the same theory driving a growing crop of state-level youth safety laws in Colorado, Utah, and elsewhere. Character.AI just happens to be the first major platform to act before a judge forces the issue.

Age verification will be its own challenge. The company plans to rely on ID uploads and behavioral signals to separate minors from adults, which raises a new set of privacy and discrimination questions. Even then, underage users will still be able to create characters and revisit old chats. Think of it as a sort of “partial ban.” The restriction applies only to live, open-ended conversations; other modes and saved interactions stay on the table.

But the symbolism matters. The era of carefree digital companionship is fading fast. What comes next may feel stranger and more dystopian. Think more paperwork, more disclaimers, more pop-ups reminding us that our robot friend is, legally speaking, a product. And products, as every lawyer knows, come with warnings.

Emojis, Exhibits, Evidence

A rocket ship emoji used to mean optimism. Now it might mean “see you in court.”

Bloomberg Law recently reported that emojis are starting to appear in securities-fraud complaints, not as colorful filler but as evidence of intent. Regulators and judges are reading them like tea leaves. Your favorite blog has, unsurprisingly, been on top of this for years.

In one filing, the familiar 🚀 📈 💰 trio was cited as proof that promoters weren’t just enthusiastic, they were promising actual returns. It’s the Howey Test rendered in miniature, a cartoon rocket with “expectation of profit” as its payload.

For anyone fluent in chat culture, that’s a hard shift. A single emoji can signal tone, sarcasm, or comic relief. But when it shows up in a business thread, that tiny wink can suddenly look like a disclosure violation. The SEC’s recent focus on “off-channel” communications means texts, posts, and chats now fall under the same compliance umbrella as formal investor decks.

There’s no official emoji dictionary yet, which means context rules the day. A smiley face could mean humor, irony, or approval. But context is slippery, and hindsight makes every chat look like foreshadowing. Justice Potter Stewart once said he knew obscenity when he saw it; regulators now seem to know intent when they scroll past it.

The safest approach is to assume every emoji speaks. Compliance teams should include them in training, and legal teams should expect to see them in evidence. It’s a small cultural leap with big consequences, because those casual thumbs-ups and rocket ships are discoverable, timestamped, and eternally searchable.

Enthusiasm isn’t illegal, but it does leave receipts. Act accordingly.

Section 230 Takes a Big Ol’ Sip of Texas Tea

Texas is back in court over its latest attempt to "secure children online." This week, the state asked the Fifth Circuit to revive parts of its SCOPE Act, a law that would require social media platforms to verify users' ages, filter "harmful" content for minors, and block targeted ads unless a parent signs off.

Those rules sound straightforward until you read the fine print. A federal judge already put them on ice, finding the law too broad and too vague to survive First Amendment scrutiny. In his injunction, Judge Robert Pitman said Texas had effectively given itself the power to decide which ideas teenagers are allowed to discuss online.

Texas sees it differently. At oral argument, Assistant Solicitor General Cameron Fraser told the Fifth Circuit that the lower court took "far-fetched hypotheticals" too seriously and ignored the Supreme Court's 2024 Moody v. NetChoice decision. That ruling made it harder to win facial First Amendment challenges unless a law is unconstitutional in most of its applications.

But the panel's questions kept circling back to the old legal warhorse we affectionately call Section 230. Judge Pitman found that forcing platforms to screen and suppress harmful content makes them responsible for user posts, which is precisely what Congress meant to prevent when it passed Section 230 back in 1996. Lawyers for the tech-industry challengers said the SCOPE Act would "commandeer private websites to monitor and filter speech on the state's behalf," blurring the line between publisher and platform.

Texas countered that its limits on targeted advertising regulate commercial conduct, not speech. The plaintiffs disagreed, pointing out that the law never defines “advertising” and could sweep in things like public service announcements or advocacy campaigns.

The Fifth Circuit’s decision will matter well beyond Austin. States from Arkansas to California are testing similar “child-safety-by-design” laws that force platforms to re-engineer how they moderate and monetize. Whether those efforts can coexist with the federal immunity that built the modern internet will be one of the defining questions of the next wave of tech regulation.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Morrison & Foerster LLP - Social Media
