Social Links: Darkness on the Edge of Town

Morrison & Foerster LLP - Social Media

FTC Targets Sendit over Alleged “Dark Pattern” Design

The Federal Trade Commission (FTC) has made clear that youth-focused social apps are a priority. On September 27, the FTC filed a complaint in the Central District of California against Iconic Hearts Holdings, the company behind the anonymous messaging app Sendit, and its CEO, Hunter Rice. The case accuses the company of collecting data from children under 13 without parental consent and misleading users about paid subscriptions.

Sendit is one of those apps that gained traction by promising a bit of mystery. Users could send and receive anonymous messages, then pay to learn who was behind them. The FTC says that the “Diamond Membership” feature, which cost up to $9.99 per week, was marketed as a one-time upgrade but quietly rolled into a recurring charge. In many cases, subscribers just got a vague hint about the sender’s device or location, not the full reveal they were promised.

The agency also alleges violations of the Children’s Online Privacy Protection Act (COPPA), the FTC Act, and the Restore Online Shoppers’ Confidence Act (ROSCA). In short, the complaint says Sendit collected personal data from children and used design choices that nudged teens into recurring payments they didn’t fully understand. The company and its CEO now face a set of overlapping claims that touch both privacy compliance and consumer deception.

If this sounds familiar, it’s because the FTC has been ramping up enforcement on both fronts. Over the past year, the Commission has settled COPPA and dark-pattern cases with major tech companies and smaller app developers alike. The Sendit case fits that pattern: a youth-oriented platform that allegedly blurred the lines between engagement and exploitation.

Compliance teams have navigated similar issues before, but the Sendit case brings the pattern into sharper focus. It shows how youth-oriented design, recurring billing, and blurred consent boundaries can intersect in ways that catch regulators’ attention. COPPA and ROSCA may be a bit long in the tooth now, but the FTC is clearly willing to use them as tools to regulate new platform behavior. For companies in this space, privacy, billing, and product design could very well become part of the same conversation.

New Jersey Court Narrows Section 230 Shield for TikTok

On September 30, a New Jersey Superior Court judge refused to let TikTok off the hook in the state’s youth-harm lawsuit, rejecting the platform’s defense under Section 230 of the Communications Decency Act. It’s a reminder that the shield protecting platforms for what users post doesn’t always extend to how those platforms operate.

TikTok’s argument was a familiar one: “We just host content, we don’t make it. Section 230 says we’re immune!” But Judge Lisa M. Adubato wasn’t buying it. The complaint zeroes in on the app’s own design and recommendation systems (that legendary TikTok algorithm). Those, she said, are design choices, not user speech.

This is a meaningful line in the sand. Courts are increasingly distinguishing between liability for user-generated content, which Section 230 still covers, and liability for platform behavior, which it doesn’t. Earlier this year, the Third Circuit reached a similar conclusion in Anderson v. TikTok, holding that recommendation algorithms aren’t passive conduits but active participants in what users see.

For the time being, Section 230 still provides platforms with immunity to claims based squarely on content posted by users. But when states frame their suits around addictive design, algorithmic amplification, or failure to warn, the question shifts from “What did a user post?” to “What did the company build?” And more courts are signaling that they’re receptive to this distinction.

Dark Rocks, Gray Markets

A few seconds of video can move a billion-year-old rock. In northern Mauritania, meteorite hunters are turning to social media to sell fragments of space debris that fell to Earth long before modern law existed to govern it. The trade is brisk, lucrative, and almost entirely unregulated.

That absence of law isn’t an oversight; it’s the norm. Few nations have ever defined who “owns” a meteorite. In some countries, it belongs to whoever finds it. In others, it belongs to the state or the landowner. Mauritania has no statute on point at all, leaving a gray zone that’s now being monetized one viral clip at a time. International conventions like UNESCO’s 1970 treaty on cultural property don’t clearly cover meteorites, which fall somewhere between natural heritage and scientific specimens. The result is legal ambiguity that’s easy to exploit and nearly impossible to police.

Social media has filled the gap that law left open. TikTok traders use short videos to showcase shiny “space stones,” while buyers in Europe, China, or the United States bid through comment threads or encrypted chats. Platforms are not auction houses, but in practice they’ve become the market’s infrastructure, providing visibility, payment routing, and a thin veil of legitimacy. The darker a stone looks, the higher the price, so some sellers have started wetting chondrites or applying image filters to deepen the hue and mislead distant buyers. What was once a question of provenance is now a question of pixels.

It’s a modern version of older gray markets. In the 1990s, looted antiquities from post-Soviet museums circulated through faxed catalogs and weekend trade fairs, a boom that helped spur enforcement under the U.S. Cultural Property Implementation Act and prompted tighter UNESCO protocols. Before that, the global wildlife trade relied on mail-order price lists until CITES began regulating cross-border animal product sales. Each era’s technology advanced faster than its enforcement, and each forced lawmakers to play catch-up.

Mauritania’s customs authorities cannot chase every outbound rock, and in some jurisdictions platforms can claim immunity under the same laws that shield them from liability for other types of user-generated content, such as Section 230 of the Communications Decency Act in the United States. But unregulated marketplaces that turn physical scarcity into digital dollars are hardly new. The product may be literally extraterrestrial, but the legal questions are firmly Earthbound. Pondering who owns what, who profits, and who is responsible for stewardship is far less exciting than gazing wide-eyed at a billion-year-old rock that fell out of the sky over the Sahara, but these are questions that will need to be answered eventually.

The meteorite trade might seem like a niche curiosity, but it is a perfect case study in the collision of discovery, technology, and law. Until legislators write rules that reach beyond the Earth, TikTok will remain the planet’s busiest meteor market.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Morrison & Foerster LLP - Social Media
