AD-ttorneys@law - February 2023

BakerHostetler

Game Developer Dodges Loot Box Suit

In-app loot boxes are not the same as slot machines, court says

Again with the Noises and Bright Lights

It’s been a while since we reported on defendant Supercell’s successful dismissal of a class action brought by one Peter Mai — a year and change, to be precise, since the Northern District of California killed the first case while leaving a door open for Mai to attack once more.

Well, Mai took another shot, and missed again.

The original complaint, filed in 2020, was interesting. Mai accused Supercell of designing the loot boxes in its games to have the same appeal as real-life slot machines (the apps are Clash Royale and Brawl Stars; see here to get a feel for them) while making the chance of winning anything in the games nearly zero. This undisclosed gambling, Mai alleged, violated California’s anti-gambling laws, Unfair Competition Law and Consumer Legal Remedies Act.

QQ’d

The court dismissed the case in September 2021. As we wrote then:

“Mai’s UCL, CLRA, and unjust enrichment claims are all based on the fundamental premise that the loot boxes are illegal ‘slot machines or devices’ under California Penal Code 330b,” the court held in a recent order. But because the games are “games of skill” — in other words, competitive in a way pure gambling games are not — Mai could not state a claim that they are slot machines.

Mai’s replay — an amended complaint filed in October 2021 — led to the same result. This was mainly because, according to the court, he didn’t seem to adjust his claims.

“Plaintiffs’ additional allegations in the FAC amount to the conclusory statements that they ‘lost money and property by purchasing loot boxes’ with real-world currency and virtual currency, with the virtual currency constituting lost property when used to pay for loot boxes,” wrote the court in its order in early January.

However, “Plaintiffs do not allege a deficiency in the loot boxes or virtual currency that they received in exchange for their real-world currency, or in the loot boxes they received in exchange for their virtual currency. Nor do they allege that they received a lesser amount of either type of item than they were promised. Where, as here, a party obtains exactly what they paid for, there is no cognizable economic injury for the purposes of UCL standing.”

The Takeaway

This decision, which draws on Google v. Ginsberg, seems to drive a stake through cases comparing loot boxes to slot machines, at least under California law. Numerous cases against app developers and distributors (like Google) have died on this hill. How many more times can the argument be made?

Nonetheless — and who in the law doesn’t love a good “nonetheless”? — we return to the advice we offered after reading Mai’s original complaint: “If you’re designing an app that’s going to involve in-app purchases like [loot boxes], disclose, disclose, disclose — and consider how you’re designing the rewards you’re offering to users.”

Especially when those users are children, whom Mai made the center of his original case. It can’t hurt to play it safe. You’ll get to add the cool bells and whistles to your online offering, and maybe skip the court costs.

Innovative Crypto Company Triggers Old-School False Ad Suit

Claimed security measures on par with stodgy old banks

Click-Hole

“20 million hits in one minute” may be the modern marketing professional’s equivalent of nirvana, the ecstatic realm of complete fulfillment that absorbs the initiate.

Coinbase achieved this bliss during the 2022 Super Bowl, when its odd commercial — consisting of nothing more than 60 seconds of a QR code meandering around the screen like a DVD logo of old — apparently turned portions of the American public into mesmerized cats, doomed to scan the code and arrive at Coinbase’s $15 bitcoin offer/sweepstakes.

Those 20 million hits nuked the Coinbase servers, of course, but never you mind — it was a home run of an ad for the cryptocurrency exchange, even more impressive for the fact that it was launched at the height of crypto-hype.

Dude, Where’s My Lambo?

The excitement that dominated 2021 is now decidedly a thing of the past, as countless hungover cryptobros can attest. Scandals and missteps by a few cryptocurrency companies and exchanges — not to mention frenetic activity around NFTs — have brought an old-fashioned ad-law focus on some industry players.

Consider a recent class action complaint lodged against Coinbase in California Superior Court, San Francisco, which seeks injunctive relief “to protect members of the general public from dangerous misrepresentations made by [Coinbase], the country’s largest cryptocurrency investment platform.”

There’s nothing in the suit that’s unique to the underlying technology. This is a plain-old false advertising and unfair conduct case under California law.

The statements at issue involve security claims. According to plaintiffs Mostafa El Bermawy, Amish Shah and Manish Aggarwal, Coinbase bragged that it offered consumers “bank-level security” while doing a “poor job of protecting its user accounts from intrusion and [offering] much feebler protection than the security provided by banks and other crypto businesses.”

If the plaintiffs’ accusations are true, the “poor job” involved hundreds of thousands of dollars lost to hackers.

The Takeaway

The specific false and misleading statements alleged by the plaintiffs are too numerous to list here — they take up eight pages of the complaint — but many of them hang their hat on the similarities between Coinbase’s security protocols and those deployed by banks and other financial institutions.

The complaint also points to a 2023 consent order Coinbase entered into with the New York State Department of Financial Services “in which Coinbase agreed to pay $100 million for its failures to detect, prevent, and report illegal activity on its platform, including fraudulent transfers.

“The Department found that Coinbase failed to maintain adequate systems to monitor its customers’ accounts for suspicious activity and prevent abuse, in violation of state and federal law.”

We’ll see how Coinbase responds to — or refutes — the new allegations, but to us, the takeaway is clear: Do not claim an industry standard that you cannot meet — or have not met.

Additionally, if you are not part of the industry — or exist on the fringes of it — it may be dangerous to claim you’ve met its standards when there’s no apples-to-apples comparison with established methods and protocols.

So don’t do it.

Bored Ape Copycat Gets Anti-Anti-SLAPPED

Provocateur Ryder Ripps’ good intentions don’t constitute artistic expression

Irony Cubed

Here’s a quote for you:

“There has to be a moment when people are having a non-performative discussion — meaning all the layers of irony, the layers of double-meaning and emotions are taken out. Then, we’re just talking about the meaning and the impact and the purpose and the intention of things, almost with the sobriety of a legal setting.”

Those words were not spoken by a sober opinion writer in the pages of the Old Grey Lady or a talking head making yet another plea for civility and common sense on cable TV. No, they were spoken by Ryder Ripps, conceptual artist and internet provocateur extraordinaire. Ironically (there’s that word again), Ripps is famous precisely because of his own record of irony-drenched internet-inspired or -adjacent controversy. Our favorite story concerns how he trolled the CIA.

(You can read about Ripps, the CIA story and other wild tales, and the background of the case we’re discussing in detail here.)

Ripps’ surprising cri de coeur was meant to make a case for positive trolling — an attempt to call attention to our lack of attention or to unmask the sinister aims of an unchallenged opponent. And the “legal setting” mentioned at the end of his quote isn’t “almost” anything — his latest provocation landed him in court.

Best Intentions

On a subjective level, there’s something admirable in the behavior that put Ripps on the business end of a trademark infringement claim.

The artist and his co-defendants claim to have noticed similarities between NFT designs created and minted by Yuga Labs — specifically, certain Bored Ape Yacht Club NFTs — and “racist, neo-Nazi, and alt-right messages and imagery.” One of the examples was a BAYC NFT that bore a resemblance to the sigil of the dreaded Waffen SS.

Ripps responded by posting his evidence — and launching his own series of tokens pointing to the BAYC tokens, thereby pointing up the supposed racist underpinnings of the original.

Ripps justified his creation of the NFTs as “a form of ‘appropriation art’ that serves several purposes, including: (1) bringing attention to Plaintiff’s use of racist, neo-Nazi, and alt-right messages and imagery; (2) exposing Plaintiff’s use of unwitting celebrities and popular brands to disseminate offensive material; (3) creating social pressure demanding that Plaintiff take responsibility for its actions; and (4) educating the public about the technical nature and utility of NFTs.”

You can see an example of the three tokens, side by side, here.

Octo-Suit

Yuga Labs sued last summer for trademark infringement in the Central District of California; Ripps responded by firing off an anti-SLAPP motion to strike along with a motion to dismiss arguing that the appropriation of BAYC’s original images was protected by free speech under the Rogers test. The court disagreed with both attempts to shut down the case.

In that order, the court noted that a “‘collection of NFTs that point to the same online digital images as the BAYC collection’ is the only conduct at issue in this action” and that the reference of Ripps’ NFTs to the original BAYC tokens didn’t invoke First Amendment concerns. With this defense out the door, Ripps’ anti-SLAPP motion likewise failed.

While his appeal was pending, Ripps countersued; according to The Fashion Law, he reiterated his Rogers free speech claims and accused Yuga Labs of “‘knowingly and materially misrepresent[ing]’ its rights in the BAYC NFTs, while engaging in an ‘outrageous retaliatory campaign’ against them, including by ‘lying about’ – and ‘intimidating and threatening’ – them.”

Guess what? Yuga Labs moved to dismiss that complaint.

In an interesting wrinkle, Ripps’ counterclaims include allegations that Yuga Labs lacks copyright claims to the BAYC images because they were created using an image-generating algorithm. According to The Fashion Law, Ripps alleges that “Yuga used an automated computer algorithm to produce most if not all BAYC images” and that “a judgment from the court declaring that the BAYC images ‘are not entitled to copyright protection’ is warranted even if Yuga Labs did not make copyright infringement claims in the suit at hand.”

Remember — there are two cases now arising from the same facts, one of which is being appealed while the other is just getting started.

We get the feeling this epic tale will go on and on.

The Takeaway

The many tendrils of this case are yet to be resolved. We’re particularly interested in how the court handles the copyright questions arising from AI-generated imagery.

But — if we may return to the basic issue of the case — our advice isn’t for the Ryder Ripps out there — he had to know that his prank was risky. Instead, it’s for anyone crafting original NFTs (if, indeed, anyone still cares): Do your research.

Crafting iconography — or any form of artistic expression — bears its own risks. Let’s assume that Yuga Labs was innocent when it created the NFTs in question. Then why didn’t someone in its IP department question its choices when it was pumping out ape designs with inadvertent fascist echoes?

And, finally, we have to note the greatest irony of all: NFTs are blockchain-built. Wasn’t the underlying technology designed specifically to ensure unmistakable ownership? As one commentator put it:

“What’s weird is that the implementation of NFTs is supposed to be a verification mechanism but it doesn’t look like it’s doing its job very well … The idea is that you can use the token address as a verification code, but [BAYC] doesn’t have a list of these codes for people to cross-check. So to examine this, you would have to go to Etherscan and look through every single trade from BAYC. With that much effort required, I bet many people either don’t know or don’t bother to check it.”
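For the curious, here is a minimal sketch of what “using the token address as a verification code” could look like in practice, assuming web3.py (v6) and access to an Ethereum JSON-RPC node. The RPC URL and contract address below are placeholders we made up for illustration, not BAYC’s actual values.

    from web3 import Web3

    RPC_URL = "https://example-node.invalid"  # placeholder JSON-RPC endpoint
    OFFICIAL_CONTRACT = "0x0000000000000000000000000000000000000000"  # the collection's published contract address (placeholder)

    # Minimal ERC-721 ABI fragment: just the ownerOf() view function.
    ERC721_ABI = [{
        "name": "ownerOf",
        "type": "function",
        "stateMutability": "view",
        "inputs": [{"name": "tokenId", "type": "uint256"}],
        "outputs": [{"name": "", "type": "address"}],
    }]

    def verify_token(listed_contract: str, token_id: int) -> bool:
        """A listing is 'authentic' only if it points at the collection's official
        contract address and the token actually exists on that contract."""
        if listed_contract.lower() != OFFICIAL_CONTRACT.lower():
            return False  # copycat collections live at different contract addresses
        w3 = Web3(Web3.HTTPProvider(RPC_URL))
        contract = w3.eth.contract(
            address=Web3.to_checksum_address(OFFICIAL_CONTRACT), abi=ERC721_ABI
        )
        try:
            contract.functions.ownerOf(token_id).call()  # reverts if the token was never minted
            return True
        except Exception:
            return False

As the commentator suggests, the lookup itself is not the hard part; the problem is that most buyers never bother to do it.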

Twitter Accused of … Getting Hacked

But is that the same as betraying its users?

Well, Duh

“At no point does Twitter disclose in their Privacy Policy that they allow cybercriminals to commandeer Twitter’s API in order to scrape sensitive PII from Twitter and to then weaponize or sell that information on the dark web,” says a class action freshly filed in the Northern District of California. An “API,” or Application Programming Interface, is a way for computer programs to communicate with one another.

The plaintiff, New Yorker Stephen Gerber, is a victim of a “scraping” attack, whereby hackers exploited a flaw in Twitter’s API and gathered “usernames, email addresses and phone numbers” from users. According to Gerber’s account, that information is now being posted on the “dark web,” that subterranean domain of hackers, scammers and dealers in all manner of creepy things.
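To make the mechanics concrete, here is a minimal, purely hypothetical sketch of how a lookup-style API flaw enables scraping at scale. The endpoint, parameter names and response format are invented for illustration; they are not Twitter’s actual API.

    import requests

    LOOKUP_URL = "https://api.example.invalid/v1/users/lookup"  # invented endpoint

    def scrape_accounts(contact_list):
        """For each email or phone number on an attacker's list, ask the endpoint
        which account (if any) it belongs to, and collect the matches."""
        matches = []
        for contact in contact_list:
            resp = requests.get(LOOKUP_URL, params={"contact": contact}, timeout=10)
            if not resp.ok:
                continue
            data = resp.json()
            if data.get("username"):
                # The flaw: the endpoint confirms the link between a private
                # identifier (email or phone) and a public handle, with no
                # authentication or meaningful rate limiting, so it can be
                # looped over millions of times.
                matches.append({"contact": contact, "username": data["username"]})
        return matches

The output is a table linking private contact details to public profiles, which is essentially the dataset the complaint describes being offered on the dark web.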

Problem is, Gerber doesn’t offer any evidence that his or anyone else’s information has actually been posted anywhere. And while we might assume, in this day and age, that all the information has indeed been posted everywhere, is that assumption the same as demonstrating harm?

Gerber also claims that the breach violates the terms of a 2011 agreement between Twitter and the Federal Trade Commission. “[Twitter] shall not misrepresent in any manner, expressly or by implication, the extent to which [Twitter] maintains and protects the security, privacy, confidentiality, or integrity of any nonpublic consumer information,” the complaint states.

The Takeaway

There seems to be some substance missing, though, from the argument about what constitutes negligence, both under Gerber’s legal theories and under the FTC agreement. Of course, Twitter didn’t disclose that it planned to allow cybercriminals to take over its API. Had the company done so, would it have been exempted from legal action?

You see, there’s plenty of noise in the complaint to the effect that Twitter failed its users. “The [Personally Identifiable Information] of Plaintiff and the Class was lost and accessed as the proximate result of Defendant’s failure to exercise reasonable care in safeguarding such PII by adopting, implementing, and maintaining appropriate security measures,” Gerber states. But nowhere does he establish what that reasonable care should have been, or how Twitter fell short.

Is Twitter’s failure to repel the attack a violation of its responsibility to exercise reasonable care? Does it amount to misrepresentation?

We get the feeling that this complaint is going to be chewed over quite a bit by the court and opposing counsel, if it even goes anywhere at all.

Flo Rida Flexes Influencer Muscle with Trial Win

Beverage maker Celsius needs to pour out $82 mil to rapper

State of Confusion

Here’s how out of it we are.

It wasn’t until we were assigned this story by the fearless AD-ttorneys@law editorial oversight board of governors that we realized that rapper Flo Rida’s stage name was simply “Florida” with a space between the “o” and the “r.”

We had no idea. But then we began reading about his recent win in Florida’s 17th Judicial Circuit Court, and there it was, suddenly as clear to us as the sunshine in the state nickname — the most obvious pun of all time.

However, we must confess that once we shook the cobwebs from our heads and got back to the case at hand, we went right back to being befuddled.

Don’t Know How to Act

The complaint itself is light on stock option amounts and dollar figures, but we think we understand the outline of the case. Flo (can we call him that?) entered into a contract with Celsius, a beverage company, back in 2014.

In exchange for endorsement and brand ambassadorship, the rapper was given an initial cash and stock package and incentivized to hype the energy/workout drinks with product sale and unit goal bonuses — a supplement of 750,000 shares in the company if the goals were met. 

Mr. Rida — we’d refer to him as Tramar Dillard, but his real name just isn’t as cool — claimed that the company had hit the product and unit sales that triggered his supplemental payment. The problem was, according to Rida, that the company failed to “properly calculate and pay royalties” and “concealed material facts” relating to the disposition of the agreement.

In present-day dollars, Rida’s supplemental 750,000 shares amounted to a cool $83 million. The case, filed in May 2021, went to trial, and a jury awarded him the full amount.
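A quick back-of-the-envelope check of those figures (our arithmetic, not a number taken from the complaint or the verdict):

    $83,000,000 ÷ 750,000 shares ≈ $110 per share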

The Takeaway

Celsius argued at trial that its success, which drove up the price of its stock, was not due to the rapper’s endorsement, but that argument clearly fell on deaf ears. A stock is a stock. If it was promised as compensation early on, that doesn’t change the fact that its value can rise or fall for any number of reasons.

No, the real takeaway from the case is Flo Rida’s success in the courtroom.

The influencer has come of age as a financial stakeholder in the success of his or her clients. When you’re cutting a deal with your favorite TikTok or YouTube celebrity, make sure you’re clear about what you’re promising them, and then stick with the agreement.

There will be another fresh crop of influencers who will be watching how you treat their forebears.

Check Out Our Latest Blog Posts

Latest FTC Health Privacy Case Sheds Light on Agency Health Privacy Approaches

Health privacy has been a Federal Trade Commission (FTC) priority for decades, and indeed, one of its very first privacy cases, in the early 2000s, involved the inadvertent sharing of user health data. Fast-forward a few decades, and health privacy remains a major concern. Case in point: The latest FTC privacy enforcement action focuses on the practices of GoodRx and is the first FTC case to allege a violation of the Health Breach Notification Rule (HBNR or Rule). This enforcement action should serve as a warning shot to companies dealing in health information, reminding them that just because they do not fall under the Health Insurance Portability and Accountability Act (HIPAA) does not mean they are free to use the data they collect without potential regulatory consequences.

Is It Just a Puff?

My family knows that I get grumpy if we get to the theater after the previews have started, as previews are among my favorite parts of the in-theater versus stream-from-home experience. Yet many moviegoers may feel that the trailer didn’t accurately reflect the movie as a whole. But does such a feeling of disappointment rise to the level of a legal violation? A recent court decision involving a class action over the trailer for the movie “Yesterday” provides a good example regarding the difficult line between a claim and a puff.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© BakerHostetler | Attorney Advertising
