Welcome to this month’s issue of The BR Privacy & Security Download, the digital newsletter of Blank Rome’s Privacy, Security & Data Protection practice.
RECENT HIGHLIGHT
CJEU Refines Scope of ‘Personal Data’ in Pseudonymization Process for EU Institutions
Blank Rome partners Sharon R. Klein, Alex C. Nisenbaum, and associates Karen H. Shin and Sierra N. Lactaoen authored this alert discussing a recent and significant judgment addressing the interpretation of “personal data” and “pseudonymization” under EU data protection rules. Read Now »
STATE & LOCAL LAWS & REGULATION
States Engage in Investigative Sweep of Global Privacy Control Compliance: The California Privacy Protection Agency (“CPPA”), alongside the attorneys general of California, Colorado, and Connecticut, announced an investigative sweep involving potential noncompliance with the Global Privacy Control (“GPC”), a browser setting or extension that automatically signals to businesses a consumer’s request to opt out of the sale of their personal information to third parties or its processing for targeted advertising purposes. As part of the sweep, the coalition is contacting businesses that may not be processing consumer requests to opt out of the sale of their personal information submitted via the GPC as required by law and requesting that those businesses comply. This sweep reinforces California’s, Colorado’s, and Connecticut’s 2025 Data Privacy Day educational efforts on the GPC.
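For businesses evaluating their own GPC handling, the signal itself is straightforward: under the GPC specification, a participating browser attaches a `Sec-GPC: 1` request header to HTTP requests. The following is a minimal illustrative sketch (the `honors_gpc` helper and the sample header dictionaries are hypothetical, not from any regulator’s guidance) of detecting the signal server-side:

```python
def honors_gpc(request_headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control opt-out signal.

    Per the GPC specification, a participating browser sends the
    `Sec-GPC: 1` request header; any other value, or its absence,
    means no opt-out preference was expressed.
    """
    # HTTP header names are case-insensitive, so normalize keys before lookup.
    normalized = {k.lower(): v.strip() for k, v in request_headers.items()}
    return normalized.get("sec-gpc") == "1"


# A request from a GPC-enabled browser carries the signal:
print(honors_gpc({"Sec-GPC": "1", "User-Agent": "ExampleBrowser"}))  # True
# A request without the header expresses no opt-out preference:
print(honors_gpc({"User-Agent": "ExampleBrowser"}))                  # False
```

A business that treats a `True` result as a valid opt-out request for the requesting consumer would be honoring the signal in the manner the sweep is examining; how that opt-out must then be effectuated is governed by each state’s statute and regulations.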
California ADMT and Cybersecurity Audit Rules Get Final Approval: The CPPA announced the approval of comprehensive privacy regulations by the California Office of Administrative Law. These rules address cybersecurity audits, risk assessments, and automated decisionmaking technology (“ADMT”), with compliance timelines based on business size. Cybersecurity audit certifications are due between 2028 and 2030, depending on revenue. Risk assessments must begin by January 1, 2026. ADMT compliance starts January 1, 2027. For more detailed information about the CPPA’s ADMT and cybersecurity rules, see our client alert.
Oregon Attorney General Releases One Year Report on Oregon Consumer Privacy Act: The Oregon Attorney General released a report with the results from the first year of the Oregon Consumer Privacy Act (“OCPA”) taking effect. In the first year of enforcement, the Privacy Unit within the Civil Enforcement Division at the Oregon Department of Justice (“Privacy Unit”) received 214 complaints. The majority of complaints were about online data brokers, including websites that provide background reports for a fee. The right most frequently requested by Oregonians and denied was the right to delete their personal data. The Privacy Unit initiated and closed 38 matters after sending notices to cure violations and broader information requests to companies. As of January 1, 2026, the 30-day cure period provided under the OCPA sunsets, and controllers will be prohibited from selling precise geolocation data and from selling the personal data of children under 16 or using such data for targeted advertising and certain types of profiling.
California Passes AI Chatbot Safety Law: The California Legislature passed Senate Bill 243 (“SB 243”), a first-of-its-kind law that would require chatbot operators to implement safeguards relating to interactions with chatbots and provide individuals with a right to pursue legal action against noncompliant and negligent developers. Specifically, SB 243 would require providers of “companion chatbots”—artificial intelligence (“AI”) systems with a natural language interface that provide adaptive, human-like responses to user inputs and are capable of meeting users’ social needs—to provide clear and conspicuous notice that the companion chatbot is not a human and to put in place a protocol for preventing the chatbot from producing suicidal ideation content, among other safety protocols. Companion chatbots do not include bots used only for customer service or other business operational purposes.
California Enacts AI Transparency Law: California enacted SB 53, the Transparency in Frontier Artificial Intelligence Act (the “Act”), requiring large AI developers (with annual revenues over $500 million) to publicly disclose safety frameworks for their frontier models. The law mandates clear publication of safety standards, reporting mechanisms for critical incidents, and whistleblower protections. It empowers the Office of Emergency Services to collect confidential risk assessments and incident reports, including model misuse or loss of control.
New York Attorney General Releases Proposed Rules for SAFE for Kids Act: New York Attorney General Letitia James released proposed rules for the New York SAFE for Kids Act, which is aimed at curbing addictive social media features that harm children’s mental health. The rules require platforms to restrict algorithmically personalized feeds and nighttime notifications for users under 18 unless parental consent is obtained. The proposed regulations detail how companies must verify users’ age and obtain parental consent. A 60-day public comment period is open until December 1, 2025. The law empowers the Attorney General to enforce compliance and seek penalties up to $5,000 per violation. The SAFE for Kids Act, signed into law in June 2024, complements the New York Child Data Protection Act, which limits data collection from minors.
California Legislature Passes Web Browser Opt-Out Requirement: The California Legislature passed Assembly Bill 566, the California Opt Me Out Act (“AB 566”), a bill that would require all web browsers to support opt-out preference signals (“OOPS”). Sponsored by the CPPA, the bill would allow consumers to limit the sale and sharing of personal data with a single browser setting. While some privacy-focused browsers already support OOPS, AB 566 would require universal access. If signed into law, California would become the first state to require browser support for OOPS, potentially setting a national precedent for consumer data protection. The California Legislature sent a similar bill to Governor Newsom last year, but the governor vetoed that bill. Regulators have been stepping up focus on honoring privacy requests made through opt-out preference signals, which are currently required to be honored under about a dozen state comprehensive privacy laws. Recently, the CPPA, along with the attorneys general of California, Colorado, and Connecticut, announced an investigative sweep into whether companies were adhering to their obligations to respond to consumers’ requests to opt out of the sale and sharing of their personal data via automated Global Privacy Control requests (see the more detailed description above, “States Engage in Investigative Sweep of Global Privacy Control Compliance”).
FEDERAL LAWS & REGULATION
U.S. Department of Defense Finalizes Cybersecurity Rule for Contractors: The Department of Defense (“DoD”) published its final rule amending the Defense Federal Acquisition Regulation Supplement (“DFARS”) to implement the Cybersecurity Maturity Model Certification (“CMMC”) program. Effective November 10, 2025, this rule applies to all DoD contractors and subcontractors—excluding those providing only commercially available off-the-shelf items—who process, store, or transmit Federal Contract Information or Controlled Unclassified Information. The CMMC framework provides three tiers of cybersecurity maturity and requirements depending on the types and sensitivity of information at issue. Contractors will have to post their CMMC status in the Supplier Performance Risk System at the time of being awarded a relevant DoD contract. Subcontractors will have to affirm continuous compliance and submit self-assessment results for review. The final rule may have significant implications for entities contracting with the DoD and compliance with these standards will be essential for securing new DoD contracts.
FTC Launches AI Chatbot Inquiry: The Federal Trade Commission (“FTC”) issued orders to seven companies that provide consumer-facing AI-powered chatbots seeking information on how these firms measure, test, and monitor potentially negative impacts of this technology on children and teens. The FTC is particularly interested in what actions these companies are taking to mitigate potential negative impacts, limit or restrict children’s or teens’ use of these platforms, or comply with the Children’s Online Privacy Protection Act (“COPPA”). As part of its inquiry, the FTC is seeking information about how the companies: monetize user engagement; process and respond to user inputs; develop and approve characters; disclose information to users and parents about features, risks, and data practices; monitor and enforce compliance with company rules and terms of service (e.g., community guidelines and age restrictions); and use or share personal information obtained through users’ conversations with the chatbots.
U.S. LITIGATION
Jury Hits Google with $425M Verdict for Deceiving Users About Privacy Settings: A jury in San Francisco found Google liable for invasion of privacy and intrusion upon seclusion, awarding plaintiffs $425,651,947 in compensatory damages. Plaintiffs’ claims related to a Google account setting that claimed to allow users to opt out of Google tracking their activity across third-party apps. However, Plaintiffs alleged that even after a user opted out of this third-party tracking, Google continued to collect and store information relating to their activity. Google argued that it anonymized any information it collected from individuals who had opted out and did not associate it with their accounts. While the jury found for Plaintiffs on their invasion of privacy and intrusion upon seclusion claims, it rejected their claims under the California Comprehensive Computer Data Access and Fraud Act and the federal Wiretap Act. Despite the large compensatory award, the jury declined to award punitive damages. This case highlights the importance of clearly labeling opt-out options, especially where information will still be collected but only in an anonymized format.
California Attorney General Argues 23andMe Sale Violated California State Privacy Law: The state of California has launched another challenge to the proposed bankruptcy sale of genetics company 23andMe, filing an appellate brief arguing that the Missouri bankruptcy court erred in approving two Chapter 11 sales that transferred 23andMe’s database of consumer genetic profiles to TTAM Research Institute, a nonprofit started by 23andMe’s cofounder and former CEO. In its brief, the state argued that neither TTAM nor another affiliate to which the data was first sold is a good faith purchaser under the Bankruptcy Code. Instead, the state argued that the deal was structured to avoid 23andMe having to obtain consumer consent for the disclosure, as would be required under the Genetic Information Privacy Act. The sale of 23andMe’s assets to TTAM, which has since renamed itself the 23andMe Research Institute, closed in July.
Ninth Circuit Upholds Most of California Social Media Addiction Law: The Ninth Circuit largely affirmed a district court ruling from December denying NetChoice LLC’s attempts to preliminarily enjoin California from enforcing the Protecting Our Kids from Social Media Addiction Act (the “Act”), or SB 976. The Act was enacted in September 2024 and was to take effect January 1, 2025. The Act prohibits social media platforms from providing “addictive feeds” to minors without parental consent, creates new parental control requirements, restricts when notifications can be sent to minors, and creates new transparency reporting obligations. NetChoice asserted that the Act would violate the First Amendment by unlawfully restricting the manner in which social media platforms communicate with minors. A California district court rejected this claim in December 2024. The three-judge Ninth Circuit panel determined that the district court did not abuse its discretion in finding that NetChoice, an internet trade association whose members include Google, Meta, and X, did not have associational standing on behalf of its members to launch its challenge to the law. It found that the First Amendment analysis is fact intensive and would likely vary from platform to platform, making NetChoice’s associational standing inappropriate. However, the panel did find that NetChoice is likely to prevail on its contention that the requirement for children’s social media accounts to not show the number of likes, shares, or other feedback on a post by default is unconstitutional. It said the like-count requirement is content-based and subject to strict scrutiny.
Second Circuit Splits with Fifth Circuit to Uphold FCC Fine: A three-judge panel in the Second Circuit unanimously upheld the Federal Communications Commission’s (“FCC”) $46.9 million fine against Verizon Communications Inc. for misuse of device-location data. The Second Circuit held that device-location data qualifies as customer proprietary network information under section 222 of the Communications Act, rejecting Verizon’s argument that only call-location data is covered. It found that the statutory language and legislative history support broad protection for location data obtained through the carrier-customer relationship. The panel rejected Verizon’s arguments that the data falls outside federal privacy protections and that such a penalty without a jury trial is unconstitutional, creating an apparent split with the Fifth Circuit’s recent dismissal of a $57 million fine against AT&T based on the Supreme Court’s 2024 decision in Securities and Exchange Commission v. Jarkesy.
U.S. ENFORCEMENT
CPPA Issues Record CCPA Fine: The CPPA has fined Tractor Supply Company $1.35 million for violating the California Consumer Privacy Act (“CCPA”). The fine is the largest CCPA fine ever levied. The enforcement action followed a consumer complaint and revealed multiple alleged violations, including failure to maintain a compliant privacy policy, neglecting to inform California job applicants of their privacy rights, lacking effective opt-out mechanisms for data sharing (including Global Privacy Control signals), and disclosing personal data without proper contractual safeguards. The enforcement action marks the first CPPA decision addressing job applicant privacy rights. Tractor Supply Company agreed to implement remedial measures such as scanning its digital properties for tracking technologies and certifying compliance annually for four years.
FTC Settles with Entertainment Company over Alleged COPPA Violations: Disney Worldwide Services, Inc. and Disney Entertainment Operations LLC (collectively, “Disney”) will pay the FTC $10 million to settle allegations that they violated COPPA by collecting personal data from children who viewed child-directed videos on YouTube without notifying parents or obtaining their consent. The FTC alleged that Disney failed to properly label some videos that it uploaded to YouTube as “Made for Kids.” The mislabeling allowed Disney, through YouTube, to collect personal data from children under 13 viewing child-directed videos and use that data for targeted advertising to children. This mislabeling also exposed children to age-inappropriate YouTube features. Under the settlement, Disney must comply with COPPA, notify parents before collecting children’s data, obtain verifiable parental consent, and implement a video-review program to ensure proper audience designation.
University Settles Whistleblower Lawsuit Alleging Violations of Federal Contractor Cybersecurity Standards: The Georgia Tech Research Corporation (“GTRC”) has agreed to pay $875,000 to settle a whistleblower lawsuit alleging violations of federal cybersecurity standards in defense contracts. The suit, initiated by former cybersecurity staffers, claimed GTRC failed to use proper antivirus software and submitted false cybersecurity scores to the DoD. The Department of Justice intervened, citing two contracts worth $31.2 million as sources of false claims. GTRC denied wrongdoing, asserting strong compliance efforts and criticizing the government’s “rush to judgment.” Despite the allegations, the DoD continued awarding contracts to GTRC’s Astrolavos Lab. The settlement avoids litigation without admission of liability. The whistleblowers will receive over $200,000.
FTC Settles with Chinese E-Commerce Platform over INFORM Consumers Act Violations: The FTC has settled with Whaleco, Inc., which operates the online marketplace Temu, for alleged violations of the INFORM Consumers Act (“INFORM Act”). This is the first action to enforce the INFORM Act. The INFORM Act requires online marketplaces to provide consumers with clear information about high-volume third-party sellers and accessible tools to report suspicious activity, such as counterfeit or unsafe goods. The FTC’s complaint alleges that Temu failed to clearly and conspicuously disclose required seller information, such as names, addresses, and contact details, especially in its gamified product listings and mobile website. Additionally, Temu did not provide a telephonic reporting mechanism for suspicious activity, as mandated by law. The proposed settlement requires Temu to implement compliant reporting systems and make all required disclosures easily accessible across all platform versions. The proposed settlement also requires Temu to pay a $2 million civil penalty.
FTC Settles with Robot Toy Maker over Allegations of Wrongful Collection of Children’s Location Data: The FTC entered into a stipulated settlement order with Apitor Technology, a China-based manufacturer of programmable robot toys for children ages six to 14, for violating COPPA. Apitor’s companion mobile app, required to operate the toys, integrated a third-party software development kit called JPush, which enabled the collection of precise geolocation data from child users without parental consent. The FTC found that this data was used for multiple purposes, including targeted advertising, and that the practices had been ongoing since at least 2022. To settle the matter, Apitor agreed to (1) ensure that all third-party software integrated or necessary for the functionality of its toys complies with COPPA; (2) notify parents and obtain verifiable consent before collecting children’s data; (3) delete previously collected children’s data at parental request and only retain data as long as necessary for the purposes of collection; and (4) pay a $500,000 penalty.
SEC Brings Suit Against Investment Adviser and Firm: The U.S. Securities and Exchange Commission (“SEC”) filed a lawsuit against an investment adviser representative and his firm for violations of Regulation S-P. The SEC alleged that the investment adviser, among other things, e-mailed himself confidential client information from his former firm to solicit clients to start his own firm. The information included names, account values, and billing information. Rule 10 of Regulation S-P prohibits investment advisers from disclosing nonpublic personal information about a consumer to a nonaffiliated third party unless certain conditions are met. The SEC further alleged that the investment adviser reallocated, without notice, at least 109 client accounts into an equity strategy that his former firm’s investment committee had not approved, and changed 26 clients from a previously approved strategy to a different, unapproved strategy. The investment adviser also placed at least one client in investments that were contrary to such client’s instructions, breaching his fiduciary duty.
Oklahoma Attorney General Seeks Proposals for Legal Action Against Chinese E-Commerce Platform over Data Privacy and Security Concerns: The Oklahoma Attorney General issued a request for proposals for outside counsel to investigate and pursue legal action against Temu amid concerns of consumer protection violations. Recently, privacy and data security concerns have been raised over Temu, with Apple and Google suspending the app from their stores due to misrepresentations Temu made about the types of data the app can access or collect from users, how it does so, and for what purposes it uses the data. The Oklahoma Attorney General alleges that Temu, once downloaded, can secretly install malware that bypasses device security and grants the app unrestricted access to a user’s device. The Oklahoma Attorney General also alleges that, because Temu is a Chinese e-commerce platform, the app is subject to the laws of China, which require Temu to provide user data to the Chinese government upon request. Proposals were due October 3, 2025.
INTERNATIONAL LAWS & REGULATION
European General Court Upholds Validity of EU-U.S. Data Privacy Framework: The European General Court upheld the validity of the EU-U.S. Data Privacy Framework (“DPF”), dismissing a legal challenge by French MP Philippe Latombe. The Court affirmed the European Commission’s 2023 adequacy decision, supporting the DPF’s safeguards, including the U.S. Data Protection Review Court, and limits on bulk data collection. The DPF is used by more than 3,400 entities. The ruling could prompt an appeal to the European Court of Justice, potentially reigniting scrutiny of U.S. surveillance practices.
CJEU Issues Judgment on Scope of Personal Data: The Court of Justice of the European Union (“CJEU”) issued a ruling refining the scope of “personal data” and pseudonymization under EU data protection law. The case centered on the Single Resolution Board’s transfer of pseudonymized shareholder comments to Deloitte. Although Deloitte lacked access to the re-identification database, the CJEU held that the comments still constituted personal data. The CJEU reaffirmed that under the GDPR, personal data includes any information relating to an identified or identifiable natural person—defined broadly to encompass subjective content like opinions, which inherently reflect the author’s identity. Pseudonymized data remains personal if the controller or recipient can reasonably re-identify the subject using available means, considering cost, time, and technology. Importantly, the CJEU ruled that data controllers must inform subjects of third-party recipients at the time of collection, regardless of pseudonymization. This decision underscores the GDPR’s emphasis on transparency and reinforces that pseudonymization does not exempt controllers from notification obligations. For more information on the CJEU decision, see our client alert on the topic.
Ontario Information and Privacy Commissioner Issues First Administrative Penalty Under Health Privacy Law: The Office of the Information and Privacy Commissioner of Ontario has issued an administrative monetary penalty against a doctor and a private clinic under the Ontario Personal Health Information Protection Act (“PHIPA”). This is the first administrative monetary penalty issued by a privacy commissioner in Canada. The doctor has been ordered to pay a $5,000 penalty for accessing and using patients’ hospital records without authorization for personal financial gain. The clinic has been ordered to pay a penalty of $7,500 for failing to meet its most basic obligations under PHIPA.
China Penalizes Company for Personal Information Protection Law Violations: Chinese regulators penalized Dior’s Shanghai subsidiary for violating the Personal Information Protection Law, citing unauthorized cross-border data transfers, inadequate user consent, and weak technical safeguards. Among other things, the Cyberspace Administration of China (“CAC”) alleged Dior transferred customer data to France without using one of three approved mechanisms for cross-border transfers: CAC security assessments, certifications, or standard contracts. The CAC also found that Dior failed to implement appropriate consent for overseas transfers.
China Releases Artificial Intelligence Safety Governance Framework 2.0: China released the Artificial Intelligence Safety Governance Framework 2.0, expanding on the 2024 version to address emerging risks and strengthen oversight. Issued by the CAC, the framework introduces graded risk classifications, technical safeguards, and ethical requirements into technical standards. It provides more detailed guidance on classification of AI safety risks, technological countermeasures to address risks, and governance measures. The framework is intended to position the country as a leader in shaping global AI governance norms.