Dechert Cyber Bits - Issue 17

Dechert LLP

California Privacy Protection Agency Proposes CPRA Regulations as the ADPPA Continues to Advance in Congress

On July 8, 2022, the California Privacy Protection Agency (“CPPA”) filed a Notice of Proposed Rulemaking Action, commencing the formal process to adopt its proposed regulations implementing the California Consumer Privacy Rights Act of 2020 (“CPRA”).

The proposed regulations are intended to update existing regulations under the California Consumer Privacy Act of 2018 (“CCPA”) and add new rules to implement and interpret changes to the CCPA made by the CPRA. The CPPA previously published the proposed regulations on its website in May 2022, and discussed them at a June 8, 2022 meeting.

The draft rules take broad liberties interpreting the CCPA and impose highly prescriptive obligations on companies, including limits on dark patterns; a right to correct (akin to the GDPR’s right to “rectification”); mandatory recognition of global opt-out signals for selling and sharing of personal information; new enforcement powers and mechanisms, including “probable cause proceedings”; and Agency audits. For a full discussion of the draft regulations, see Dechert’s recent OnPoint.

The CPPA’s notice starts the initial public comment period. Interested parties may submit comments in writing no later than August 23, 2022, or by speaking at the public hearings scheduled to be held on August 24 and 25, 2022.

The CPPA is already behind schedule in finalizing regulations to implement the CPRA, which by statute were required to be finalized by July 1, 2022. At the same time, the U.S. House of Representatives continues its consideration of a draft bipartisan federal privacy bill, the American Data Privacy and Protection Act. This Act, if adopted, would likely pre-empt most of the CCPA and CPRA. The CPPA wrote to House Speaker Nancy Pelosi on July 1, 2022, expressing its concerns about the pre-emptive effects of the bill’s provisions. In addition, California’s Attorney General, joined by other state Attorneys General, is urging Congress to adopt a federal law that would preserve the states’ ability to address emerging privacy concerns. The House Committee on Energy and Commerce has voted to advance the bill to the House floor.

Takeaway: The new proposed regulations take a broad view of California’s latest privacy statute and, if adopted, would impose further obligations on covered entities with respect to user rights and expand the Agency’s enforcement powers. But the statute (and ensuing regulations) could be pre-empted by pending federal legislation. Interested parties should watch both the CPPA’s rulemaking process and the federal privacy bill closely. Assuming the CCPA is not pre-empted by a federal bill, businesses will likely have limited time to ensure compliance with new CCPA regulations before they take effect.

CFPB Issues Opinion on Permissible Uses of Consumer Credit and Background Reports

The Consumer Financial Protection Bureau (“CFPB”) has issued an advisory regarding furnishing, using, and obtaining consumer reports under the Fair Credit Reporting Act (“FCRA”). Consumer reports include not only credit reports, but criminal, employment, or rental histories prepared by consumer reporting agencies and commonly used by creditors, insurers, landlords, and employers. In a July 7, 2022 press release, the CFPB reiterated that reporting agencies and those who use such reports have specific obligations to protect the public’s data privacy and warned that it would be taking steps to curb improper disclosures.

Under the FCRA, companies may not distribute or request a consumer report without a permissible purpose. The new advisory clarifies that companies must take reasonable steps to ensure they are only sharing or requesting data pertaining to the individual for whom they have a permissible purpose. Problematic practices specifically identified by the CFPB include:

  • Insufficient matching procedures that may pull information on incorrect or multiple individuals (the advisory specifically flagged “name-only” as one such insufficient procedure);
  • Providing reports on a possible match or matches instead of taking further steps to ensure the correct consumer has been identified before the report is produced; and
  • Requesting information on consumers without a permissible purpose, especially due to errors made by those placing the requests.

The CFPB noted that disclaimers about insufficient matching procedures will not cure a failure to have reasonable procedures in place to prevent impermissible disclosures. Finally, it reiterated that knowingly and willfully sharing or obtaining consumer information without a permissible purpose could result in criminal liability under the FCRA.

Takeaway: The new guidance could signal greater scrutiny for companies who produce or use reports involving consumer data. Consumer reporting agencies may need to implement more robust matching procedures to ensure that only information pertaining to the correct consumer is disclosed. And users of consumer reports may benefit from additional precautions and training to prevent what could be increasingly costly erroneous requests.

Digital Regulation Incoming: EU Parliament Adopts DMA and DSA by Wide Margins

The Digital Markets Act (“DMA”) and the Digital Services Act (“DSA”) have both passed the EU legislative process. Political agreement was reached on March 25, 2022 for the DMA (see Cyber Bits Issue 11) and on April 23, 2022 for the DSA (see Cyber Bits Issue 13). Most recently, on July 5, 2022, the DSA and the DMA were adopted by the European Parliament. The DMA was approved by the Council of the European Union on July 18, 2022, with approval for the DSA expected to follow in September 2022. Following approval, the Acts will be published in the Official Journal and enter into force 20 days after publication.

However, the DMA and DSA will not become applicable (i.e., enforceable) immediately. The DSA will be generally applicable fifteen months after its entry into force or from January 1, 2024, whichever is later. Certain entities designated as very large online platforms or search engines will have a shorter ramp-up, as the DSA will apply to them four months after they have been designated as such by the European Commission. The DMA will be applicable six months after its entry into force, and “gatekeepers” (identified in terms of revenue and number of users, though smaller companies can be designated as such by the EU Commission) will have a maximum of six months after designation to comply with the new obligations.

As a reminder, the DSA targets online intermediaries (such as online marketplaces, cloud companies, and large search engines), increasing their accountability to end users and supervisory authorities by addressing challenges like hate speech, cyber threats, the sale of fake products, and targeted advertising. Very large online platforms will have additional reporting and audit requirements.

While the DSA focuses on the relationship between services and their users, the DMA aims to govern competition between “gatekeeper” businesses that provide core platform services (such as online search engines, social networking services, and virtual assistants). The DMA contains a series of “do’s” and “don’ts” designed to prevent certain business practices and to protect smaller businesses.

Takeaway: It remains to be seen how these Acts, which moved through the legislative process at high speed, will be implemented and enforced in practice. Uniform application of the DSA and DMA across EU member states will be essential, and national legislators still have to amend national laws to enable uniform enforcement. Critics caution that current resourcing plans may be inadequate for effective enforcement. Even so, businesses active in the digital space, and particularly the Big Tech companies that are the primary targets of the digital package, should start looking now at the steps they will need to take for compliance.

U.S. GAO Advises that Federal Action May be Needed to Address Risk of “Catastrophic Financial Loss” from Cyberattacks

The U.S. Government Accountability Office issued a report warning that businesses in the infrastructure sector may soon face gaps in cyber insurance coverage, leading to a potential for “catastrophic financial loss.” The report recommended that the Federal Insurance Office and the Cybersecurity and Infrastructure Security Agency jointly assess the need for a federal response to address the situation.

While private cyber insurance generally covers common cyber risks such as data breaches and ransomware attacks, private insurers have been taking steps to limit potential losses from systemic cyber events by excluding or limiting coverage for losses from cyber warfare and infrastructure outages. While these attacks are frequently perpetrated by nation-state actors, such as hacking groups linked to Russia, China, Iran, and North Korea, the Terrorism Risk Insurance Program (“TRIP”) – the U.S. Government’s stop-gap measure for terrorism-related losses – may not cover such cyberattacks. TRIP covers losses from cyberattacks if they are considered “terrorism,” but the attacks must be violent or coercive in nature, among other criteria, to be certified as terrorism.

The rise in cyberattacks targeting critical infrastructure, coupled with the reluctance of private insurance companies to provide comprehensive cyber insurance, led to the Government Accountability Office’s suggestion that a federal insurance response, akin to FDIC insurance for bank deposits or the National Flood Insurance Program, should be considered.

Takeaway: Companies in critical infrastructure sectors, such as utilities or financial services, should monitor their cyber insurance policies and renewals carefully and prepare for the potential of reductions in coverage while shoring up cybersecurity defenses.

Independent Review of UK Legislation Says Country “urgently” Needs New Laws on Use of Biometrics

A report from a more than year-long review of UK legislation, commissioned by the independent Ada Lovelace Institute (a research body promoting informed public understanding of the impact of AI), calls on the UK to enact new laws on biometrics in order to protect the public.

The report recognizes that biometric data does not have a standardized definition, but is generally understood to comprise facial recognition, fingerprints, voice recognition, DNA profiles, iris scans and other bodily and behavioral measurements. Biometrics can be used to identify individuals, but also to categorize and draw conclusions about the behavior of groups.

The report made ten recommendations, including that the use of live facial recognition (“LFR”) in public should be suspended until an appropriate legal framework for its use is made fit for purpose. The use of LFR has made headlines in the UK in the last few years, particularly in the context of its use by law enforcement and in connection with Clearview AI, Inc. In 2020, the UK Court of Appeal, in a decision involving South Wales Police Force, did not rule out the use of LFR, but did find that the Force’s use had been unlawful. The UK Information Commissioner’s Office has also issued opinions raising concerns with the use of LFR in public places, both generally and in the law enforcement context. In May 2022, the UK Information Commissioner’s Office ordered Clearview to delete facial recognition data belonging to UK residents and pay a fine of £7.5 million.

The concerns with LFR are not limited to the UK. The report notes that in 2019 California introduced a three-year ban on the use of LFR by law enforcement agencies. In the EU, a non-binding resolution banning the use of LFR in public by police was approved by the European Parliament in October 2021. While the report made clear that it was not suggesting a permanent ban, it concluded that LFR cannot currently be deployed in a rights-compatible way.

In addition, the report recommends that any forthcoming legislation provide a new "technology-neutral" legal framework for the use of biometrics by public as well as private bodies, introduce codes of conduct setting out specific and detailed duties for the use of LFR by police forces, and establish a national biometrics ethics board, which would have an advisory role for the public sector.

Takeaway: The report highlights some of the key privacy concerns around the use of biometric data, particularly in the context of LFR. However, the concerns raised with respect to LFR apply to the use of a range of biometric data. Businesses that make use of biometric data should keep a close eye on developments here, considering potential legislation as well as reputational factors, as the use of biometrics continues to generate headlines.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Dechert LLP | Attorney Advertising
