New CCPA Risk Assessment and Automated Decision-making Technology Regulations: Maybe Not Quite as Bad as They Look?

Wyrick Robbins Yates & Ponton LLP
As we recently discussed, CalPrivacy (the cool new name the California Privacy Protection Agency recently gave itself) adopted new CCPA regulations earlier this year. In September, CalPrivacy announced that the California Office of Administrative Law approved the regulations, thus clearing the last hurdle before those regulations can formally take effect on January 1, 2026.

As a follow-up to our prior post, which focused on the cybersecurity audit requirements of those regulations, here we discuss the new regulations’ requirements for risk assessments and automated decision-making technologies (“ADMT”) and when they might apply to businesses.

What types of processing will trigger the requirement to conduct a risk assessment?

Businesses will be required to conduct a risk assessment before engaging in personal information processing that “presents significant risk to consumers’ privacy.”

The regulations offer the following non-exhaustive list of examples of processing that pose a “significant risk” to privacy:

  1. “Selling” or “sharing” personal information, as those terms are defined by the CCPA.
  2. Processing “sensitive personal information,” as defined by the CCPA, outside of the employment and human resources contexts.
  3. Using an ADMT to make a significant decision about a consumer.
  4. Using automated processing to infer or extrapolate certain characteristics of consumers based on systematic observation of those consumers when acting as an educational program applicant, job applicant, student, employee, or independent contractor for the business.
  5. Using automated processing to infer or extrapolate certain characteristics of consumers based on their presence in a sensitive location, such as a healthcare facility or domestic violence shelter.
  6. Processing personal information that the business intends to use to:
    • Train an ADMT for a significant decision concerning a consumer; or
    • Train a facial-recognition, emotion-recognition, or other technology that verifies a consumer’s identity, or conducts physical or biological identification or profiling of a consumer.

Can I just rely on my existing data protection assessments for other state laws?

Probably not. Businesses should not assume that existing data protection assessment processes built for other state laws will satisfy the new CCPA regulations’ risk assessment requirements. Like other states’ laws, the new regulations require assessments for activities like selling, targeted advertising, sensitive data processing, and certain uses of automated processing technologies. The new CCPA regulations, however, introduce unique risk assessment triggers for AI and other technology training activities, and for specific contexts, such as presence in a sensitive location and systematic observation in the employment and educational contexts. The CCPA regulations also contain unique requirements for stakeholder involvement and for the topics an assessment must cover, which will merit additional consideration.

Additionally, for assessments conducted from 2026 onward, businesses will be required to submit to CalPrivacy an attestation that the assessment was performed. All assessments conducted in 2026 and 2027 must be submitted by April 1, 2028. Businesses may also be required to produce “risk assessment reports” to CalPrivacy or the California Attorney General within 30 calendar days of either regulator’s request.

Those requirements could significantly increase the risk of enforcement for businesses. For example, CalPrivacy could target businesses with privacy policies stating they engage in activities that require a risk assessment under the new regulations (like “sharing” personal information for cross-context behavioral advertising), but who have not submitted a risk assessment attestation to CalPrivacy. And even when a business has submitted an attestation, CalPrivacy or the Attorney General may still seek to review risk assessment reports relating to processing activities or industries the regulators consider problematic.

What is an ADMT? How do I tell if my business is using one?

According to the new regulations, an ADMT is any technology that (1) processes personal information and (2) uses computation to replace, or substantially replace, human decisionmaking.

“Replacing human decisionmaking” is not defined, but “substantially replace human decisionmaking” is defined as the business using the technology’s output without “human involvement.” “Human involvement” is defined as requiring a human reviewer to (1) know how to interpret and use the technology’s output to make a decision, (2) review and analyze the technology’s output and any other relevant information to make or change a decision, and (3) have the authority to make or change the decision based on their analysis.

The definition also carves out certain activities, such as web hosting, spellchecking, calculators, and spreadsheets, but only if those activities “do not replace human decisionmaking.”

The definition raises several interpretative questions, including:

  • What distinguishes “substantially replace human decisionmaking” (which is defined as decisions without human involvement) from “replacing human decisionmaking” (which also indicates making decisions without human involvement)?
  • Does the word “replace” in the definition mean an ADMT is limited to scenarios where technology replaces decisionmaking previously made by humans, but not decisionmaking that was automated from inception?
  • The exception for activities like web hosting or spellchecking is limited to situations where those activities don’t replace human decisionmaking. However, an activity that does not replace human decisionmaking would not be an ADMT to begin with under the primary definition. So why the exception?

Oof, that is confusing. But in the absence of clarifying guidance, what should I focus on to determine whether my business is using an ADMT?

We suggest organizations worried about whether they’re using an ADMT focus on technology that (1) operates without any human review or input and (2) makes “significant decisions” as defined by the regulations, since that combination triggers the key ADMT risk assessment, notice, and opt-out requirements.

Under the new regulations, a “significant decision” is one that results in the provision or denial of any of the following:

  • Financial or lending services.
  • Housing.
  • Education enrollment or opportunities.
  • Employment or independent contracting or compensation.
  • Healthcare services.

Somewhat helpfully (especially for businesses involved in targeted advertising), the regulations also expressly exclude “advertising to a consumer” from the scope of significant decisions.

Is there a reason ADMT requirements may not be quite as burdensome as they initially seem?

Yes, several CCPA statutory exceptions mean the CCPA would not apply in various contexts in which a business might make a significant decision. For example:

  • The statute does not apply to personal information subject to the Gramm-Leach-Bliley Act (“GLBA”), which means the CCPA and ADMT regulations would not apply to “financial or lending services” decisions that are based on personal information subject to GLBA.
  • The statute excludes an “activity involving the collection, maintenance, disclosure, sale, communication, or use of any personal information bearing on a consumer’s creditworthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living by a consumer reporting agency” that is subject to the Fair Credit Reporting Act (“FCRA”) and conducted as authorized by the FCRA. At a minimum, the FCRA would apply to consumer reports and evaluation in the context of decisions to provide lending services, housing, and employment.
  • The CCPA statute does not apply to covered entities under HIPAA, providers of health care under the California Confidentiality of Medical Information Act (“CMIA”), protected health information under HIPAA, and medical information under the CMIA. Those exceptions remove many entities and decisions regarding “healthcare services” from the scope of “significant decisions” subject to the ADMT regulations.

Businesses should also consider incorporating human review into technology-assisted decision processes to avoid triggering the criteria that an ADMT replace or substantially replace human decisionmaking.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Wyrick Robbins Yates & Ponton LLP
