The Washington Privacy Act Is Back

After the Washington Privacy Act (“WPA”) failed to pass in 2019, state legislators promised to renew their efforts in the 2020 legislative session. Lawmakers kept this promise last month, introducing three bills targeting an array of consumer privacy issues. The first bill, SB 6281, the Washington Privacy Act, introduced in the Senate on January 14, is a comprehensive privacy bill modeled after the European Union’s General Data Protection Regulation (“GDPR”) with aspects of the California Consumer Privacy Act (“CCPA”) sprinkled in. The second bill, HB 2485, introduced on January 15 in the House, would regulate the data collection and use practices of consumer genetic-testing companies. HB 2644, introduced a day later in the House, seeks to regulate the use of artificial intelligence-enabled profiling. While HB 2485 and HB 2644 target discrete privacy issues, SB 6281 attempts to set general guardrails for the permissible collection, use and disclosure of Washington residents’ personal data. Here, we delve into the details of the first and most comprehensive privacy bill introduced so far this legislative session: SB 6281.

By far the most ambitious of the three bills, the WPA attempts to establish a comprehensive set of standards for the collection, use and disclosure of consumer data in Washington state. The bill moved quickly through the Senate, passing that chamber on February 18 and demonstrating lawmakers’ motivation to enact robust privacy protections for the state. But the House committee considering the bill adopted several important amendments before passing it out of committee on February 28, leaving competing versions that may force the two chambers to reconcile critical issues around exemptions for smaller businesses and enforcement.

The Basics

The WPA borrows many of its core terms and definitions directly from the GDPR and CCPA, beginning with its jurisdictional scope. The bill applies to entities that “conduct business in Washington” or “produce products or services that are targeted to residents of Washington,” and that meet one of the following thresholds:

  1. Control or process personal data regarding 100,000 or more consumers during a calendar year; or
  2. Derive over 50% of gross revenue from the sale of personal data and process or control personal data of 25,000 or more consumers.

The House’s amended version lowers the revenue threshold from 50% to 25%, reflecting the House committee’s intent to exempt fewer small organizations from the law’s scope.

The first requirement loosely tracks the GDPR’s “establishment” and “targeting” prongs under Article 3, while the second mimics the CCPA’s processing and revenue thresholds, which aim to keep small businesses outside the law’s scope. The bill does not define “conduct[ing] business in Washington” or “produc[ing] products or services that are targeted to residents of Washington,” and those terms are likely to generate confusion and jurisdictional challenges. At a minimum, the legislature should refine the targeting prong to remove the passive voice and clarify who must be targeting Washington residents. This is one area where mirroring the GDPR’s language may not translate well to a U.S. state law.

The bill defines personal data broadly as “any information that is linked or reasonably linkable to an identified or identifiable natural person.” Under the draft legislation, “process” or “processing” is defined as “any operation or set of operations which are performed on personal data or on sets of personal data, whether or not by automated means, such as the collection, use, storage, disclosure, analysis, deletion, or modification of personal data.”

The bill does not apply to state and local governments, tribes, and municipal corporations; certain health information regulated by state and federal laws, including the Health Insurance Portability and Accountability Act; information regulated by the Gramm-Leach-Bliley Act, Fair Credit Reporting Act or Family Educational Rights and Privacy Act; and data maintained “for employment records purposes.” Additionally, the bill excludes deidentified and publicly available information from the definition of “personal data.” Critics of the bill have taken issue with the long list of exempt data, and the House is now considering whether to accept or modify these exceptions.

Obligations for Controllers and Processors

The WPA imports the GDPR’s concept of data “controllers” and “processors.” A “controller” is “the natural or legal person which, alone or jointly with others, determines the purposes and means of the processing of personal data.” A “processor” processes data on behalf of the controller.

The bill imposes most processing responsibilities on controllers, but it would require controllers to enter into agreements with processors setting forth the “processing instructions to which the processor is bound, including the nature and purpose of the processing, the type of personal data subject to the processing, the duration of the processing, and the obligations and rights of both parties.” If the parties fail to enter into such an agreement, if the processor fails to abide by the agreement’s terms, or if the processor begins to determine the “purposes and means of the processing,” the processor would become a controller subject to all controller obligations under the act. To demonstrate compliance with the controller’s instructions and the secure processing of the controller’s data, processors would be required either to engage a third-party auditor to audit the processor’s security controls or to submit to a security audit conducted by the controller.

Notice Requirements

Under this bill, Washington would join states like California, Nevada and Delaware in requiring certain content in consumer privacy notices. The bill would require companies to include the following in their privacy notices:

  1. The categories of personal data processed;
  2. The purposes for the processing;
  3. How and where consumers may exercise their rights under the WPA, including how to appeal a controller’s determinations with respect to action on a particular request;
  4. The categories of personal data shared with third parties;
  5. The categories of third parties with which personal data is shared; and
  6. The details of any data sales or processing for targeted advertising, including how consumers may opt out of such processing.

The bill requires controllers to process data only for purposes disclosed in their privacy notices. It also prohibits any secondary uses of personal data unless the secondary use is necessary to or compatible with the processing purposes specified in the controller’s privacy notice.

Consumer Rights

The bill includes data subject rights similar to the GDPR, including rights to access, correction, deletion and data portability, and the right to opt out of processing related to targeted advertising or certain profiling activities. Like the CCPA, the WPA would enable individuals to opt out of data sales.

Controllers must respond to consumer requests within 45 days, which may be extended by 45 additional days where necessary, “taking into account the complexity and number of the requests.” However, before granting these rights, controllers may require consumers to be “authenticated,” meaning the entity responding to the request may verify that the individual is entitled to exercise the right. Though not stated explicitly, this likely means that responding entities may verify the individual’s identity and Washington residence. Controllers may not require that consumers set up accounts to exercise their rights, but may require consumers with existing accounts to exercise their rights through such accounts. Consumers may exercise their rights up to twice annually.

Data Protection Assessments

The bill requires controllers to conduct data protection assessments when:

  1. Processing data for purposes of targeted advertising;
  2. Selling personal data;
  3. Processing personal data for purposes of profiling when the profiling presents a reasonably foreseeable risk of unfair or deceptive treatment of, or disparate impact on, consumers; financial, physical or reputational injury to consumers; “physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns of consumers, where such intrusion would be offensive to a reasonable person”; or where the processing would create a substantial injury to consumers;
  4. Processing sensitive data; and
  5. Processing that presents a heightened risk of harm to consumers.

Like data protection impact assessments under the GDPR, the assessments contemplated would be designed to evaluate the potential risk to consumers posed by the processing activity and whether the potential harm to consumers outweighs the potential benefits of the processing activity. Benefits to the controller, consumer, other stakeholders and the general public may be considered in evaluating the potential privacy risks and benefits. Controllers would be required to make data protection assessments available to the attorney general upon request.

Notably, the bill also imports the GDPR’s concept of “sensitive data,” defined as “(a) personal data revealing racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sexual orientation, or citizenship or immigration status; (b) the processing of genetic or biometric data for the purpose of uniquely identifying a natural person; (c) the personal data from a known child; or (d) specific geolocation data.” When conducting the assessments, data processing should be presumed permissible unless the processing involves sensitive data or the risk to consumers “cannot be reduced by appropriate administrative and technical safeguards.” Controllers may process sensitive consumer data only with the consent of the consumer (or parent, for consumers under 13).

Data Security

Like the CCPA and GDPR, the bill requires companies to establish “reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data.” The legislation would permit flexibility in determining the security controls to adopt, providing that the controls implemented should be appropriate for the nature and volume of the personal data processed.

Restrictions on Facial Recognition Technology

The WPA’s most notable deviation from both the GDPR and CCPA is its focus on facial recognition technology. The legislation reflects a concern about potential bias in the application of facial recognition technology and requires processors that provide facial recognition services to “make available an application programming interface or other technical capability, chosen by the processor, to enable controllers or third parties to conduct legitimate, independent, and reasonable tests of those facial recognition services for accuracy and unfair performance differences across distinct subpopulations.” Such subpopulations may be defined by “race, skin tone, ethnicity, gender, age, disability status, or other protected characteristic that is objectively determinable or self-identified by the individuals portrayed in the dataset” (emphasis added). If an individual or entity testing the facial recognition technology identifies “material unfair performance differences across subpopulations” and discloses the results to the processor, and the processor determines that the methodology and results are valid, “the processor must develop and implement a plan to address the identified performance differences.” Processors are also required to provide disclosures that advise of the capabilities and limitations of the service and enable testing of the technology, as described above.

Additionally, the bill would require providers of facial recognition technology to give “a conspicuous and contextually appropriate notice” where the technology is deployed in public. The notice must disclose:

  1. The purpose or purposes for which the facial recognition service is deployed; and
  2. Information regarding where consumers can obtain additional information about the facial recognition service including, but not limited to, any applicable online notice, terms or policy that provides information about where and how consumers can exercise any rights they have with respect to the facial recognition service.

Under the legislation, controllers would also be required to obtain consent from consumers before enrolling “an image of that consumer in a facial recognition service used in a physical premises open to the public.” However, a controller may enroll a consumer’s image without consent if it “holds a reasonable suspicion, based on a specific incident, that the consumer has engaged in criminal activity[.]” The legislation also sets forth various safeguards that would be required before a controller could use a facial recognition service for safety and security purposes without consent. Under the bill, controllers would be required to:

  1. Limit the use of the facial recognition service to safety or security purposes and maintain the facial recognition database separately from other databases;
  2. Conduct a biannual review of the facial recognition database to remove facial templates that are more than three years old or that belong to consumers about whom the controller no longer has a reasonable suspicion of having engaged in criminal activity; and
  3. Maintain a process by which the consumer can challenge decisions to enroll the image of the consumer in a facial recognition service for security or safety purposes.

Controllers would also be required to test the facial recognition software in operational circumstances and to train employees. Additionally, if a controller uses a facial recognition service to make decisions that produce legal or similarly significant effects on consumers, the controller must maintain a process for meaningful human review of such decisions.

Finally, the legislation would prohibit controllers from sharing images collected via a facial recognition service with law enforcement without the consent of the consumer unless the disclosure is made pursuant to a valid warrant, subpoena or court order; under emergency circumstances; or to the National Center for Missing & Exploited Children.

The facial recognition section has been widely criticized by the bill’s opponents, who argue that it does not go far enough to restrict the technology’s use and does not adequately address its potential for bias. Most critics proposed simply removing the facial recognition section from the bill so that lawmakers could address the technology in a separate law. The section emerged from the House committee largely untouched, but with one major change: the House’s version would no longer preempt local laws or regulations concerning facial recognition, leaving cities like Seattle free to enact more restrictive provisions.

Enforcement

Under the Senate’s version, the attorney general would have exclusive authority to enforce violations of the act. Although the House version does not create an explicit private right of action (or statutory damages), it makes two critical changes the two chambers must now resolve: First, the House’s version removed language giving the attorney general exclusive enforcement authority. Second, it makes violations of the law unfair or deceptive practices under Washington’s Consumer Protection Act (CPA). These changes—requested by the Washington State Attorney General’s Office—would give the attorney general additional enforcement tools (like civil investigative demand authority) and open the door to private enforcement actions under the CPA.

Conclusion

Organizations with an existing GDPR or CCPA compliance program will have a good head start on complying with the WPA should it become law, but the legislation as proposed contains unique elements distinct from both laws. Like both the GDPR and CCPA, the WPA would require controllers to enter into agreements with data processors setting forth the anticipated details of the processing activities and restricting the processor’s use of the data. The notice and data subject rights provisions also incorporate aspects of both the GDPR and CCPA. However, neither the CCPA nor the GDPR explicitly requires processors either to submit to audits conducted by the data controller or to engage a third-party auditor to ensure the security of data processing.

The law also reflects an expanded view of data protection that stretches beyond the fraud and identity theft focus of most U.S. data protection laws. The WPA would be the first state privacy law to incorporate the GDPR’s concept of “sensitive data,” placing heightened consent and assessment requirements on the processing of special data categories such as racial or ethnic origin and religious beliefs. This thread runs through the legislation’s facial recognition section as well. Just as the GDPR restricts automated decision-making, the WPA would restrict the use of facial recognition technology to make decisions producing legal or similarly significant effects on individuals. Like the GDPR, the WPA assumes that data controllers may use powerful data processing technologies to make decisions about consumers that have a significant impact on their lives. If passed in its current Senate or House form, the WPA would make Washington the first state in the country to directly tackle issues of bias and discrimination in technology.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© BakerHostetler | Attorney Advertising

Written by:

BakerHostetler