Right To Know - November 2023, Vol. 11

Clark Hill PLC

Cyber, Privacy, and Technology Report

Welcome to your monthly rundown of all things cyber, privacy, and technology, where we highlight all the happenings you may have missed.

State Actions:  

  • NYDFS Adopts Amendments to Cybersecurity Regulation on November 1, 2023: The New York Department of Financial Services (NYDFS) adopted amendments to its Cybersecurity Regulation on November 1, 2023. The final amendments are available here: NYDFS Cybersecurity Regulation – Final Amendments. NYDFS has also published implementation timeline information on its Cybersecurity Resource Center. Finally, NYDFS is providing training on the amendments through a series of webinars offered in November and December.
  • California Approves Assembly Bill 254 to Protect Medical Information Collected Through Reproductive and Sexual Health Applications: On September 27, 2023, California Governor Gavin Newsom approved Assembly Bill 254 (the “Bill”) to ensure that patient information collected through reproductive or sexual health applications enjoys protections under the Confidentiality of Medical Information Act (the “CMIA”). The Bill is set to expand the CMIA’s scope by revising its definition of “medical information” to pull in “reproductive or sexual health application information,” which will include “information about a consumer’s reproductive health, menstrual cycle, fertility, pregnancy, pregnancy outcome, plans to conceive, or type of sexual activity collected by a reproductive or sexual health digital service, including, but not limited to, information from which one can infer someone’s pregnancy status, menstrual cycle, fertility, hormone levels, birth control use, sexual activity, or gender identity.” The Bill’s definition of “reproductive or sexual health application information” is broad enough to capture applications that provide general healthcare services but happen to overlap with the reproductive and sexual health spaces.
  • California’s Two New Privacy Assembly Bills AB 1194 and AB 947 Amend the California Consumer Privacy Act of 2018 and the California Privacy Rights Act: On October 8, 2023, California Governor Gavin Newsom signed two bills: AB 947, amending the California Consumer Privacy Act of 2018, and AB 1194, amending the California Privacy Rights Act (CPRA) of 2020. AB 947, which becomes effective January 1, 2024, amends the definition of “sensitive personal information” to include any personal information that reveals a consumer’s citizenship or immigration status. AB 1194, effective July 1, 2024, will ensure that when a consumer’s personal information relates to “accessing, procuring, or searching for services regarding contraception, pregnancy care, and perinatal care, including, but not limited to, abortion services,” businesses are obligated to comply with the CPRA, except where the information is in aggregated, deidentified form and is not sold or shared.
  • Florida Bar to Publish Advisory Opinion on Use of Artificial Intelligence: The Florida Bar is currently seeking comments from lawyers to formulate an ethics advisory opinion on a lawyer’s use of artificial intelligence tools in connection with providing legal services. Specifically, the Bar is looking for comment on (a) whether consent should be required, (b) a lawyer’s supervisory responsibility over artificial intelligence tools, (c) ethical limits on fees when artificial intelligence tools are used, (d) allowable advertising regarding the quality of a lawyer’s private/in-house artificial intelligence tools, and (e) a lawyer’s ability to counsel clients to create and rely on due diligence reports generated solely by artificial intelligence tools. Florida is one of several states considering such regulation.

Regulatory:

  • White House Issues EO on AI: On October 30, 2023, President Joe Biden issued a “sweeping” executive order calling for the “safe, secure, and trustworthy development and use of artificial intelligence.” According to the EO on AI Fact Sheet, the EO establishes new standards for AI safety and security that protect the privacy of Americans, particularly children, and seeks to advance equity and civil rights, promote innovation and competition, and advance American leadership around the world, among other goals. In the EO, President Biden also called on Congress to pass various bipartisan proposals that protect Americans’, particularly children’s, data privacy. The EO has eight main principles:
    • Artificial Intelligence must be safe and secure.
    • Promoting responsible innovation, competition, and collaboration will allow the United States to lead in AI and unlock the technology’s potential to solve some of society’s most difficult challenges.
    • The responsible development and use of AI require a commitment to supporting American workers.
    • Artificial Intelligence policies must be consistent with the Administration’s dedication to advancing equity and civil rights.
    • The interests of Americans who increasingly use, interact with, or purchase AI and AI-enabled products in their daily lives must be protected.
    • Americans’ privacy and civil liberties must be protected as AI continues advancing.
    • It is important to manage the risks from the Federal Government’s own use of AI and increase its internal capacity to regulate, govern, and support responsible use of AI to deliver better results for Americans.
    • The Federal Government should lead the way to global societal, economic, and technological progress, as the United States has in previous eras of disruptive innovation and change.
  • NIST AI Safety Institute Consortium: The Biden Administration has already taken steps to implement certain items in the EO. NIST has issued a call for Letters of Interest from private and public sector stakeholders to join an Artificial Intelligence Safety Institute Consortium. The Consortium aims to create a lasting joint research and development effort that will inform NIST’s AI standards. Consortium members will be expected to contribute technical expertise in one or more areas, including but not limited to AI safety, red-teaming, explainability, and workforce skills.
  • Consumer Financial Protection Bureau (CFPB) Rulemaking on Consumer Data: On October 18, 2023, the Consumer Financial Protection Bureau (CFPB) released a notice of proposed rulemaking (NPRM) restricting how financial institutions handle consumer data. The proposed “Personal Financial Data Rights” rule would give consumers the right to control their data, including the ability to switch providers more easily and to manage accounts from multiple providers more conveniently. The CFPB anticipates that the rule will accelerate a shift toward open banking, in which consumers would control data about their financial lives and would gain new protections against companies misusing their data. Comments on the proposed rule are due by December 29, 2023.
  • FTC Finalizes Amendment to Safeguards Rule: On October 27, 2023, the Federal Trade Commission (FTC), by a 3-0 Commission vote, finalized an amendment to the Safeguards Rule that requires non-banking financial institutions to report data breaches and other security-related events to the agency. The Safeguards Rule requires non-banking financial institutions to develop, implement, and maintain a comprehensive security program to keep their customers’ information safe. The amendment requires these institutions to notify the FTC as soon as possible, and no later than 30 days after discovery, of a security breach involving the information of at least 500 consumers. The notice must include certain information, including the number of consumers affected or potentially affected. The rule takes effect 180 days after its publication in the Federal Register.
  • The FDA Announces the Creation of the Digital Health Advisory Committee To Advise on Issues Related to Digital Health Technologies: On October 11, 2023, the U.S. Food and Drug Administration (FDA) announced the creation of a new Digital Health Advisory Committee to provide guidance to the FDA on issues relating to digital health technologies (“DHTs”), including artificial intelligence, wearables, virtual reality, and remote patient monitoring. The Digital Health Advisory Committee aims to begin operating in 2024 and will consist of individuals with technical and scientific expertise from diverse disciplines and backgrounds. They will advise the FDA on issues related to DHTs, providing relevant expertise and perspective to help improve the FDA’s understanding of the benefits, risks, and clinical outcomes associated with use of DHTs.

Litigation & Enforcement: 

  • U.S. Supreme Court Takes Up Challenges to State Social Media Laws: The United States Supreme Court granted certiorari in a pair of cases addressing state regulation of social media. The cases arise out of legislation enacted by Texas and Florida that seeks to, among other things, limit social media platforms’ ability to moderate content and require the platforms to explain moderation decisions. Specifically, the Court will address two questions: (1) whether the laws’ content-moderation restrictions comply with the First Amendment; and (2) whether the laws’ individualized-explanation requirements comply with the First Amendment. The cases will likely be argued in tandem in early 2024.
  • SEC Charges SolarWinds and Its CISO With Fraud and Internal Controls Failures: The SEC has charged SolarWinds, a Texas-based software company, and its Chief Information Security Officer, Timothy G. Brown, with fraud and internal controls failures relating to allegedly known cybersecurity risks and vulnerabilities. The SEC alleges that between October 2018 and December 2020, SolarWinds misled investors by overstating its cybersecurity practices and failing to disclose specific known deficiencies in those practices. The SEC’s complaint alleges that SolarWinds and Brown violated the antifraud provisions of the Securities Act of 1933 and the Securities Exchange Act of 1934; that SolarWinds violated reporting and internal controls provisions of the Exchange Act; and that Brown aided and abetted the company’s violations.
  • Healthcare Clearinghouse Settles With A Coalition Of 33 State Attorneys General For $1.4M Over Data Breach: On October 17, the healthcare clearinghouse Inmediata reached a $1.4 million settlement with a coalition of 33 state attorneys general for a 2019 data breach that exposed the protected health information of approximately 1.5 million Americans for almost three years. As a health care clearinghouse, Inmediata facilitated transactions between health care providers and insurers. On January 15, 2019, the U.S. Department of Health and Human Services alerted Inmediata that personal health information maintained by the company was available through search engines, which appeared to be the result of a coding error by the company. As a result, it was alleged that sensitive patient information could be viewed through online searches, and potentially downloaded by anyone with access to an internet search engine. Under the settlement, in addition to monetary payment, the company agreed to overhaul its data security and breach notification practices.
  • HHS OCR Reaches Settlement with Covered Entity Over Data Breach Resulting from Ransomware Attack: The Department of Health and Human Services Office for Civil Rights (“OCR”), the agency charged with enforcing the Health Insurance Portability and Accountability Act (“HIPAA”), announced a settlement with Doctor’s Management Services (“DMS”), a Massachusetts medical management company that suffered a breach in 2017 as a result of a ransomware attack. The settlement marks the first ransomware-related agreement OCR has reached with a regulated entity. OCR settled the case for $100,000 under the terms of a resolution agreement that requires DMS to improve its cybersecurity program to address the risks associated with the attack, which affected approximately 206,695 individuals. In its press release, OCR noted that it has seen a 239% increase in large data breaches and a 278% increase in ransomware over the past four years. This may signal a new focus of OCR investigations as it works to combat this type of threat to the healthcare industry.

International Updates:

  • United States-United Kingdom Data Bridge Goes into Effect: The United States-United Kingdom Data Bridge went into effect on October 12, 2023, following the U.K.’s publication of its Data Protection (Adequacy) (United States of America) Regulations 2023. The U.S.-U.K. Data Bridge is an extension of the EU-U.S. Data Privacy Framework. Specifically, when personal data is transferred to U.S. entities that participate in the U.K. Extension to the EU-U.S. Data Privacy Framework and follow its principles, no additional approval is needed. The FTC and the U.S. Department of Transportation (DOT) have been designated as the independent supervisory authorities for the U.K. Extension to the EU-U.S. Data Privacy Framework.
  • United States Acknowledges Use of Artificial Intelligence by North Korea and Other Nation-States in the Cyber Kill Chain: Anne Neuberger, the Deputy National Security Advisor for Cyber and Emerging Technologies on the National Security Council, acknowledged during a press briefing that North Korea and other “nation-state criminal actors” are incorporating artificial intelligence into their writing of malicious code and their search for vulnerable networks. North Korea differs from other nation-states in that it leverages its cyber program as a key revenue source, which allows it to weather sanctions. South Korea estimates that last year alone, North Korea stole $700 million worth of cryptocurrency. The use of artificial intelligence could enable North Korea to develop and exploit methods for stealing funds from individuals and businesses more rapidly.

Industry Updates:

  • Small Businesses Continue to Be a Hot Target for Cyber Attacks: The Identity Theft Resource Center recently reported that 73% of U.S. small business owners experienced a cyber attack in 2022. The report compiled information from interviews with 551 small business owners and employees. Although small businesses are frequently targeted, and although 85% of respondents self-identified as “ready to respond to a cyber incident,” relatively few are adopting best practices to help prevent cyber attacks. The report found that only 20-34% of respondents had multi-factor authentication, mandatory strong passwords, and role-based limits on access to personal information in place in their organizations. The report also noted that while financial impacts from cyber attacks were down slightly from the previous year, more respondents reported other impacts, such as a loss of trust (32%) and higher employee turnover (32%).

This publication is intended for general informational purposes only and does not constitute legal advice or a solicitation to provide legal services. The information in this publication is not intended to create, and receipt of it does not constitute, a lawyer-client relationship. Readers should not act upon this information without seeking professional legal counsel. The views and opinions expressed herein represent those of the individual author only and are not necessarily the views of Clark Hill PLC. Although we attempt to ensure that postings on our website are complete, accurate, and up to date, we assume no responsibility for their completeness, accuracy, or timeliness.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Clark Hill PLC | Attorney Advertising
