Consumer-directed health apps are experiencing a boom thanks to COVID-19, as consumers seeking to avoid doctors’ office waiting rooms are increasingly relying on apps to measure and maintain their health. That trend is creating a wealth of new opportunities for app developers in the mobile health space.
But as a settlement announced on September 17 by the California Attorney General’s office shows, those opportunities can carry significant legal and regulatory risk.
In that settlement, the developers of Glow, a popular mobile app that women can use to track ovulation and fertility, agreed to pay a $250,000 civil penalty based on the company’s failure to protect its users’ personal and health information. The settlement also imposes a rigorous set of injunctive terms on Glow that include some novel requirements designed to address the unique impacts that online privacy and data security lapses can have on women.
The AG called the settlement a “wake up call for every app maker that handles sensitive private data.” All consumer health app developers should heed that call: as the Glow settlement shows, the AG’s office has a formidable arsenal of regulatory tools that it can deploy against developers that fail to protect users’ health data from unauthorized access or disclosure.
The Consumer Mobile Health App Regulatory Landscape: A Complex Patchwork
As any privacy lawyer worth their salt will tell you, the United States has a patchwork of privacy and data security laws and regulations that can be challenging even for well-resourced companies to understand and comply with.
For organizations subject to HIPAA, things can be simpler. Covered entities and their business associates get a single, well-known set of federal standards for privacy, security, and breach notification, and an exemption from many (though not all) overlapping state privacy and breach notification laws.
But as HHS’ Office for Civil Rights’ Guidance on Health App Use Scenarios and HIPAA explains, mobile apps that consumers download and use to input and manage their own health information without the involvement of a health care provider generally are not subject to HIPAA.
Without a controlling federal law, developers of these apps find themselves firmly within the patchwork. That’s especially true in California, where they can be subject to laws that include the CCPA, CalOPPA, and California’s customer records law, which requires businesses to implement and maintain “reasonable security procedures and practices” for medical information they own, license, or maintain.
California: Where App Developers Can Become Health Care Providers—No Medical Degree Required
Consumer health app developers can also be subject to California’s Confidentiality of Medical Information Act (CMIA). The CMIA supplements the protections provided by HIPAA and generally prohibits a “provider of health care” from disclosing an individual’s medical information without that individual’s authorization, except under certain specified circumstances. The term “provider of health care” is defined to include state-licensed physicians, health care practitioners, and health care facilities.
Thanks to a 2013 amendment to the CMIA, however, the definition also includes “[a]ny business that offers software or hardware to consumers, including a mobile application or other related device that is designed to maintain medical information . . . in order to make the information available to an individual or a provider of health care at the request of the individual or a provider of health care, for purposes of allowing the individual to manage his or her information, or for the diagnosis, treatment, or management of a medical condition of the individual.”
The CMIA includes a private right of action with statutory damages of $1,000, and an enforcement provision that allows the AG’s office to impose a civil penalty of up to $2,500 per violation, for the negligent release of an individual’s medical information.
The Glow Fertility App’s “Serious Basic Security Failures”
In its complaint against Glow, the AG’s office alleged the company violated several of these laws by failing to properly protect users’ health information.
Glow, alleged the complaint, “collects and stores deeply-sensitive personal and medical information related to a user’s menstruation, sexual activity, and fertility,” including medications, ovulation-cycle calculations, complete medical records, and intimate details of their sexual experiences and efforts to become pregnant.
The AG alleged that Glow failed to protect this information in two critical ways.
- First, the app included a “partner connect” feature that allowed two users to link to each other and share information. According to the complaint, that feature would “automatically grant linking requests without any authorization or confirmation from the user who was about to have their information shared,” and without verifying the legitimacy of the person with whom the information was to be shared.
- Second, the app suffered from a password change vulnerability that allowed users to change the password associated with an account without confirming the user’s old password, so that “new passwords were always accepted and anyone could change a user’s password, log in with that new password, and access the user’s data.”
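For developers, the second flaw maps to a simple server-side rule: a password-change endpoint should require proof of the existing credential before accepting a new one. The sketch below illustrates that check in Python; the `Account` class, hashing parameters, and method names are hypothetical examples, not Glow’s actual code.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256; the iteration count here is illustrative only
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

class Account:
    def __init__(self, password: str):
        self.salt = os.urandom(16)
        self.password_hash = hash_password(password, self.salt)

    def change_password(self, old_password: str, new_password: str) -> bool:
        """Accept a new password only if the caller proves the old one."""
        candidate = hash_password(old_password, self.salt)
        # Constant-time comparison; this is the verification step the
        # complaint alleged Glow's endpoint skipped entirely
        if not hmac.compare_digest(candidate, self.password_hash):
            return False
        self.salt = os.urandom(16)
        self.password_hash = hash_password(new_password, self.salt)
        return True
```

Under the vulnerability alleged in the complaint, the equivalent endpoint behaved as if the `compare_digest` check were absent, so any submitted password was accepted.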
Based on those failures, the complaint charged Glow with:

- Violating CalOPPA, which requires operators of online services to comply with the provisions of their posted privacy policies;
- Violating the customer records law by failing to implement and maintain reasonable security procedures; and
- Violating California’s False Advertising Law by making untrue or misleading statements about the design of the Glow app and the security measures used to protect consumers’ information.
To resolve those charges, Glow agreed to pay a $250,000 civil penalty and to a series of injunctive provisions that require it to remedy deficiencies in its privacy and data security program. To that end, Glow must:
- Implement, regularly review and revise, and comply with a documented information security program designed to protect the security, availability, and confidentiality of users’ information;
- Designate one or more individuals to oversee Glow’s adherence to applicable state and federal privacy laws, and ensure those individuals have the authority and autonomy to perform their responsibilities and to report “any significant privacy or security concerns” to the CEO or other executives;
- Obtain affirmative authorization from consumers before sharing personal information with any third party other than service providers unless required by law, or using personal information for any materially different purpose, and give consumers the right to revoke;
- Develop, implement and maintain a process to incorporate privacy-by-design principles and security-by-design principles when creating new applications or online services, and specifically consider how privacy or security lapses may impact online threats affecting women and online risks that women face;
- Provide employee training concerning awareness and prevention of online threats affecting women, including cyberstalking and online harassment, as well as privacy issues related to reproduction and reproductive rights; and
- Complete an annual privacy risk assessment that addresses Glow’s efforts to comply with applicable privacy laws, and considers online risks that women face as a result of privacy or security lapses while using Glow’s mobile apps or online services, and deliver a copy of the report of that assessment to the AG’s office.
Lessons for Consumer Health App Developers

The Glow case teaches several important lessons for developers of consumer-directed mobile health apps.
First, although the CCPA and its progeny, the California Privacy Rights Act, are currently the focus for most organizations that do business in California, the California AG has several other less famous, but no less dangerous, tools to regulate health app developers’ privacy and data security practices.
Second, while consumer-directed health apps fall outside HIPAA’s strict regulatory requirements, those apps can still be covered by CMIA and its stringent restrictions on disclosures of individuals’ medical information, and their developers can face serious consequences for failing to comply.
Third, the injunctive provisions in the settlement show that the California AG expects organizations to implement various accountability measures that appear in other prominent comprehensive data protection laws such as GDPR. Those measures include implementing privacy-by-design, oversight by an individual with authority and autonomy to report violations to executive management (which recalls the GDPR’s Data Protection Officer role), and conducting privacy risk assessments (not unlike the privacy impact assessments that GDPR requires for certain processing operations).
Finally, the settlement is especially notable for its focus on the unique impacts that online privacy and security lapses can have on women. That focus suggests that the AG, in addition to focusing its privacy and data security enforcement efforts on data collected from children, is also concerned about other groups that can face disparate consequences from app developers’ failure to protect user privacy.