Articles in this issue
- FTC Finalizes Settlement with 1Health.io For Allegations It Failed to Protect Customers’ DNA Data
- FTC Fines Background Check Providers $5.8 Million For Alleged Violations of the FTC Act and FCRA
- UK Data Regulator Publishes “Toolkit” For Sharing Data With Law Enforcement Agencies
- California Legislature Passes the “Delete Act”
FTC Finalizes Settlement with 1Health.io For Allegations It Failed to Protect Customers’ DNA Data
In its complaint, the FTC alleged that 1Health.io made false claims about “the privacy and security of the sensitive health and genetic information it collects and maintains.” Under the finalized settlement, 1Health.io must pay a fine of $75,000, which the FTC intends to use for consumer refunds. The company also must implement remedial measures including, but not limited to, telling third-party contract laboratories to destroy all consumer DNA samples that had been retained for more than 180 days after the laboratory’s analysis, and establishing and implementing a comprehensive security program for the collection, maintenance, and use of personal information. 1Health.io also agreed to have the information security program assessed by an independent third party within 180 days, and every two years thereafter for the next 20 years. (For a more detailed description of the FTC’s allegations, please see our article in Issue 36 of Cyber Bits, “FTC Brings Privacy and Security Allegations Against Genetic Testing Firm 1health.io.”)
Takeaway: The FTC continues to closely scrutinize promises regarding the sharing and security of personal information, including genetic and biometric information. Companies will want to review their privacy policies to ensure that the policies accurately reflect their actions.
FTC Fines Background Check Providers $5.8 Million For Alleged Violations of the FTC Act and FCRA
On September 11, 2023, the FTC announced a proposed settlement of $5.8 million with affiliated companies Instant Checkmate, TruthFinder, The Control Group Media Company, Intelicare Direct, and PubRec (the “Companies”) for alleged violations of Section 5 of the FTC Act and the Fair Credit Reporting Act (the “FCRA”). If approved, the settlement would require the Companies to establish and implement a comprehensive monitoring program to determine the extent to which the Companies are operating as consumer reporting agencies (“CRAs”) and ensure compliance with the FCRA. The settlement also would prohibit the Companies from misrepresenting the accuracy of their reports or making any of the misrepresentations alleged in the government’s Complaint.
In its complaint, also dated September 11, the FTC alleged that the Companies had been operating as CRAs while failing to meet requirements outlined in the FCRA. In support of its position, the FTC claimed that, while not directly advertising their products for use in employment or tenant screening, the Companies purchased Microsoft and Google Ads keywords that “implicate employee or tenant screening.” According to the complaint, the use of these keywords in internet searches such as “nanny background check” or “tenant background check” produced advertisements for Instant Checkmate and TruthFinder. The FTC further alleged that the Companies should have been on notice that they were operating as CRAs subject to the FCRA, because (1) Instant Checkmate had previously entered into a consent agreement with the FTC in 2014 to settle allegations that it did not comply with the FCRA, and (2) consumers had allegedly directly contacted the Companies about use of the Companies’ background reports for employment or tenant screening. The FTC then turned to whether the Companies had violated the FCRA and alleged that the Companies violated the Act by failing to: (i) follow reasonable procedures to assure maximum accuracy of consumer report information; (ii) provide FCRA-mandated user notices, outlining important consumer protections; and (iii) conduct reasonable investigations of consumer disputes.
The FTC also alleged that the Companies violated Section 5 of the FTC Act by advertising their reports as containing “the most accurate information available to the public” without verifying whether the information was accurate or current. The Companies also allegedly displayed advertisements that falsely claimed their product would identify individuals with criminal or arrest records, though customers received reports that contained no such records. Lastly, the complaint alleged that, despite representing that consumers could provide feedback on the reports through the use of “Remove” and “Flag as Inaccurate” buttons, the buttons neither removed information from background reports nor corrected it.
Takeaway: In its press release, the FTC noted that “if you use keywords to promote your product for uses covered by the Fair Credit Reporting Act, it’s time to…pay attention to FCRA’s consumer protections.” The large fine undoubtedly is a result of the fact that this conduct followed the 2014 settlement by Instant Checkmate. Companies will want to review the keywords they are using to advertise products, not just the direct marketing and advertising they are doing, lest they run afoul of Section 5 of the FTC Act.
UK Data Regulator Publishes “Toolkit” For Sharing Data With Law Enforcement Agencies
The UK Information Commissioner’s Office (the “ICO”) has published an online ‘toolkit’ for organizations responding to a request for personal data from a law enforcement authority.
The toolkit leads users through a series of questions and accompanying guidance to help businesses determine whether they have enough information from the law enforcement authority to make an informed decision about the data sharing. The guidance indicates that organizations need to know exactly what personal data the law enforcement authority is asking for and the reasons why it is needed. Significantly, organizations are required to evaluate for themselves whether they are satisfied that providing the relevant data is necessary for the relevant law enforcement purposes. The data sharing does not need to be absolutely essential but must be more than just useful or ‘standard practice’.
The second part of the toolkit asks further questions targeted at the overarching principles under the UK GDPR and prompts users to identify the applicable legal grounds on which data can be shared. In many cases, organizations can rely on the “crime and taxation” exemption, which excludes certain rights of data subjects where those rights might otherwise prejudice the prevention or detection of crime or the apprehension or prosecution of offenders (for example, by alerting the data subject that they are under investigation). However, the “crime and taxation” exemption does not exclude the requirement to identify a “lawful basis” for the sharing of data or additional requirements for certain categories of data deemed to be particularly sensitive (such as health data and criminal conviction data).
Based on the user’s responses, the toolkit produces a template report for the organization to complete, which prompts further consideration of compliance issues and is designed to assist with organizations’ record-keeping obligations.
Takeaway: Law enforcement bodies have a position of authority that can put pressure on organizations that receive a request for information. However, businesses must be mindful of their obligations under data protection laws and, in the UK, will often need to seek further information from the law enforcement authority. The ICO’s toolkit is aimed primarily at small organizations; more detailed and nuanced guidance is available in the ICO’s data sharing code of practice. However, the toolkit is also valuable for organizations with sophisticated privacy programs to sense-check their analysis and to align with the ICO’s expectations.
California Legislature Passes the “Delete Act”
Senate Bill 362 – the “Delete Act” – was passed by the California State Legislature on September 14, 2023. If approved by Governor Newsom, the Delete Act would be first-of-its-kind legislation, establishing a single-step process for Californians to order hundreds of data brokers registered with the state to delete their personal data.
While the California Consumer Privacy Act (“CCPA”) currently gives California residents the right to request that data brokers delete their personal information, it can be time-consuming and difficult for such individuals to identify and make separate deletion requests of every data broker that holds their information. The Delete Act would direct the California Privacy Protection Agency (the “Agency”) to establish “an accessible deletion mechanism that . . . allows a consumer, through a single verifiable consumer request, to request that every data broker . . . delete any personal information related to that consumer held by the data broker or associated service provider or contractor.” Additionally, while California law mandates that data brokers register with the California Attorney General, the Delete Act would require that brokers also disclose the types of personal information they collect.
The Delete Act would grant exclusive enforcement authority to the Agency and would not provide a private right of action. Failure to comply with the Delete Act and honor consumer deletion requests would result in fines of up to $200 per day of non-compliance. As with the CCPA, the Delete Act would not apply to personal information covered by various federal statutes, such as the Gramm-Leach-Bliley Act and the Health Insurance Portability and Accountability Act.
Takeaway: Despite strong opposition, the Delete Act progressed swiftly through the California State Legislature and will likely become law. The legislation’s passage underscores growing regulatory sensitivity to concerns regarding consumer data privacy, making it increasingly important for businesses to establish and maintain effective procedures for responding to requests from individuals, including deletion requests.