Does Accidental Listening by Smart Speakers Raise Compliance Concerns?

Foley Hoag LLP - Privacy & Data Security

That sixth sense you have that someone is listening – could it be your smart speaker? There’s a chance the answer is yes, even when you don’t ask it to listen. A new study from Northeastern University finds that smart speakers often accidentally activate and record conversations; just how often (sometimes as often as 19 times a day) and for how long (sometimes recording for 43 seconds) depends on the device. Notably, the study also found that the devices are not constantly recording conversations, and that accidental activation and recording could be triggered by dialogue in television shows.

Setting aside whatever concerns you might have as an individual about the information being captured and recorded by your home’s smart speaker (and to be clear, those concerns are important), the accidental capture of potentially sensitive information raises a set of compliance issues for smart speaker designers. For example, under the California Consumer Privacy Act (CCPA), a business that collects personal information from California residents (assuming other criteria are met) must disclose what information it is collecting “at or before the point of collection.” Do the terms of use and privacy policies for these smart speakers reflect such accidental capture? The EU’s General Data Protection Regulation (GDPR) requires “Data Protection by Design and by Default,” and specifically provides that data controllers (those who determine how personal data is to be processed) must

both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures . . . which are designed to implement data-protection principles . . . in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

Does a smart speaker repeatedly turning on and accidentally recording information meet the requirements of “data protection by design”? Does it matter what the data is being used for? Even if the data is being used to improve the speaker and reduce accidental capture, the fact that the user has neither consented to nor been made aware of the capture could run afoul of the GDPR’s requirements.

Even if laws like the CCPA and GDPR are not directly implicated, what about the privacy policies themselves? Do they put the user on notice that there is a risk of accidental capture? If not, that could raise consumer protection issues in the United States, implicating Federal Trade Commission or state attorney general enforcement.

In short, the accidental capture of personal data by smart speakers could raise a variety of compliance concerns, some of which might be resolved through terms and policies, while others might be resolvable only through design fixes. For now, though, you might want to move your speaker away from your television set.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Foley Hoag LLP - Security, Privacy and the Law | Attorney Advertising
