Virtual reality: top data protection issues to consider

Dentons

[co-author: Claudio Orlando Miele]

Why does virtual reality matter for data protection?

Virtual reality (VR) can be defined as a fully immersive, computer-generated artificial environment experienced through a user interface. VR encompasses computer technologies and advanced systems in which the user interacts with the virtual world via sensors, for example through headsets or gloves containing those sensors. While many businesses are embracing virtual reality technologies for training and educational purposes, VR products are also steadily gaining popularity in the gaming industry (for further information about VR’s economic impact and major legal challenges, please read here).

Advances in VR technologies raise significant issues from a data protection perspective, as advanced VR systems can learn how we move around physically by analysing our movements (through VR sensors), and potentially even our brain waves. Even though this technology has not yet been fully deployed in the mass market, significant privacy concerns have already been raised.

What are the main privacy issues related to virtual reality?

VR involves the collection and processing of more – and more intimate – personal data than other “traditional” technologies. This gives rise to a number of considerations under the EU General Data Protection Regulation (Regulation (EU) 2016/679 – “GDPR”) and local data protection regulations.

  1. Processing of biometric data: VR technologies collect body-tracking data – part of our deep-seated identity data – by means of eye-tracking systems, facial recognition systems and advanced sensors (e.g. fingerprints, voiceprints, hand and face geometry, electrical muscle activity, heart rate, skin response, eye movement detection, head position, etc.) in order to provide an immersive and comfortable experience for users. Such data qualify as biometric data, i.e., under art. 4(14) of the GDPR, “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”. According to art. 9 of the GDPR, their processing requires special attention, as they are a special category of personal data. In particular, the GDPR provides that the processing of biometric data (for the purpose of uniquely identifying a natural person) − except for some limited purposes, such as employment and social security law, vital and substantial public interests, purposes of preventive or occupational medicine, etc. − shall be prohibited unless the data subject has given explicit consent to the processing. In this regard, the Italian Data Protection Authority’s General Application Order Concerning Biometric Data (available here) reiterated that: (i) the processing of biometric data requires the provision of an information notice; (ii) the processing requires the data subject’s consent; (iii) biometric data must be protected by adequate security measures (e.g. encryption); (iv) access to databases containing biometric data must be tracked; and (v) data must be retained only as long as necessary for the processing purpose.
  2. (Non-)free consent: Consent is generally considered a valid legal basis for the processing of biometric data. To ensure that a valid consent is granted, it is necessary to assess whether it is freely given. In this respect, consent could be deemed not freely given where no valid alternative to the processing of biometric data is provided. It is also noteworthy that a data subject may have a different perception of privacy in a VR context (so-called “virtual privacy”) than in a non-VR context, and that such a “lower” privacy perception could lead to reliance on less sustainable consent propositions. Indeed, consent might not be considered freely given if the provision of a VR service (i) is strictly bound to the processing of biometric data (without any valid alternative for the data subjects); and (ii) is conditional on consent to the processing of personal data that is not necessary for the performance of that service (see Article 7(4) GDPR). Most of the abovementioned biometric data certainly appears necessary for enabling the use and availability of VR services, but there remains a question whether data subjects have a “real choice” to refuse the processing and whether it is possible to draw the line between necessary and unnecessary data.
  3. Privacy by default / design and DPIA: Another issue to consider is the introduction, under article 25 of the GDPR, of the principles of privacy by default and by design, which require products / services to be designed and developed so as to protect users’ personal data by default. In particular, given “the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing” (GDPR, article 25(1)) of biometric data on a large scale due to the use of VR, VR providers shall:
    - both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organizational measures, such as pseudonymisation and/or data minimization, to meet the GDPR’s requirements and protect the rights of data subjects, and to ensure that only personal data that are necessary for each specific purpose of the processing are processed;
    - carry out, prior to the processing, an assessment of the impact of the envisaged processing operations on the protection of personal data, under article 35 of GDPR (Data protection impact assessment – “DPIA”).
  4. Data security: Given the context, the nature and the purposes of the processing of users’ personal data (and the amount of personal data processed), VR providers should minimize any potential data / information exposure. Some VR providers do not adopt certain data security measures, such as encryption or pseudonymisation (which are standard practice in more traditional digital communication means such as instant messaging apps). Furthermore, certain VR systems also rely on third-party services or apps which do not appear to implement suitable security standards. It is therefore essential for VR providers to implement adequate policies and security measures (e.g. physical security of data, physical security of facilities/personnel, network security, system hardening, password security, endpoint protection, patch management, remote access, etc.) to satisfy legal requirements. A number of commentators are currently pushing for the identification, by governments (or even by VR providers’ self-regulation bodies), of specific VR minimum security standards (e.g. SANS, NIST, ISO, CIS). Such standards would help VR operators provide more secure products and services, thus fostering a wider deployment of VR solutions. Moreover, VR providers need to ensure restoration of systems to ordinary operation as soon as possible; adequate business-continuity and disaster-recovery plans should accordingly be in place, also to address incidents and security breaches.
  5. Children’s personal data: Since VR technologies are addressed (also) to young gamers, it is necessary for VR providers to implement adequate procedures and security measures to protect children’s personal data (minors still constitute a wide portion of – increasingly sophisticated and tech-savvy – gamers). Indeed, children need particular protection when data controllers collect and process their personal data, because they are typically less aware of the risks involved in the processing of their data. VR providers should protect minors from the very beginning of their use of VR technologies, designing and implementing such technologies in compliance with the strict privacy by design and by default principles. The main issue is that, from a practical standpoint, it is often difficult to ascertain whether a VR user is a child and, for instance, whether valid parental consent has been given. VR providers should accordingly review on a regular basis the steps they are taking to protect children’s personal data, and consider whether they can implement more effective verification mechanisms (rather than relying upon simple consent mechanisms).
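By way of illustration, the pseudonymisation and data minimization measures discussed under article 25 of the GDPR can be sketched in code. The following Python fragment is a minimal, hypothetical example – the field names, key handling and stated purpose are assumptions, not an actual VR provider’s implementation – showing how a user identifier might be replaced with a keyed pseudonym, and how VR telemetry might be stripped down to only the fields needed for a specific purpose:

```python
import hmac
import hashlib

# Hypothetical illustration only. In practice, the secret key would be held
# in a key-management system, separate from the pseudonymised records, so
# that the stored data alone cannot be re-linked to an individual.
SECRET_KEY = b"replace-with-a-key-from-a-secure-vault"  # assumption: managed out of band

def pseudonymise(user_id: str) -> str:
    """Return a keyed, non-reversible pseudonym for a user identifier."""
    return hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

# Data minimisation: keep only the fields needed for the specific purpose
# (here, hypothetically, calibrating headset comfort settings).
ALLOWED_FIELDS = {"head_position", "eye_movement"}

def minimise(raw_record: dict) -> dict:
    """Drop every field that is not necessary for the stated purpose."""
    return {k: v for k, v in raw_record.items() if k in ALLOWED_FIELDS}

raw = {
    "user_id": "player-42",
    "head_position": (0.1, 1.6, 0.3),
    "eye_movement": "saccade",
    "heart_rate": 72,  # not needed for this purpose, so it is discarded
}
record = {"subject": pseudonymise(raw["user_id"]), **minimise(raw)}
```

The stored record then contains a pseudonym instead of the raw identifier and no fields beyond those the purpose requires; whether a given field is “necessary” remains, of course, a legal assessment rather than a coding decision.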

Final remarks

VR technologies offer extraordinary business opportunities, but their implementation will require addressing a number of legal challenges at a very early stage (for instance, please read here for further information about VR and trademark/patent law). Ensuring adequate data protection for all users will no doubt remain one of the main challenges, and in this respect self-regulation codes will not be able to address all concerns. As is happening with AI applications (see also here), any development and distribution of VR technologies will require a considered and proactive approach by all those involved in the VR value chain.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Dentons | Attorney Advertising
