UK Children’s Privacy Protection Comes of Age

The Age Appropriate Design Code (“AADC”) - more commonly known as the Children’s Code - has been heralded as the world’s first code to protect children online. Compliance with the AADC became mandatory for in-scope businesses in the UK on 2 September 2021, so we have had just over a year to observe its effects. Over the same timeline, the new UK Information Commissioner has been establishing himself in the role. From the start, Information Commissioner John Edwards has been clear that he will focus the ICO’s resources on data protection issues that disproportionately affect vulnerable groups, such as children. His three-year plan (ICO25) specifically identifies the ICO’s ‘on-going support’ for children’s privacy.

Enforcement in the UK

The ICO has taken its first enforcement action directed at protecting children’s privacy. TikTok could face a fine of £27 million after the ICO found the company may have breached UK data protection law by failing to protect children’s privacy when using the platform. (The activities complained of took place between May 2018 and July 2020 and therefore preceded the AADC).  This follows an ICO investigation which found that TikTok may have:

  • processed the data of children under the age of 13 without appropriate parental consent,
  • failed to provide proper information to its users in a concise, transparent and easily understood way, and
  • processed special category data without legal grounds to do so.

The ICO’s announcement of a notice of intent means that this is the regulator’s provisional view only. TikTok now has the opportunity to make representations to the ICO, so any final fine may be lower (or withdrawn), although the present political climate in the UK around online protection of children is fairly febrile. The publication of a final notice is likely to be useful for other similarly affected businesses, by providing insight into the regulator’s assessment of the effectiveness of specific safety measures deployed and the consequent risks posed to the privacy of children. The ICO has also committed in ICO25 to a more transparent form of enforcement, potentially through the issue of new-style “binding rulings” so the ICO can “declare [its] position on a business practice or question of law in advance, rather than always coming along after the fact”.

The ICO confirmed that it is looking into how over 50 different online services are conforming with the Children’s Code and has six ongoing investigations into companies providing digital services that it believes have not “taken responsibilities around child safety seriously enough.”

The UK’s online child protection strategy: what the ICO will do next

More ICO enforcement activity relating to children’s privacy is expected this year, and it is likely to continue (and possibly ramp up) next year. In the ICO’s three-year plan (ICO25), the emphasis for year two (October 2023 – 2024) is on enforcement of the Children’s Code, with the ICO using its influence in industry to ensure children benefit from an age-appropriate online experience.

As part of its strategy, the ICO will push for further changes by social media platforms, video and music streaming sites and gaming platforms to correctly assess children’s ages and conform with the Children’s Code’s guidelines about profiling children and sharing their data. It is also lobbying for changes to industry practice around improved transparency and the use of privacy notices children can understand. Clear emphasis has been placed on the need to investigate companies that are not compliant with the Code and to take firm enforcement action.

Work on this issue will be handled by the ICO together with other members of the Digital Regulation Cooperation Forum (DRCF), and the ICO will also be tracking the progress of the Online Safety Bill. See BCLP’s alert on the bill here. The ICO is currently evaluating the impact of the Code and has invited public views now that it has been in place for a year. Responses to the brief online survey are due by Friday 11 November 2022, with a report expected to be published in due course. The ICO has announced that it will also consider whether changes to the Code are required by legislative reform on data protection and to promote closer policy alignment with the Online Safety Bill.

What is happening elsewhere?

It is not only the UK which is becoming more active in protecting children’s privacy; we are seeing other countries adopt similar codes, as well as increased levels of child-centred enforcement activity (for example, the recent €405m fine issued by the Irish Data Protection Commission (DPC) against Meta Ireland with respect to Instagram).

The Meta decision (involving other concerned EU supervisory authorities, with areas of disagreement resolved by the EDPB through the GDPR’s consistency mechanism) addressed how the requirement for ‘specific protection’ for children (recital 38 EU GDPR) should be interpreted and resulted in other platforms making real-time changes to child account settings.

The DPC was concerned that Meta’s notices / terms of use lacked transparency and did not adequately flag the consequences of choosing a business Instagram account. These concerns included the failure, at the registration stage, to explain to child users the difference between public and private accounts. It is reported that a child-related investigation against TikTok is also pending before the DPC.

California has also recently passed the California Age-Appropriate Design Code Act, which will require technology companies offering platforms popular with children to increase protections for users under 18 (such as default higher privacy settings, requirements not to collect location data from these users and requirements to provide clear and comprehensible privacy information). These businesses will also have to consider how their algorithms and products will affect child users. The Act will come into force from 1 July 2024 and will affect businesses offering online products and services which are ‘likely to be accessed by children’ under the age of 18. It also gives guidance as to the types of online services or products which will be within scope. The BCLP team has written about the Act and its scope in a set of FAQs.

What is on the horizon?

Businesses cannot afford to be complacent. In the UK, any business offering an online service likely to be accessed by UK children under 18 (even if not aimed at them) will need to ensure it complies with the Children’s Code standards and should also closely follow the progress of the Online Safety Bill. The EU’s data protection authorities are also championing significant fines, so businesses committing serious compliance failures can expect to be sanctioned. Given the significant societal focus on protecting the rights of the young online, particularly in view of recent tragic cases, and the rapid rate of change in platforms and services, it is small wonder that regulators are looking to adopt a more agile, urgent approach to enforcement.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Bryan Cave Leighton Paisner | Attorney Advertising
