UK publishes Online Safety Bill: a new era in digital regulation

Hogan Lovells

More than two years after the UK Government first laid out its plans to introduce a new regulatory framework for “online harms”, the Online Safety Bill was published on 12 May 2021. The Bill imposes duties of care on providers of digital services, making them responsible for illegal and harmful content generated and shared by their users, as well as duties to protect users’ rights to freedom of expression and privacy. Ofcom will oversee and enforce the new regime and will be responsible for publishing codes of practice setting out recommended steps that businesses can take to fulfil their duties. The Bill will be subject to pre-legislative scrutiny by a joint committee of MPs and peers before being formally introduced to Parliament later this year, so businesses should now consider the Bill’s contents, engage with the committee and prepare for the new framework.

Key takeaways

  • As expected, the focus of the regime is on the adequacy of systems and processes, rather than individual content moderation decisions.

  • The regulatory approach is principles-based. Businesses must either comply with the steps recommended in Ofcom’s codes of practice or justify a departure from those steps by reference to “online safety objectives” set out in the Bill.

  • Ofcom has wide-ranging powers and significant teeth, both in setting expectations via the codes of practice (and thereby defining the parameters of the relevant duties) and in monitoring compliance and taking enforcement action.

Who is affected?

The businesses in scope are providers of “user-to-user services” and “search services”, as defined by Clause 2 of the Bill. In short, services are in scope if they:

  • allow users to share, generate or upload content online which may be encountered by others through a functionality on the service; or

  • are, or include, a search engine.

A broad range of businesses is captured by the definition of “user-to-user service” (including social media platforms, online marketplaces, dating apps, review websites, forums etc.) because of the wide meaning given to the term “functionality” by the Bill.

Notably, and as foreshadowed by the Government’s December 2020 full consultation response, services that allow only private online interactions between users, such as direct private messaging, are in scope.

The Bill has extra-territorial effect and applies to services provided from outside the UK, as long as they have “links with the UK”, i.e. the service must have a significant number of UK users; the UK must be a target market for the service; or the service must be capable of being used in the UK and give rise to a material risk of significant harm to individuals in the UK.

Differentiated obligations

Providers of user-to-user services which meet certain thresholds will be designated “Category 1 services” and subject to additional duties in respect of legal content that is harmful to adults, “content of democratic importance” or “journalistic content” (see below).

Category 1 services will be designated by Ofcom in accordance with thresholds set out in regulations made by the Secretary of State under the Bill. Those thresholds will relate to the number of users of the service and its functionalities.

What are the new duties?

Different duties apply to user-to-user services and to search services, with additional duties for Category 1 services.

All user-to-user services will be required to:

  • identify and assess risks arising from (i) illegal content and (ii) (if the service is likely to be accessed by children) legal content that is harmful to children;

  • take steps to mitigate the risks identified and design and use proportionate systems and processes to minimise the presence of illegal content on the service and remove that content when made aware of it;

  • make clear in accessible terms of service how individuals are to be protected and apply those terms consistently;

  • have regard to users’ rights to freedom of expression and privacy when designing and implementing user safety policies and procedures;

  • establish user reporting mechanisms that allow users to report content they consider to be illegal or harmful, as well as effective, easy-to-use and transparent complaints procedures; and

  • keep records of risk assessments and regularly review compliance with relevant duties.

Importantly, “illegal content” is defined as content that amounts to a relevant criminal offence; content giving rise only to civil liability, such as defamatory content, is therefore outside the scope of the Bill.

Category 1 services will be subject to additional duties to:

  • assess and mitigate the risk of content that is legal but harmful to adults;

  • take into account the importance of “content of democratic importance” (e.g. content promoting or opposing a political party) when deciding whether to take such content down, restrict access to it or take action against the users who share it, and ensure that a diversity of political opinion is treated equally in doing so; and

  • take into account the importance of “journalistic content” when deciding whether to take such content down, restrict access to it or take action against the users who share it, and make available a dedicated and expedited complaints procedure in respect of journalistic content.

The Bill makes separate provision for search services, but they are subject to duties similar to those imposed on user-to-user services that are not in Category 1 (e.g. to carry out risk assessments, design and use systems and processes to address illegal content, protect freedom of expression and privacy, and put in place user reporting and complaints procedures and record-keeping and review mechanisms). There are additional duties for search services that are likely to be accessed by children to address content that could be harmful to them.

Oversight and enforcement

Under the Bill, Ofcom is required to produce specific codes of practice in relation to the duties to address illegal terrorist content and child sexual exploitation and abuse content.

Ofcom is also required to produce one or more codes of practice in respect of other relevant duties under the Bill.

The codes of practice must be produced in line with the “online safety objectives” specified in Clause 30 of the Bill, which relate mainly to the adequacy and proportionality of services’ systems and processes for risk management and compliance.

The codes of practice must describe “recommended steps for the purposes of compliance”. As such, businesses will have scope to depart from the codes of practice, but will be required to have regard to the online safety objectives and the rights of users in doing so, and it will ultimately be for Ofcom to decide whether they have complied with the relevant duties.

Ofcom has extensive information-gathering powers under the Bill to support its enforcement functions, including powers to require the provision of information, investigate potential breaches, require a report from a skilled person into potential breaches and even enter and inspect premises.

Ofcom also has a range of sanctions available to it, including:

  • issuing a notice of non-compliance;

  • imposing financial penalties of up to £18 million or 10% of global turnover, whichever is greater; and

  • taking so-called “business disruption measures”, under which Ofcom could apply for a court order requiring the withdrawal of ancillary services (e.g. payment services) provided to the non-compliant business or blocking user access to the non-compliant service.

The Bill provides for senior managers of service providers to be criminally liable for a failure to comply with an Ofcom information notice, but this provision will not be brought into effect until after a Government review of the entire framework, which is to take place at least two years after the regime comes into effect.

Next steps

The Bill will now undergo pre-legislative scrutiny by a joint committee of MPs and peers, whose members have not yet been named. The committee is expected to take evidence and report back to the Government with non-binding recommendations on the Bill, and businesses may be able to submit evidence to it. The Bill will be formally introduced to Parliament later this year.

In the meantime, businesses can start to prepare. Many of the Bill’s requirements (e.g. risk assessments, risk mitigation measures, transparency of terms of service) also appear in the EU’s draft Digital Services Act, but there are important differences between the regimes, and businesses providing services to UK users from the EU will need to consider carefully how to comply with both in parallel.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Hogan Lovells | Attorney Advertising
