A whole new world online: The UK Government introduces the Online Safety Bill to Parliament

Allen & Overy LLP

The genie is out of the bottle on what can only be described as a revolution in the online operations of certain companies providing services in the UK. After a number of developments during February and March 2022, the UK Government has finally introduced the Online Safety Bill (the Bill) to the UK Parliament for legislative scrutiny.

We now have a much better idea of the Bill’s key areas of focus as it goes through the legislative process. Even so, questions remain, not least how Ofcom, which has been charged with promoting online safety, can in practice ensure that companies comply with the proposed new legislation in a way that mitigates the risk of user harm without compromising important freedoms that both companies and their users enjoy.

What’s in the Bill?

The Bill is extensive and, assuming it is passed in similar form, would create a vast new area of law. Its key features are considered below.

Broadly speaking, the Bill will apply to “user-to-user services” and “search services” with links to the UK – the companies providing them are dubbed “service providers” in the draft legislation. User-to-user services are internet services through which content may be generated, uploaded or shared by one user and encountered by another (see “Extra-territorial effect” below in relation to the territorial scope of the Bill). The Bill’s scope is therefore broad and will apply to companies ranging from social media giants to online gaming companies and message boards. It also captures search engines even if they do not provide “user-to-user services”. There are also specific provisions (not addressed further here) directed at regulated providers of pornographic content and, helpfully, the Bill clarifies a number of notable exemptions (including email, SMS and one-to-one aural communications, internal business services and comments/reviews on provider content, amongst others).

Extra-territorial effect

By capturing those services with “links to the UK”, the Bill has extra-territorial effect. A service has links to the UK if it has a significant number of UK users or UK users form one of the service’s target markets. A service will also be considered to have links with the UK if it is capable of being used in the UK by individuals and there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK presented by user-generated content or search content of the service.
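
This two-limb test lends itself to a simple decision rule. Purely as an illustration (all names below are hypothetical and the numeric threshold is a placeholder, since the Bill leaves “significant number” undefined), the test might be expressed as:

```python
from dataclasses import dataclass

@dataclass
class Service:
    uk_user_count: int               # size of the UK user base
    uk_is_target_market: bool        # the UK is one of the service's target markets
    usable_in_uk: bool               # individuals in the UK can use the service
    material_risk_to_uk_users: bool  # reasonable grounds to believe a material risk of significant harm

# "Significant number" is not defined in the Bill; this threshold is a placeholder.
SIGNIFICANT_UK_USERS = 1_000_000

def has_links_to_uk(s: Service) -> bool:
    # Limb 1: a significant number of UK users, or the UK is a target market.
    limb_one = s.uk_user_count >= SIGNIFICANT_UK_USERS or s.uk_is_target_market
    # Limb 2: capable of use in the UK, plus a material risk of significant
    # harm to individuals in the UK from user-generated or search content.
    limb_two = s.usable_in_uk and s.material_risk_to_uk_users
    return limb_one or limb_two
```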

Nature of content and company categorisation

One complexity for companies caught by the Bill will be how they navigate the various types of “content” that the Bill covers to ensure their systems and controls are suitable, as the obligations to which companies are subject vary by the nature of the relevant content. The meaning of “content” under the Bill ranges from illegal content (e.g. content relating to terrorism or child sexual exploitation) to content that is “legal but harmful” to adults or children (i.e. content presenting a material risk of a physical or psychological impact on an adult or child).

Illegal and legal but harmful content may then be further designated as “primary priority” or “priority” content, meaning it is earmarked by the UK Government as content that is subject to additional scrutiny and measures. In addition to content that amounts to terrorism or child sexual abuse offences (covered by the original draft Bill), the Bill currently defines “priority illegal content” to include content that amounts to criminal offences relating to (amongst others) fraud, online drug and weapons dealing, hate crime, the promotion or facilitation of suicide, and sexual exploitation. Previously it was envisaged that companies themselves would need to interpret what type of content would be legal but harmful. However, it is now clear that legal but harmful content will be designated in secondary legislation, which may help companies understand and apply the rules.

A further complexity is that the application of the Bill will not be uniform for all companies caught by it. Companies will be categorised according to whether the service they provide meets certain thresholds.

  • These thresholds are yet to be confirmed (to be specified by further secondary legislation) but will relate, for example, to the number of users and functionality of the services provided.
  • A small group of “Category 1” companies will be subject to the most stringent requirements, including requirements related to legal but harmful content.
  • The majority of companies caught by the regulations will likely be in “Category 2” where the focus of the requirements relates to illegal content.

Duties of care

The UK Government’s refrain during the evolution of the Bill has been that “what is unacceptable offline will also be unacceptable online”. The scale of that task is now apparent: many things will be considered unacceptable – and indeed unlawful – online.

The Bill is organised around a number of “duties of care” with which companies must comply, depending on the nature of their services (user-to-user or search service) and categorisation. Within these duties, companies will need to have regard to different considerations that may apply specifically to adults or children. These duties depend on whether they apply to illegal content, legal but harmful content or both, and include:

Illegal and legal but harmful content duties
  • conducting and maintaining risk assessments in relation to illegal content and legal but harmful content, and taking proportionate steps to mitigate and manage the risk of harm of content to individuals as identified in that assessment;
  • providing transparency in clear, accessible terms of service or a policy statement regarding how illegal and legal but harmful content is to be dealt with and how individuals are protected from the same, and applying those terms consistently;
  • safeguarding freedom of expression, privacy, democratic content and journalistic content (the latter two being relevant only to Category 1 companies), for instance through impact assessments of safety policies and procedures, accounting for such freedoms when making decisions about how to treat content and enabling an expedited complaints procedure;
  • compliance with “the three Rs” requirements: reporting, recordkeeping and review; and
  • duties addressing anonymous online abuse by requiring certain user-to-user service providers to give adults the option to verify their identity and to block users who have not verified theirs (see the sketch after these lists).
Illegal content duties
  • using proportionate systems and processes to minimise the risk of individuals encountering priority illegal content, and to remove other illegal content swiftly when alerted to it. While the user sharing the relevant content would be the party committing the criminal offence itself, service providers will need to take proactive steps to identify priority illegal content and prevent other users from seeing it; and
  • a separate duty for providers of Category 1 services to put in place proportionate systems to prevent (or, in the case of Category 2A services, minimise) paid-for fraudulent adverts on those services and to remove them when made aware of them. This is a response to calls for online advertising reform and sits alongside an ongoing consultation covering the wider industry.
Legal but harmful content duties
  • in some cases, dealing with content that, despite the fact it is legal, is harmful to adults and/or children, including for example meeting Ofcom content notification requirements and using proportionate systems and processes designed to prevent children from encountering harmful content; and
  • Category 1 service providers will also be required to provide tools to allow adults to opt out of exposure to certain harmful content that might otherwise be tolerated by the service (a minimal sketch of how these user-facing filters might work follows below).
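
To illustrate how the user-verification and opt-out duties above might translate into systems and controls, consider the following minimal, hypothetical sketch. The Bill does not prescribe any particular design; all names here are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    user_id: str
    identity_verified: bool = False
    # Hypothetical per-user preferences reflecting the duties described above.
    block_unverified_users: bool = False                 # filter out non-verified users
    opted_out_categories: set = field(default_factory=set)  # harmful categories to suppress

@dataclass
class Post:
    author: User
    body: str
    harm_categories: set = field(default_factory=set)  # e.g. labels applied by a classifier

def visible_to(viewer: User, post: Post) -> bool:
    """Apply the viewer's verification and content preferences to a post."""
    # Anonymous-abuse duty: adults may block content from non-verified users.
    if viewer.block_unverified_users and not post.author.identity_verified:
        return False
    # Category 1 opt-out duty: suppress legal but harmful categories
    # that this viewer has opted out of seeing.
    if post.harm_categories & viewer.opted_out_categories:
        return False
    return True
```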

Separately, the Bill requires companies to report child sexual exploitation and abuse content detected on their platforms to the National Crime Agency.

Service providers will need to consider if and how they must change their existing infrastructure to comply with this new layer of regulation, although some of it will likely mirror steps they have already taken voluntarily in the era of ‘self-regulation’. This may require a material change to the way in which service providers operate: taking a more proactive approach to relevant content rather than addressing its removal only once it has been reported. In particular, the UK Government has advised that service providers will need to ensure that “features, functionalities and algorithms of their services are designed to prevent their users encountering [this content] and minimise the length of time this content is available”. Whether this is achieved through automated, algorithmic mechanisms or human intervention, service providers will need to consider whether their existing policies, procedures, systems and controls remain adequate under the new legal framework.
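
By way of a concrete (and deliberately simplified) example, a proactive moderation step might look something like the sketch below. Everything here is hypothetical: the Bill does not prescribe any particular technology, and the labels, thresholds and classifier are invented for illustration.

```python
from enum import Enum, auto

class Verdict(Enum):
    ALLOW = auto()
    REMOVE = auto()
    HUMAN_REVIEW = auto()

# Illustrative labels only; the actual categories will be set out in legislation.
PRIORITY_ILLEGAL = {"terrorism", "child_sexual_abuse", "fraud"}

def classify(content: str):
    """Hypothetical classifier returning a (label, confidence) pair.
    A real provider would call an in-house model or a moderation service here."""
    return ("benign", 0.0)  # placeholder stub

def moderate(content: str) -> Verdict:
    label, confidence = classify(content)
    if label in PRIORITY_ILLEGAL and confidence >= 0.9:
        # High-confidence priority illegal content is removed proactively,
        # before other users encounter it.
        return Verdict.REMOVE
    if label in PRIORITY_ILLEGAL or confidence >= 0.6:
        # Borderline cases go to human review, which helps guard against
        # over-removal of content protected by freedom of expression.
        return Verdict.HUMAN_REVIEW
    return Verdict.ALLOW
```

Where those thresholds sit, and how quickly mis-classifications can be corrected, are precisely the kinds of open questions raised below.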

Given the nature of the task, we expect that useful lessons can be learned from the experience of regulated financial services sector firms, particularly around designing appropriate policies, procedures, systems and controls to facilitate compliance.

However, this exercise will be all the more challenging as the proposed new duties on companies caught by the Bill are not always clear or aligned. For example:

  • What factors will service providers be required to take into account when performing risk assessments for relevant, specified harms?
  • How will service providers set thresholds to detect content that may involve illegal conduct without impinging on freedom of expression or democratic content?
  • If a system is moderating or removing content that should be protected by (for example) freedom of expression, what will service providers do to adjust this and how quickly can they react?
  • Who, within the service provider, will perform an oversight role in respect of the relevant systems and controls, and how will issues be escalated to them?
  • How will service providers organise themselves and implement procedures to respond to objections raised by users about the moderation or removal of content? In doing so, how will they demonstrate compliance to Ofcom when it performs its oversight function?

There are clear enforcement risks where service providers perform inadequate risk assessments, where complaints pile up, where the systems that companies deploy are too liberal in removing content, or where there is ineffective oversight of the systems and controls in place. It is therefore important to consider carefully how the Bill may affect your operations.

Ofcom codes of practice

Many questions regarding the necessary operational changes required by the new law may be answered (or, at least, guided) by the codes of practice that Ofcom is to consult upon and then publish. The codes of practice are intended to describe the recommended steps that service providers should take to comply with the relevant duties in practice and will set expectations around the use of proactive technologies relating to content moderation, user profiling and behaviour identification to protect users.

To demonstrate compliance with the legislation, it will therefore be important for service providers to be able to evidence what they have done and how that sits against the recommendations in codes of practice. Failure to do so could lead to enforcement action.

Ofcom enforcement powers

Ofcom will have powers to investigate potential breaches of the requirements. This includes powers to require interviews or information during an investigation, as well as to impose a penalty on a service provider.

The penalties may have serious implications for a service provider. Ofcom will have powers to impose a significant financial penalty (the higher of GBP 18 million or 10% of annual global turnover – higher than the current penalties for a GDPR breach) or even to block access to sites, the latter rendering the service effectively inaccessible. To illustrate the scale, a provider with annual global turnover of GBP 1 billion could face a fine of up to GBP 100 million.

Ofcom can also issue “technology warning notices” if it considers there are reasonable grounds for believing that the service provider is failing in its duties relating to terrorism or child sexual exploitation and abuse content. These notices will require the service provider to use accredited technology to identify and remove relevant content.

Ofcom also has a more general power to require information in the exercise of its online safety functions outside the context of an investigation.

Criminal offences under the Bill

Various criminal offences may also be committed in relation to Ofcom’s powers. For example, an offence may be committed where a person fails to comply with an information notice requirement, fails to provide a document, provides a document knowing (or being reckless as to the fact) that it is false in a material respect, or intentionally encrypts information in such a way that it is not possible for Ofcom to understand it. A failure to attend an Ofcom interview without a reasonable excuse will also be a criminal offence.

Senior managers in the online safety context

A novel aspect of the proposed regime – at least in the technology sector – is the role of service providers’ senior managers. A senior manager in this context is an individual who plays a significant role in managing or organising a service provider’s relevant activities or making decisions relating to the management and organisation of the relevant activities.

There is a proposed requirement for service providers to name a senior individual in response to an Ofcom information notice, thereby encouraging a form of individual accountability for the entity’s provision of information to Ofcom.

Related to this are proposed criminal offences that may be committed by the senior manager themselves. These relate to:

  • a failure to comply with Ofcom’s information notice;
  • the provision of false or encrypted information to Ofcom in response to an information notice; and
  • the destruction of information.

An offence will be committed where the senior manager has failed to take “all reasonable steps” to prevent the commission of the offence by the service provider. These offences would be punishable with up to two years’ imprisonment or a fine.

The nature and extent of senior manager responsibility has been considerably beefed up since the publication of the original draft Bill in May 2021. Previously, Ofcom’s powers relating to senior manager liability were planned to be deferred for two years. However, the current draft Bill envisages that senior manager liability will apply a mere two months after the Bill becomes law.

Service providers should therefore begin to think about who should take on this responsibility and whether they have the necessary framework to ensure that the relevant individual is able to perform the role, given the seriousness of the implications of failure to do so.

Does the Bill fit the bill?

The draft legislation is still in the early stages of Parliamentary scrutiny with some way to go before it becomes law. While the intentions of the Bill are noble, albeit ambitious, there is an increasing risk of it becoming unwieldy as it has expanded to cover a broad range of online activity. Concerns have also been raised within the tech industry that the wide-reaching reforms would mean the UK regime for intermediary liability is out of step with – and much more stringent than – other jurisdictions.

Service providers will need clear guidelines from Ofcom’s codes of practice to understand what compliance with the new laws requires in practice, and will then need to build a comprehensive compliance programme to implement the myriad new rules. Secondary legislation is set to play a key role in defining the scope of the Bill’s application, which itself creates further uncertainty: that legislation has not yet been published and will necessarily be subject to less parliamentary scrutiny.

The accelerated commencement of the powers relating to senior managers is clearly intended to encourage service providers to comply from the outset, as failure to do so could have serious implications for those senior managers in very short order. Whether the Bill could go further in identifying the roles and responsibilities of senior managers in the same way as (for example) the financial services sector remains to be seen. In any event, the individuals performing these roles should seek to engage now and be cognisant of their and their employers’ new obligations.

As the Bill engages a number of important but conflicting rights, it must strike the right balance between its competing objectives: preventing the publication of illegal or harmful content; protecting social and moral rights such as freedom of expression; and remaining practical and proportionate in its implementation by companies subject to the regime.

Companies operating online should continue to monitor the Bill’s progress carefully and engage at an early stage with what compliance will require in practice, particularly to avoid enforcement action down the line.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© A&O Shearman | Attorney Advertising
