When the UK Government first introduced the Online Safety Bill (the "Bill") to Parliament, it lauded the Bill as creating "world-leading online safety laws" which "marked a milestone in the fight for a new digital age".1 The Bill is currently at committee stage, where it is being scrutinised by MPs who have heard evidence from tech giants including TikTok and Twitter, alongside online safety advocates like the National Society for the Prevention of Cruelty to Children and the Center for Countering Digital Hate.
If enacted, the Bill will create a new regulatory and enforcement framework requiring Online Content Providers ("OCPs") to tackle illegal and other harmful content on their services. It mirrors laws recently proposed in the EU and US, and has the potential to be a leading legislative model for other countries seeking to improve online safety by regulating OCPs.
This OnPoint summarises the main elements of the UK Bill, examines the global trend towards increased regulation of OCPs, and considers how that trend will affect OCPs providing online services around the world.2
New Regulatory Obligations
As a minimum, the Bill will impose statutory duties of care on social media platforms, online forums and search engines that host user-generated content, as well as on sites featuring adult content ("OCPs"), to:
- Assess their user base and the risks of harm to users from content on the service (and update the risk assessment as and when that risk profile changes);
- Take active steps to mitigate the risks of harm to individuals arising from illegal content and activity, and for services accessed by children, activity that is harmful to children (what will amount to harmful activity will be defined in regulations);3
- Implement systems and processes to allow the reporting of specified types of content;
- Establish adequate complaints procedures for specified content; and
- Put in place systems and processes to ensure that criminal content is reported to the UK National Crime Agency.4
The Bill also requires the Secretary of State to pass regulations specifying threshold conditions by which OCPs’ services will be categorised as Category 1, Category 2A or Category 2B. Additional duties will be imposed on OCPs providing Category 1 services, including an enhanced duty to carry out and record risk assessments, a duty to protect adult users’ online safety, a duty to empower users to take greater control over their exposure to harmful content, and duties to protect content of democratic importance and journalistic content.5 The threshold conditions will be set with reference to the OCP’s number of users and the functionality of its services.6
The Bill also imposes various duties relating to transparency, reporting, user identity verification and payment of fees. The administrative burden on OCPs will be substantial: the Government estimates that OCPs will collectively spend between £50 million and £95 million on transition costs, followed by around £290 million in annual costs thereafter.7
Investigation, Enforcement and Penalties
The Bill gives Ofcom, the UK regulator responsible for communications services including TV, radio and postal services, responsibility for enforcement and oversight of the regime. It gives Ofcom new criminal investigatory and enforcement powers, including the power to compel OCPs to provide information and to require witnesses to attend interviews. It also gives Ofcom new powers of entry, inspection and audit.8 It creates criminal offences relating to failure to cooperate with Ofcom's investigatory measures, and provides for the possibility of joint liability for parent and subsidiary companies in certain cases where Ofcom deems it appropriate.9 It will also allow Ofcom to apply to the English courts for "business disruption orders" requiring the withdrawal of services or, in extreme cases, the blocking of access to non-compliant OCP services.10
The Bill stops short of imposing criminal liability on OCPs for failing to comply with their statutory duties, but in such cases Ofcom may impose financial penalties of up to £18 million or 10% of the OCP’s qualifying worldwide revenue in the most recent complete accounting period, whichever is the greater.11 Where two or more entities are jointly and severally liable for a penalty, the maximum penalty will be the greater of either £18 million or 10% of the qualifying worldwide revenue for the group.12
Separately, the Bill updates some of the existing communications offences in England and Wales, creating three new offences of (i) sending online threats and harassment, (ii) sending false communications with intent to cause psychological or physical harm, and (iii) sending unsolicited sexual pictures and videos. The purpose of the new offences is to enhance the protection of vulnerable users and to reduce online abuse. The offence of sending false communications will require prosecutors to show that the person sending the message knew at the time of sending that the message was false, and that it was likely to cause non-trivial psychological or physical harm to its audience.13 The offence cannot therefore be committed by an OCP whose users publish false information on its platform, unless the prosecution can show that the OCP knew at the time the message was sent that it was false and likely to cause non-trivial psychological or physical harm to its audience.
The response to the Bill has been varied. Gill Whitehead, the head of the UK Digital Regulation Cooperation Forum, a new group created to streamline internet regulation, expressed concern that the law could "weigh down small businesses with new costs" and stifle important innovation.14 Others believe it does not go far enough. The End Surveillance Advertising to Kids coalition notes that the Bill does not restrict OCPs' powers to collect and use data to aim targeted advertising at children, unless the advertising in question falls within the definition of harmful content.15 Whichever side of the argument you favour, there is no denying that the Bill will impose additional administrative and financial burdens on the OCP market.
The UK is not alone in legislating to regulate OCPs. In April, the EU agreed the text of a new Digital Services Act ("DSA") which, similarly to the UK Bill, aims to increase accountability for online platforms regarding illegal and harmful content.16 Separately, the EU has agreed the text of a new Digital Markets Act ("DMA") which will increase competition by requiring companies that provide browsers, social networks and search engines designated as "gatekeepers" (those with at least 45 million monthly end users and at least 10,000 yearly active business users in the EU) to allow users greater flexibility to uninstall pre-installed apps, to provide interoperability between different apps and services, and to curtail targeted advertising.17 The DMA and DSA will be enforced by the European Commission and are currently awaiting formal approval by the EU Parliament and Council later this year.18
Further afield, the US is also making inroads in this area, following criticism that it has historically failed to regulate big platform companies operating in its own backyard.19 In February 2022, US senators introduced the Kids Online Safety Act to Congress, seeking to impose a duty on OCPs to prevent the promotion of harmful and criminal activity, and to limit exposure to harmful behaviours such as suicide, self-harm, eating disorders and substance abuse. In fact, the US Bill goes further than the UK Bill in that it will require OCPs to give parents greater control to opt out of data mining and algorithmic recommendations.20
Some critics of the EU and UK legislators argue that the trend leans towards protectionism but, as the US introduces its own draft legislation at a federal level, that argument is falling away. Whatever your perspective, it is clear that OCP regulation is a near-term certainty, and OCPs need to start preparing quickly. The financial costs and administrative burden will be significant. The UK Government intends to raise the funds to cover the costs of regulation in the UK from industry in the form of fees.21 Add to that the costs of risk assessments, implementation of mitigation measures, transparency reporting and cooperation with regulatory investigations, and the financial cost to OCPs will increase sharply as and when these Bills become law. OCPs are already feeling the pinch as investors get nervous about the potential impact of the proposed laws on OCP profitability.
1) Press Release “World-first online safety laws introduced to Parliament”, 17 March 2022.
2) Unless specified, all provisions cited are taken from the Online Safety Bill, Bill 4, 53/8.
3) Section 53.
4) Online Safety Bill Explanatory Notes, Bill 258-EN (“Explanatory Notes”), paragraph 19.
5) Part 3, ss 12 – 16.
6) Schedule 10, Paragraph 1.
7) Online Safety Bill Impact Assessment, Full Economic Assessment, Page 2, see here.
8) Schedule 11.
9) Section 161 and Schedule 14.
10) Explanatory Notes, paragraph 573.
11) Section 122(4)(1).
12) Section 122(5)(2).
13) Section 151.
14) Financial Times, Online safety bill risks stifling start-ups, says UK tech regulator chief, 28 April 2022, see here.
15) End Surveillance Advertising to Kids coalition—written evidence to the House of Lords Communications and Digital Committee inquiry into Digital Regulation, dated October 2021 (DRG0017), see here.
16) See here.
17) See here.
18) See here.
19) For example see Brookings’ online article “U.S. regulatory inaction opened the doors for the EU to step up on internet”, 29 March 2022.
20) See here.
21) See here.