DCMS Online Harms white paper and regulation of internet companies

BCLP

Summary

Online content in the UK is to be regulated by a new internet regulator. A white paper consultation was published on 8 April 2019 by the UK government Department for Digital, Culture, Media & Sport (DCMS). The proposals include a new “duty of care” for internet companies. They focus especially on combating online child sexual exploitation and abuse (CSEA) and terrorist content, alongside a range of other online harms, from modern slavery to trolling.

1. What was published?

The consultation published by DCMS is entitled “Online Harms White Paper” and is open until 1 July 2019. It runs to nearly 100 pages and poses 18 consultation questions.

2. What is the scope of the proposed regulation?

The regulatory framework will apply to companies that provide services or tools that enable users to access user-generated content, or to interact with each other online. Companies in scope include social media platforms, file hosting sites, public discussion forums, messaging services and search engines. Companies based outside the UK may be required to appoint a UK or EEA-based nominated representative. Companies will be required to combat illegal content and activity, as well as behaviour that is considered harmful but is not necessarily illegal.

3. Which body will be the regulator and how will it be funded?

The white paper consults on whether the online harms regulator should be a new or existing public body. Existing regulators for online and broadcast content include Ofcom and the Information Commissioner. Video sharing platforms will be regulated from 2020 under the revised Audiovisual Media Services Directive. The regulator will be funded by industry in the medium term, for example by fees, charges or a levy.

4. What online harms will be targeted?

The online harms in scope include, for example:

  • Child sexual exploitation and abuse (CSEA)
  • Hate crime
  • Terrorist content and activity
  • Encouraging or assisting suicide
  • Organised immigration crime
  • Incitement of violence
  • Modern slavery
  • Sale of illegal goods/services, such as drugs and weapons
  • Extreme/revenge pornography
  • Content illegally uploaded from prisons
  • Harassment and cyberstalking
  • Sexting of indecent images by under 18s

The scope could also extend to harms with a less clear definition, such as:

  • Cyberbullying and trolling
  • Disinformation
  • Extremist content and activity
  • Violent content
  • Coercive behaviour
  • Advocacy of self-harm
  • Intimidation
  • Promotion of Female Genital Mutilation (FGM)

Also within the scope of online harms is children’s exposure to legal but inappropriate content: children accessing pornography or other unsuitable material, including under-13s using social media and under-18s using dating apps. Even excessive screen time for children is proposed to be in scope.

5. What approach will the regulator take?

The regulator will promulgate codes of practice. These will outline the systems, procedures, technologies and investment in the staffing, training and support of human moderators that companies will need to adopt to help demonstrate they have fulfilled their statutory duty of care. Companies will need to do what is “reasonably practicable”, a test familiar from other fields of regulation such as health and safety.

If companies choose not to follow the codes of practice, they must explain and justify their alternative approach. As part of any regulatory action, the regulator will assess how companies enforce their terms and conditions. Those terms and conditions must be clear and accessible, including to children and other vulnerable users.

6. What will companies need to do?

Companies will not be expected to undertake general monitoring of all communications on their online services, as this would be disproportionate and impact user privacy. Furthermore, any requirements to scan or monitor content for tightly defined categories of illegal content will not apply to “private” channels, a term which is to be defined. However, specific monitoring may be mandated where there is a threat to national security or the physical safety of children.

The regulator will have the power to require annual reports from companies including on:

  • Evidence of effective enforcement of the company’s own terms and conditions.
  • Processes for reporting illegal and harmful content and behaviour, and how many of those reports led to action.
  • Proactive use of technological tools to identify, block or remove illegal or harmful content.
  • Measures to uphold fundamental rights, ensuring decisions to remove content, block and/or delete accounts are well-founded, especially when automated, and that users can appeal effectively.

The regulator will also expect companies to cooperate with UK law enforcement and other official bodies, and to invest in raising user awareness of online harms.

7. What sanctions are there for enforcement?

The regulator will have the power to issue substantial fines. The government is also consulting on powers that would enable the regulator to disrupt the business activities of a non-compliant company, measures to impose liability on individual members of senior management (civil fines or criminal liability), and measures to block non-compliant services. ISP blocking is described as an enforcement option of last resort.

8. Why now?

In May 2018, the government responded to the Internet Safety Strategy Green Paper consultation, setting out the role of transparency and reporting, with the first annual transparency report expected later in 2019. A House of Lords Select Committee paper published on 9 March 2019 also addressed the challenges of “Regulating in a digital world”.

However, the white paper most immediately responds to the recent terrorist attack in Christchurch, when footage of the attack was widely shared across internet platforms, and to the death of teenager Molly Russell, who had viewed images of self-harm online before taking her own life. The paper also seeks to present itself as a response to the recent rise in knife crime in the UK, where rival gangs are said to use social media to glamorise weapons, gang life and violence.

The consultation should also be seen in the context of public condemnation of privacy abuses by Cambridge Analytica, and online terrorist propaganda from Daesh.

9. How effective is internet regulation likely to be?

The government proposes a proportionate approach, focusing first on larger internet companies and on services where the risk of harm is greatest, or where children or other vulnerable users are at risk. Public opinion following events such as the Christchurch attack and the Cambridge Analytica affair is already causing a shift in approach at some of the larger internet companies. Formal regulation can only reinforce this trend. However, it is likely to take a long time for changes in approach to be widely adopted by start-up companies and by companies operating outside the UK.

10. What should companies do now?

Legislation is not expected until late 2019 or 2020. Internet companies should treat the white paper as a clear indication of the direction of travel. Although it may be some time before the regulator is in place, and although enforcement may seem remote, the white paper states an expectation that companies will take action now. There is also a widely shared body of public opinion that things need to change.

Companies would be well advised to heed the spirit of the white paper now: when re-drafting user terms and conditions, when designing new online services and platforms, and when considering their approach to online content review procedures.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© BCLP | Attorney Advertising
