On February 12, 2020, the UK Home Office and Department for Digital, Culture, Media & Sport published the Government’s Initial Consultation Response (“Response”) to feedback received through a public consultation on its Online Harms White Paper (“OHWP”).  The OHWP, published in April 2019, proposed a comprehensive regulatory regime that would impose a “duty of care” on online services to moderate a wide spectrum of harmful content and activity on their services, including child sexual abuse material, terrorist content, hate crimes, and harassment.

While the Response does not indicate when the Government expects to introduce proposed legislation, it provides clearer direction on a number of aspects of the proposed regulatory framework set out in the OHWP, including:

  • Scope of regulation. The OHWP had indicated that the proposed duty of care would apply to any online service that either (1) facilitates the hosting, sharing, or discovery of user-generated content; or (2) facilitates online interactions between users.  The Response clarifies that business-to-business services would fall outside the scope of this definition.  In addition, a business will not be subject to this duty of care simply because, for instance, it has a page on a social media service; in such circumstances, it would be the provider of the social media service, not the company operating the individual page, that would be subject to the duty of care.  However, a business operating its own website that includes functionality allowing people to post content or interact would fall within scope.
  • Scope of the duty of care. Many respondents to the consultation raised concerns that imposing a duty of care on online services to moderate content could infringe the right to freedom of expression.  Acknowledging these concerns, the Government indicated that it will take a different approach to content and activity that is illegal (e.g., hate crimes) than to harmful but legal content (e.g., cyberbullying).  While the duty of care will require companies to expeditiously remove illegal content from their services, they will not have a similar obligation to remove legal content.  Instead, companies will have to state publicly what content and behaviors are unacceptable on their services (for instance, in their terms of service) and have systems in place to enforce these statements consistently and transparently.
  • Role and identity of the regulator. The OHWP indicated that an independent regulator should oversee the regulatory framework.  The Response indicates that the Government is likely to assign this role to Ofcom, the UK’s telecommunications regulator.  Ofcom will not adjudicate individual complaints (i.e., it will not determine whether a specific piece of content should or should not be removed), but instead will oversee companies’ overall compliance with the statutory duty of care, including by auditing service providers’ systems and processes for adjudicating user complaints.  The Response does not clearly define the sanctioning powers that will be available to Ofcom, but it suggests that these may include the power to issue fines, impose liability on senior managers, and, in certain circumstances, require companies to improve their systems or even engage in measures such as ISP blocking.
  • Age verification requirements. The Response indicates that in-scope service providers will need to implement appropriate age verification technologies to prevent children from being exposed to inappropriate content.  It is not clear, however, how widely this obligation will apply; for example, whether it would apply only to services likely to be accessed by children (similar to the standard set out in the ICO’s recent Age Appropriate Design Code of Practice) or to all services within scope.
  • Transparency requirements. Depending on the type of service and risk factors involved, in-scope service providers will need to adopt certain transparency measures.  Ofcom will have the power to require companies to submit annual reports explaining, among other things, the types and prevalence of potentially harmful content on their services, as well as information on the effectiveness of the company’s enforcement procedures.  The UK’s multi-stakeholder Transparency Working Group, chaired by the Minister for Digital and Broadband, will also provide input.

The Response also reflects the Government’s intention to publish interim, voluntary codes of practice on how online service providers should address terrorist and child sexual abuse content available on their services.  The Government hopes to develop these codes and related non-legislative measures with input from affected stakeholders.

Marty Hansen

Martin Hansen has over two decades of experience representing some of the world’s leading innovative companies in the internet, IT, e-commerce, and life sciences sectors on a broad range of regulatory, intellectual property, and competition issues. Martin has extensive experience in advising clients on matters arising under EU and U.S. law, UK law, the World Trade Organization agreements, and other trade agreements.

Paul Maynard

Paul Maynard is special counsel in the technology regulatory group in the London office. He focuses on advising clients on all aspects of UK and European privacy and cybersecurity law relating to complex and innovative technologies such as adtech, cloud computing and online platforms. He also advises clients on how to respond to law enforcement demands, particularly where such demands are made across borders.

Paul advises emerging and established companies in various sectors, including online retail, software and education technology. His practice covers advice on new legislative proposals, for example on e-privacy and cross-border law enforcement access to data; advice on existing but rapidly changing rules, such as the GDPR and cross-border data transfer rules; and advice on regulatory investigations in cases of alleged non-compliance, including in relation to online advertising and cybersecurity.