On 10 September 2020, the European Commission proposed an interim regulation designed to enable online communications service providers to combat child sexual abuse online. Once in force, this regulation will provide a legal basis for providers to voluntarily scan communications or traffic data on their services for the limited purpose of detecting child sexual abuse material online.

When the European Electronic Communications Code comes into force on 21 December 2020, it will replace the definition of electronic communications services in Directive 2002/58/EC (the ePrivacy Directive) with a new definition that includes online communications services—referred to as “number-independent interpersonal communications services.” Services such as voice-over-IP, webmail, and messaging services will at that point fall within scope of the ePrivacy Directive.

Articles 5(1) and 6 of the ePrivacy Directive impose strict confidentiality obligations on service providers processing communications and traffic data. The Directive does not provide an explicit legal basis for service providers to scan these data for the purpose of detecting child sexual abuse material. As a result, current practices employed by many online service providers to detect and prevent the dissemination of such material may breach their obligations under the ePrivacy Directive.

The proposed regulation provides for a “temporary and strictly limited” derogation from Articles 5(1) and 6 of the Directive, with the sole objective of enabling these service providers to use technologies for the processing of data “to the extent necessary to detect and report child sexual abuse online and remove child sexual abuse material on their services.” The proposed regulation also contains a number of rules on how such technologies may be deployed and imposes reporting obligations. The GDPR will continue to apply to any processing that falls within the scope of this derogation.

Once approved by the European Parliament and the Council, the regulation will be in force from 21 December 2020 until 31 December 2025. Meanwhile, the Commission recently announced plans to introduce legislation by the second quarter of 2021 requiring online service providers to detect child sexual abuse material and report it to public authorities. If this legislation enters into force before 31 December 2025, it should repeal the interim regulation.

This post was written with assistance from Stacy Young, a trainee solicitor in the London office. 


Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as Privacy International and the European security agency, ENISA.

Sam Jungyun Choi is an associate in the technology regulatory group in the London office. Her practice focuses on European data protection law and new policies and legislation relating to innovative technologies such as artificial intelligence, online platforms, digital health products and autonomous vehicles. She also advises clients on matters relating to children’s privacy and policy initiatives relating to online safety.

Sam advises leading technology, software and life sciences companies on a wide range of matters relating to data protection and cybersecurity issues. Her work in this area has involved advising global companies on compliance with European data protection legislation, such as the General Data Protection Regulation (GDPR), the UK Data Protection Act, the ePrivacy Directive, and related EU and global legislation. She also advises on a variety of policy developments in Europe, including providing strategic advice on EU and national initiatives relating to artificial intelligence, data sharing, digital health, and online platforms.