Paul Maynard is an associate in the technology regulatory group in the London office. He focuses on advising clients on all aspects of UK and European privacy and cybersecurity law relating to complex and innovative technologies such as adtech, cloud computing and online platforms. He also advises clients on how to respond to law enforcement demands, particularly where such demands are made across borders.
Paul advises emerging and established companies in various sectors, including online retail, software and education technology. His practice covers advice on new legislative proposals, for example on e-privacy and cross-border law enforcement access to data; advice on existing but rapidly-changing rules, such as the GDPR and cross-border data transfer rules; and advice on regulatory investigations in cases of alleged non-compliance, including in relation to online advertising and cybersecurity.
European Commission Adopts Final UK Adequacy Decisions
On June 28, 2021, the European Commission adopted two decisions finding that the UK’s data protection regime provides an “adequate” level of protection for personal data transferred to the UK from the EU. The first decision covers transfers governed by the GDPR, and permits private companies located in the EU to continue to transfer personal data to the UK without the need for additional arrangements (such as the Commission’s new Standard Contractual Clauses (“SCCs”), which we discuss here). The second decision covers transfers under the Data Protection and Law Enforcement Directive, and permits EU law enforcement agencies to continue to transfer personal data to their counterparts in the UK.
European Commission Publishes Draft UK Adequacy Decisions
On February 19, 2021, the European Commission published two draft decisions finding that UK law provides an adequate level of protection for personal data. The first would allow private companies in the EU to continue to transfer personal data to the UK without the need for any additional safeguards (e.g., the Commission’s standard contractual clauses), while the second would allow EU law enforcement agencies to transfer personal data subject to Directive 2016/680 — the Data Protection and Law Enforcement Directive (LED) — to their UK counterparts.
Proposed New EU Cyber Rules Introduce More Onerous Requirements and Extend to More Sectors
In addition to releasing the new EU Cybersecurity Strategy before the holidays (see our post here), the Commission published a revised Directive on measures for a high common level of cybersecurity across the Union (“NIS2”) and a Directive on the resilience of critical entities (“Critical Entities Resilience Directive”). In this blog post, we summarize key points relating to NIS2, including more onerous security and incident reporting requirements; the extension of requirements to companies in the food, pharma, medical device, and chemical sectors, among others; and increased powers for regulators, including the ability to impose multi-million Euro fines.
The Commission is seeking feedback on NIS2 and the Critical Entities Resilience Directive, and recently extended its original deadline of early February to March 11, 2021 (responses can be submitted here and here).…
Twitter Fine: a View into the Consistency Mechanism, and “Constructive Awareness” of Breaches
On December 15, 2020, the Irish Data Protection Commission (“DPC”) fined Twitter International Company (“TIC”) EUR 450,000 (USD 500,000) following a narrow investigation into TIC’s compliance with obligations to (a) notify a personal data breach within 72 hours under Article 33(1) GDPR; and (b) document the facts of the breach under Article 33(5) GDPR. The process to investigate these points took a little under two years, and resulted in a decision of nearly 200 pages.
This is the first time that the DPC has issued a GDPR fine as a lead supervisory authority (“LSA”) after going through the “cooperation” and “consistency” mechanisms that enable other authorities to raise objections and the EDPB to resolve disagreements. The length of the process and the details of the EDPB’s binding decision suggest that it was a somewhat arduous one. Several authorities raised objections in response to the DPC’s draft decision, challenging the identity of the controller (the Irish entity and/or its U.S. parent), the DPC’s competence to act as LSA, the scope of the investigation, the size of the fine, and other matters. After some back and forth, in which most authorities maintained their objections despite the DPC’s explanations, the DPC referred the matter to the EDPB under the GDPR’s dispute resolution procedure. The EDPB considered the objections and dismissed nearly all of them as not being “relevant and reasoned”, but did require the DPC to reassess the level of the proposed fine.
Process aside, the DPC’s decision contains some interesting points on when a controller is deemed to be “aware” of a personal data breach for the purpose of notifying a breach to a supervisory authority. This may be particularly relevant for companies based in Europe that rely on parent companies in the US and elsewhere to process data on their behalf. The decision also underlines the importance of documenting breaches and what details organizations should include in these internal reports.
Updated EDPB Guidelines on Consent and Implications for Cookies
On May 4, 2020, the European Data Protection Board (“EDPB”) updated its guidelines on consent under the GDPR. An initial version of these guidelines was adopted by the Article 29 Working Party prior to the GDPR coming into effect, and was endorsed by the EDPB on May 25, 2018.
UK Government Publishes Initial Consultation Response on the Online Harms White Paper
On February 12, 2020, the UK Home Office and Department for Digital, Culture, Media & Sport published the Government’s Initial Consultation Response (“Response”) to feedback received through a public consultation on its Online Harms White Paper (“OHWP”). The OHWP, published in April 2019, proposed a comprehensive regulatory regime that would impose a “duty of care” on online services to moderate a wide spectrum of harmful content and activity on their services, including child sexual abuse material, terrorist content, hate crimes, and harassment.
While the Response does not indicate when the Government expects to introduce proposed legislation, it provides clearer direction on a number of aspects of the proposed regulatory framework set out in the OHWP, including:…
Centre for Data Ethics and Innovation Publishes Final Report on “Online Targeting”
On February 4, 2020, the United Kingdom’s Centre for Data Ethics and Innovation (“DEI”) published its final report on “online targeting” (the “Report”), examining practices used to monitor a person’s online behaviour and subsequently customize their experience. In October 2018, the UK government appointed the DEI, an expert committee that advises the UK government on how to maximize the benefits of new technologies, to explore how data is used in shaping people’s online experiences. The Report sets out its findings and recommendations.
New E-Privacy Proposal on the Horizon?
On December 3, 2019, the EU’s new Commissioner for the Internal Market, Thierry Breton, suggested that a change of approach to the proposed e-Privacy Regulation may be necessary. At a meeting of the Telecoms Council, Breton indicated that the Commission would likely develop a new proposal, following the Council’s rejection of a compromise text on November 27.
Commission Expert Group Report on Liability for Emerging Digital Technologies
On November 21, 2019, the European Commission’s Expert Group on Liability and New Technologies – New Technologies Formation (“NTF”) published its Report on Liability for Artificial Intelligence and other emerging technologies. The Commission tasked the NTF with establishing the extent to which liability frameworks in the EU will continue to operate effectively in relation to emerging digital technologies (including artificial intelligence, the internet of things, and distributed ledger technologies). This report presents the NTF’s findings and recommendations.