On December 18, 2020, the Irish Data Protection Commission (“DPC”) published its draft Fundamentals for a Child-Oriented Approach to Data Processing (the “Fundamentals”). The Fundamentals introduce child-specific data protection principles and measures, which are designed to protect children against data processing risks when they access services, both online and offline. The DPC notes that all organizations collecting and processing children’s data should comply with the Fundamentals. The Fundamentals are open for public consultation until March 31, 2021.
On Wednesday, January 13, the Supreme Court heard arguments in AMG Capital Management LLC v. Federal Trade Commission. This case raises the question of whether the Federal Trade Commission (FTC) may properly use Section 13(b) of the FTC Act to obtain monetary relief such as disgorgement or restitution. Section 13(b) authorizes the FTC to seek preliminary and permanent injunctions where it believes the defendant “is violating, or is about to violate, any provision of law enforced by the Federal Trade Commission.” Covington submitted an amicus brief supporting AMG’s challenge to the FTC’s use of Section 13(b).
Although predicting the outcome of any case from the oral argument is notoriously tricky, a majority of Justices appear to have serious questions about the FTC’s interpretation of the statute. In particular, a number of Justices observed that the FTC’s readiness to resort to the relatively quick Section 13(b) route to a broad remedy award renders “irrelevant” or “superfluous” the specific limitations built into other sections of the statute that expressly deal with monetary recovery, in particular Sections 5(l) and 19. Justices also raised concerns about the use of Section 13(b) in cases in which the theory of liability may not have been clearly articulated in advance by the agency, and whether the FTC’s interpretation of the statute raises separation of powers issues for independent agencies.
A decision will be issued by June of this year. In the meantime, efforts have already been launched in Congress to amend the statute to address the issues raised in this litigation.
On December 16, 2020, the German Federal Government adopted a draft law that substantially amends several of Germany’s information technology laws (“IT laws”). These amendments aim to adapt the current legal framework to the increasing digitalization of products and services, the proliferation of IoT products, and the emergence of new cybersecurity threats. The draft law is expected to be enacted by the German Parliament in the first quarter of 2021.
In addition to releasing the new EU Cybersecurity Strategy before the holidays (see our post here), the Commission published a revised Directive on measures for a high common level of cybersecurity across the Union (“NIS2”) and a Directive on the resilience of critical entities (“Critical Entities Resilience Directive”). In this blog post, we summarize key points relating to NIS2, including more onerous security and incident reporting requirements; the extension of those requirements to companies in the food, pharma, medical device, and chemical sectors, among others; and increased powers for regulators, including the ability to impose multi-million Euro fines.
The Commission is seeking feedback on NIS2 and the Critical Entities Resilience Directive, and recently extended its original deadline of early February to March 11, 2021 (responses can be submitted here and here).
On 17 December 2020, the Council of Europe’s Ad hoc Committee on Artificial Intelligence (CAHAI) published a Feasibility Study (the “Study”) on Artificial Intelligence (AI) legal standards. The Study examines the feasibility and potential elements of a legal framework for the development and deployment of AI, based on the Council of Europe’s human rights standards. Its main conclusion is that current regulations do not suffice to create the legal certainty, trust, and level playing field needed to guide the development of AI. Accordingly, it proposes the development of a new legal framework for AI consisting of both binding and non-binding Council of Europe instruments.
The Study recognizes the major opportunities that AI systems offer to promote societal development and human rights. Alongside these opportunities, it also identifies the risk that AI could endanger rights protected by the European Convention on Human Rights (ECHR), as well as democracy and the rule of law. Examples of the risks to human rights cited in the Study include AI systems that undermine the right to equality and non-discrimination by perpetuating biases and stereotypes (e.g., in employment), and AI-driven surveillance and tracking applications that jeopardize individuals’ rights to freedom of assembly and expression.
On January 5, 2021, the Council of the European Union released a new draft version of the ePrivacy Regulation, which is meant to replace the ePrivacy Directive. The European Commission approved a first draft of the ePrivacy Regulation in January 2017, and the draft regulation has since been under discussion in the Council.
On January 1, 2021, Portugal took over the presidency of the Council for six months. Ahead of the next meeting of the Council’s working party responsible for the draft ePrivacy Regulation, the Portuguese Presidency issued a revised version of the draft regulation. This is the 14th draft version of the ePrivacy Regulation (including the European Commission’s first draft).
Once approved, the ePrivacy Regulation will set out requirements and limitations for publicly available electronic communications service providers (“service providers”) processing data of, or accessing devices belonging to, natural and legal persons “who are in the [European] Union” (“end-users”). The regulation aims to safeguard the privacy of end-users, the confidentiality of their communications, and the integrity of their devices. These requirements and limitations will apply uniformly in all EU Member States. However, EU Member States have the power to restrict the scope of these requirements and limitations where this is a “necessary, appropriate and proportionate measure in a democratic society to safeguard one or more of the general public interests.”
Judge Freeman of the U.S. District Court for the Northern District of California dismissed a class action against Google and several YouTube channel owners alleging various violations under California state law. Plaintiffs alleged Defendants infringed their children’s privacy and consumer rights by collecting personal information and delivering targeted advertisements while they viewed child-directed YouTube videos. However, the court found that Plaintiffs’ claims were expressly preempted by the federal Children’s Online Privacy Protection Act (“COPPA”), and dismissed the case with leave to amend.
Last year, Californians passed Proposition 24, also known as the California Privacy Rights Act (“CPRA”). That law makes several changes to the California Consumer Privacy Act (“CCPA”), including some that relate to an organization’s cybersecurity practices.
On December 24th, with a year-end deadline and the holidays fast approaching, European Commission and United Kingdom (“UK”) officials announced that they had reached a deal on the EU-UK Trade and Cooperation Agreement (“Agreement”). Once formally adopted by the European Union (“EU”) institutions, the Agreement will govern the relationship between the EU and UK beginning on January 1, 2021, following the end of the Brexit transition period.
The Agreement is likely to avert a year-end scramble to secure cross-border data transfers between the EU and the UK. Although the final text has not yet been published, a UK government summary of the deal indicates that the parties agreed to allow for the continued free flow of personal data for up to six months to allow time for the EU and UK to adopt mutual “adequacy decisions,” in which each jurisdiction may recognize the other as offering adequate protection for transferred personal data. Absent these adequacy decisions (and the interim period established by the Agreement), organizations would need to consider implementing additional safeguards, such as standard contractual clauses, to transfer personal data between the EU and UK.