German Telecommunications Company Fined 9.55 Million Euros for GDPR Violation

On December 9, 2019, the German Federal Commissioner for Data Protection and Freedom of Information (BfDI) imposed a 9.55 million Euro fine on the telecommunications company 1&1 Telecom GmbH.  The BfDI found that the authentication procedures used by 1&1’s customer helpline were insufficient and failed to satisfy the requirements of Art. 32 GDPR.  The company announced that it would challenge the order, arguing that the size of the fine is disproportionate.

The BfDI’s investigation was initiated following a complaint by a customer whose mobile telephone number was provided to his former partner in 2018.  The caller had provided only the customer’s name and date of birth to the helpline worker.  According to the company, the helpline employee acted in accordance with the company’s guidelines at the time, which required two-factor authentication and were in line with standard industry practice.  The BfDI found, however, that this procedure exposed “far-reaching information” on customers to unauthorized callers.

The BfDI stated that it is currently investigating other telecommunications providers, relying on its findings in this case as well as on tips from third parties and customer complaints.

We reported on a German supervisory authority’s guidance regarding a similar topic – the requirements for authentication of data subjects exercising information rights under the GDPR – in a July 2019 post.

German Supervisory Authorities Propose Changes to the GDPR

On December 2, 2019, the German Supervisory Authorities issued a report evaluating the implementation of the EU General Data Protection Regulation (“GDPR”) in Germany.  The report describes the Supervisory Authorities’ experience thus far in applying the GDPR and lists the provisions of the GDPR they see as problematic in practice.  For each of these provisions, the report discusses the perceived problem and proposes a solution.

The report begins by noting that the GDPR has significantly increased the workload of German Supervisory Authorities over the past year and a half.  This is due not only to an “enormous growth” in the number of complaints and consultation requests received, but also to the additional work resulting from the GDPR’s cross-border cooperation procedure.  Since the increased workload has not always been met with increased resources, the authorities have found it difficult to effectively supervise compliance.  Controllers are apparently aware of this and, as a result, have neglected their GDPR compliance obligations.


UK ICO and The Alan Turing Institute Issue Draft Guidance on Explaining Decisions Made by AI

The UK’s Information Commissioner’s Office (“ICO”) has issued and is consulting on draft guidance about explaining decisions made by AI.  The ICO prepared the guidance with The Alan Turing Institute, which is the UK’s national institute for data science and artificial intelligence.  Among other things, the guidance sets out key principles to follow and steps to take when explaining AI-assisted decisions — including in relation to different types of AI algorithms — and the policies and procedures that organizations should consider putting in place.

The draft guidance builds upon the ICO’s previous work in this area, including its AI Auditing Framework, June 2019 Project ExplAIN interim report, and September 2017 paper ‘Big data, artificial intelligence, machine learning and data protection’.  (Previous blog posts that track this issue are available here.)  Elements of the new draft guidance touch on points that go beyond narrow GDPR requirements, such as AI ethics (see, in particular, the recommendation to provide explanations of the fairness or societal impacts of AI systems).  Other sections of the guidance are quite technical; for example, the ICO provides its own analysis of the possible uses and interpretability of eleven specific types of AI algorithms.

Organizations that develop, test or deploy AI decision-making systems should review the draft guidance and consider responding to the consultation.  The consultation is open until January 24, 2020.  A final version is expected to be published later in 2020.


New E-Privacy Proposal on the Horizon?

On December 3, 2019, the EU’s new Commissioner for the Internal Market, Thierry Breton, suggested that a change of approach to the proposed e-Privacy Regulation may be necessary.  At a meeting of the Telecoms Council, Breton indicated that the Commission would likely develop a new proposal, following the Council’s rejection of a compromise text on November 27.

The proposed Regulation is intended to replace the existing e-Privacy Directive, which sets out specific rules for traditional telecoms companies, in particular requiring that they keep communications data confidential and free from interference (e.g., preventing wiretapping).  It also sets out rules that apply regardless of whether a company provides telecoms services, including restrictions on unsolicited direct marketing and on accessing or storing information on users’ devices (e.g., through the use of cookies and other tracking technologies).


German Constitutional Court Reshapes “Right to be Forgotten” and Expands Its Oversight of Human Rights Violations

In two recent landmark decisions issued on November 6, 2019, the German Constitutional Court (“BVerfG”) presented its unique perspective on the “right to be forgotten” and announced that, going forward, it will assume a greater role in safeguarding the fundamental rights of German residents.


Commission Expert Group Report on Liability for Emerging Digital Technologies

On November 21, 2019, the European Commission’s Expert Group on Liability and New Technologies – New Technologies Formation (“NTF”) published its Report on Liability for Artificial Intelligence and other emerging technologies.  The Commission tasked the NTF with assessing the extent to which liability frameworks in the EU will continue to operate effectively in relation to emerging digital technologies (including artificial intelligence, the internet of things, and distributed ledger technologies).  This report presents the NTF’s findings and recommendations.


UPDATE: AG Opinion in Schrems II Delayed

The Advocate General’s (“AG”) Opinion in Case C-311/18, Data Protection Commissioner v Facebook Ireland and Maximillian Schrems (“Schrems II”), has been delayed until December 19, 2019.  (Publication was originally scheduled for the week before, on December 12.)

The primary question before the European Court of Justice (“ECJ”), and the AG, in Schrems II is whether the European Commission’s standard contractual clauses (“SCCs”) are valid for transfers of personal data to the United States. Given the widespread reliance on the SCCs for data transfers to the United States and other countries around the world, the ECJ’s judgment is likely to have significant ramifications for many organizations.  The AG’s Opinion, while not binding, will likely give an initial indication of where the ECJ will land.

Covington represents the Software Alliance (“BSA”) in Schrems II and in a second case of equal significance, involving a challenge to the EU-U.S. Privacy Shield. That case, Case T-738/16, La Quadrature du Net and Others v Commission (“LQDN”), is currently pending before the EU General Court. Both the Schrems II and LQDN cases could dramatically affect the global business community.

For a recap of the oral hearing in Schrems II that took place in July this year, please see our client alert here.  Our team will continue to provide updates as the case develops.

UK ICO Publishes New Guidance on Special Category Data

On November 14, 2019, the UK Information Commissioner’s Office (“ICO”) published detailed guidance on the processing of special category data.  The guidance sets out (i) what the special categories of data are; (ii) the rules that apply to the processing of special category data under the General Data Protection Regulation (“GDPR”) and UK Data Protection Act 2018 (“DPA”); (iii) the conditions for processing special category data; and (iv) additional guidance on the substantial public interest condition, including what constitutes an “appropriate policy document”.

Under the GDPR, stricter rules apply to the processing of special category data, which includes genetic and biometric data as well as information about a person’s health, sex life, sexual orientation, racial or ethnic origin, political opinions, religious or philosophical beliefs, and trade union membership.  As noted in the guidance, there is a presumption that “this type of data needs to be treated with greater care” because the “use of this data could create significant risks to the individual’s fundamental rights and freedoms”.  This blog post provides a summary of the key takeaways from the ICO’s guidance.

Democratic Senators Introduce the Consumer Online Privacy Rights Act

On November 26, 2019, a group of Democratic senators introduced the Consumer Online Privacy Rights Act (COPRA).  This comprehensive privacy bill—sponsored by Senators Maria Cantwell (D-WA), Brian Schatz (D-HI), Amy Klobuchar (D-MN), and Ed Markey (D-MA)—would grant individuals broad control over their data, impose new obligations on data processing, and expand the FTC’s enforcement role over digital privacy.

“In the growing online world, consumers deserve two things: privacy rights and a strong law to enforce them,” Senator Cantwell explained. “They should be like your Miranda rights—clear as a bell as to what they are and what constitutes a violation.”

Here are some key elements of the bill:

District of Massachusetts Holds that Suspicionless Searches of Travelers’ Electronic Devices at U.S. Ports of Entry Violate the Fourth Amendment

Last week, in Alasaad v. McAleenan, the U.S. District Court for the District of Massachusetts ruled that the Fourth Amendment requires reasonable suspicion that a traveler is carrying contraband in order to search the traveler’s smartphone or laptop at airports and other U.S. ports of entry.  Judge Denise J. Casper’s decision barred suspicionless or random searches of electronic devices at the border, relying on Riley v. California, in which the Supreme Court held that the Fourth Amendment generally requires the government to obtain a warrant to search cell phones incident to arrest.  Judge Casper reasoned that while “the government’s interest in preventing the entry of unwanted persons and effects is at its zenith at the border,” this interest must be balanced against the “substantial personal privacy interests” implicated by the searches of electronic devices.

The plaintiffs in Alasaad are eleven travelers whose devices were searched, and in some cases confiscated, upon arriving at a U.S. airport from overseas or at a border crossing.  Judge Casper found that federal agents accessed attorney-client communications, information related to a plaintiff’s journalistic activities, and social media postings in carrying out suspicionless searches of the plaintiffs’ cell phones and laptops.  One traveler had “twice had her iPhones searched at the border over her religious objections to having CBP officers, especially male officers, view photos of her and her daughters without their headscarves as required in public by their religious beliefs.”  In another case, federal agents extracted a traveler’s data and retained it for fifty-six days.

Customs and Border Protection (“CBP”) and Immigration and Customs Enforcement (“ICE”) updated their policies in 2018 to require reasonable suspicion or a national security concern to conduct a search—but only for “advanced” (forensic) searches.  In her decision, Judge Casper dismissed the notion that reasonable suspicion should be required only for forensic searches, and not for “basic” manual searches: “a basic search and an advanced search differ only in the equipment used to perform the search and certain types of data that may be accessed with that equipment, but otherwise both implicate the same privacy concerns.”  She noted that electronic devices can contain a large volume of information that can be accessed during even a basic search.  This point differentiates Alasaad from earlier cases in which federal courts of appeals split on whether the government must have some amount of suspicion for forensic searches of devices seized at the border.

Judge Casper rejected the government’s argument that electronic devices could contain information that speaks to a traveler’s admissibility to the United States, noting that the plaintiffs are U.S. citizens and lawful permanent residents, who are admissible “by definition.”  She also expressed skepticism about the government’s argument that requiring reasonable suspicion would “obviate the deterrent effect of the border search exception,” pointing to the lack of information about the prevalence of digital contraband, such as child pornography, entering the United States at ports of entry.

ICE does not track the number of basic searches that it conducts, but CBP alone conducted some 108,000 searches of electronic devices over the past six years.  Under Judge Casper’s decision, federal agents at both agencies are required to demonstrate reasonable suspicion before they can search a traveler’s electronic device.

* Covington participated in the case as counsel for amici curiae Brennan Center for Justice, the Center for Democracy and Technology, the R Street Institute, and TechFreedom.