Last week, a California magistrate judge denied federal prosecutors’ application for a search warrant on the grounds that law enforcement cannot force people to unlock their phones using biometric features, such as fingerprints and facial recognition.
On January 21, 2019, the French Supervisory Authority for data protection (“CNIL”) issued a fine of €50 million against Google for violations of the General Data Protection Regulation (“GDPR”) (the decision was published in French here). The CNIL’s decision was triggered by complaints from two non-profit organizations together representing 9,974 individuals. The case raises a number of important privacy issues.
In addition, the CNIL maintains that its conclusion is supported by Google itself, which stated publicly that it would take steps to bolster the decision-making power of its Irish main establishment by January 2019. The CNIL appears to have used the May 2018–January 2019 window to intervene and hand down its decision. With no main establishment in the EU, Google LLC could potentially be subject to enforcement by any supervisory authority in the EU where Google has an establishment, including France. The decision demonstrates a willingness by regulators to interpret the “main establishment” concept restrictively, which, for non-EU headquartered companies, could render the one-stop-shop redundant and expose them to enforcement by several authorities.
Second, the decision is vague on how the amount of the fine was calculated. However, because the fine exceeds €20 million, it must be based on the GDPR’s 4%-of-worldwide-turnover threshold. Given Google France’s “limited” turnover, the fine is clearly based on the turnover of Alphabet, the holding company. This is interesting: it is well known that the GDPR is unclear as to the basis on which the 4% should be calculated. By using the turnover of the holding company as a basis, the CNIL is setting the scene for a protracted legal battle. For the outcome, we invite readers to continue following this blog for the next three to five years.
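The fining mechanism at issue can be sketched as a simple calculation. Under GDPR Article 83(5), the maximum fine for this category of infringement is the higher of €20 million or 4% of total worldwide annual turnover of the preceding financial year; the turnover figure below is purely illustrative, not Alphabet’s actual revenue.

```python
def gdpr_max_fine(worldwide_turnover_eur: float) -> float:
    """Maximum fine under GDPR Article 83(5): the higher of EUR 20 million
    or 4% of total worldwide annual turnover of the preceding year."""
    return max(20_000_000.0, 0.04 * worldwide_turnover_eur)

# A fine above EUR 20 million can only rest on the 4% turnover limb,
# which exceeds the fixed cap once turnover passes EUR 500 million.
print(gdpr_max_fine(100_000_000_000))  # hypothetical EUR 100 bn group turnover
```

This also illustrates why the choice of entity matters: computed on a small national subsidiary’s turnover, the 4% limb would rarely exceed the €20 million floor, while on a holding company’s turnover it can dwarf it.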
In terms of the amount of the fine, the CNIL puts forward four points:
- the nature of the infringement: according to the CNIL, Google has infringed two fundamental principles of data protection: the principle of transparency (i.e., the obligation to inform individuals about the processing of their personal data) and the principle of lawfulness (i.e., the obligation to link each data processing activity to one of the legal bases listed in Article 6 of the GDPR). According to the CNIL, these principles translate into fundamental rights for individuals to keep control over their personal data;
- the duration of the infringement: the CNIL noted that the infringement was ongoing and had not been remedied, notwithstanding the CNIL’s position that it violates the GDPR;
- the scope of the infringement: in calculating the fine, the CNIL took into account Google’s prominent position in the French market for operating systems, the number of individuals who use Google’s services, the amount and variety of personal data processed, and the “unlimited” possibility Google has to match data (allowing for “massive and intrusive” processing of users’ personal data);
- the gain obtained from the infringement: the CNIL takes the position that, in light of the benefits Google derives from its data processing activities (in particular from its online advertising services), Google must pay particular attention to ensuring that its processing activities comply with the GDPR.
On the substance, the CNIL’s decision focuses on two main aspects: (i) violation of Google’s transparency obligations under the GDPR (specifically under Articles 12 and 13) and (ii) the lack of a legal basis for processing personal data (a requirement under Article 6 GDPR).
Violation of Transparency Obligations
Under the GDPR, a controller must provide individuals with information relating to the processing of their data in a “concise, transparent, intelligible and easily accessible form, using clear and plain language”. According to the CNIL, individuals installing the Android software and signing up for a Google account are provided with “scattered” information spread over different policies and notices. The CNIL takes the position that this makes it hard for users to find some of the information required under the GDPR.
According to the CNIL, the information Google provides does not allow users to “sufficiently understand” the particular consequences of Google’s data processing activities, which the CNIL characterizes as “particularly massive and intrusive.” According to the CNIL, the information Google provides about the purposes for processing is “imprecise and incomplete”, and at times contradictory. While the CNIL recognizes Google’s efforts in recent years to make its processing activities more transparent (e.g., through privacy tools such as “Privacy Check-UP” and “Dashboard”), it notes that these mechanisms are only provided at a later stage, when the user has already consented to the processing.
Lack of a Legal Basis
The CNIL is of the opinion that the consent obtained by Google does not meet the requirements for consent under the GDPR. Under the GDPR, consent must be “given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication” of the individual’s will. According to the CNIL, Google did not provide individuals with the sufficient, understandable and accessible information required to make an informed choice. In line with its earlier Vectaury decision, the CNIL also makes the point that Google does not ask for specific consent for each of its processing activities, but rather allows users, in the first instance, to either accept or refuse all processing activities. Only if users click on “more options” can they separately accept the individual purposes for processing data. The CNIL also points out that the consent boxes there are pre-ticked by default, which amounts to an “opt-out” rather than an “opt-in” mechanism.
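The CNIL’s two objections (bundled consent and pre-ticked boxes) map onto a simple validity check that consent-management implementations commonly apply: consent must be recorded per purpose, and a box ticked by default does not count as an affirmative act. The data model and purpose names below are hypothetical illustrations, not Google’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    purpose: str           # e.g. "ads_personalization" (illustrative name)
    ticked_by_default: bool
    user_ticked: bool      # did the user affirmatively tick the box?

def is_valid_consent(records: list[ConsentRecord]) -> bool:
    """Consent must be specific (one record per purpose) and given by a
    clear affirmative act -- a pre-ticked box fails this test."""
    return all(r.user_ticked and not r.ticked_by_default for r in records)

bundled = [ConsentRecord("ads_personalization", True, True)]    # pre-ticked
explicit = [ConsentRecord("ads_personalization", False, True)]  # user opted in
print(is_valid_consent(bundled), is_valid_consent(explicit))  # False True
```

Under this reading, an “accept all” screen with per-purpose controls hidden behind “more options” fails not because granular controls are absent, but because the default state already signals consent.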
The Governor of Massachusetts recently signed House Bill No. 4806 into law, which will amend certain provisions of the state’s data breach notification law. In addition to changing the information that must be included in notifications to regulators and individuals, the amendments will also require entities to provide eighteen months of free credit monitoring services following breaches involving Social Security numbers. The amendments, which will enter into force on April 11, 2019, are discussed in greater detail below.
On 10 January 2019, the European Court of Human Rights (ECtHR) ruled that the Republic of Azerbaijan violated Articles 8 and 10 of the European Convention on Human Rights (ECHR) by failing to adequately investigate claims by an Azerbaijani journalist that she had been the victim of political blackmail. The ECtHR’s ruling follows reports of rising concern in the Council of Europe about government mistreatment of journalists across Europe, and in Azerbaijan in particular.
On January 10, 2019, Advocate General Szpunar of the Court of Justice of the European Union (CJEU) released his opinion regarding a 2016 enforcement action carried out by the French Supervisory Authority (CNIL) against Google. In that case, the CNIL ordered Google to de-reference links to webpages containing personal data. According to the CNIL, the de-referencing had to be effective worldwide. Google challenged the CNIL’s decision before the French administrative court, which then referred this matter to the CJEU.
In his opinion, Advocate General Szpunar disagrees with the CNIL’s view on a worldwide application of the “right to be forgotten.” According to Szpunar, the EU Charter’s right to data protection must be balanced against other Charter rights, such as the right of access to information. These rights must be applied with a territorial link to the EU, and cannot be broadly interpreted to apply across the whole world. To that end, Szpunar emphasizes that EU regulators cannot reasonably be expected to make this balancing test for the entire world. Moreover, a worldwide application of the de-referencing obligation would send a “fatal signal” to third countries eager to limit access to information. It could lead to a race to the bottom at the expense of freedom of information in the EU and worldwide. This does not mean that EU data protection law can never have an extraterritorial dimension; in this case, however, it should not.
While a worldwide obligation to de-reference is not desirable, Szpunar does believe that Google should be required to make every effort to de-reference the relevant links across the EU (and not just in France). This includes using “geo-blocking” techniques, irrespective of the search engine domain used – i.e., a user of Google.com, Google.fr or Google.de should not see the relevant links if it can be established (for example, on the basis of the user’s IP address) that the user is in the EU.
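The geo-blocking logic Szpunar describes can be sketched in a few lines: filtering depends only on where the user is located, not on which domain they query. The country list below is abbreviated, and resolving a user’s IP address to a country code would in practice require a geolocation service not shown here.

```python
# EU member states, abbreviated for illustration (ISO 3166-1 alpha-2 codes)
EU_COUNTRIES = {"FR", "DE", "IE", "NL", "ES", "IT", "PL", "SE", "AT", "BE"}

def visible_links(all_links: list[str], dereferenced: set[str],
                  user_country: str) -> list[str]:
    """Hide de-referenced links for users located in the EU, regardless of
    which search domain (google.com, google.fr, ...) they used."""
    if user_country in EU_COUNTRIES:
        return [link for link in all_links if link not in dereferenced]
    return all_links

links = ["https://example.com/a", "https://example.com/b"]
removed = {"https://example.com/b"}
print(visible_links(links, removed, "FR"))  # EU user: link b hidden
print(visible_links(links, removed, "US"))  # non-EU user: all links shown
```

The key design point mirrors the opinion: the filter keys on the user’s location rather than the site’s top-level domain, so switching from Google.fr to Google.com gains an EU-based user nothing.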
The opinion of the Advocate General will now be considered by the CJEU, which is expected to render a decision in the coming months. The CJEU often follows the general analysis of the Advocate General.
On December 29, 2018, the Northern District of Illinois dismissed a case brought against Google under the Illinois Biometric Information Privacy Act, 740 ILCS 14/1 et seq. (“BIPA”) on standing grounds. Plaintiffs, Lindabeth Rivera and Joseph Weiss, alleged that Google violated BIPA by failing to obtain informed consent from users prior to collecting, storing, and utilizing their biometric information to create “face-geometry scans” from photos uploaded on Google Photos.
Starting next week, the California Department of Justice will hold six public forums on how the state should implement its landmark privacy law, the California Consumer Privacy Act (“CCPA”). Although California enacted the CCPA in June 2018, the state is still in the process of implementing the new legislation, and the public forums “will provide an initial opportunity for the public to participate in the CCPA rulemaking process,” California Attorney General Xavier Becerra announced in a December 19 press release.
Recent years have seen significant amounts of legislative activity related to state data breach notification laws, and 2018 was no exception. Not only did South Dakota and Alabama enact new data breach notification laws in 2018, becoming the last of 50 U.S. states to enact such laws, but other states also enacted changes to existing data breach notification laws during 2018 to expand their scope and implement additional notification requirements. Following up on our global year-end review of major privacy and cybersecurity developments, we’ve summarized the major developments and trends observed with regards to state data breach notification laws over the past year.
On 30 November 2018, the Austrian Data Protection Authority (“DPA”) decided that the website of an online media publisher – which offers users the option to either consent to advertising cookies or pay for a subscription – gives users a free choice that is compatible with the requirements of consent under the GDPR. (The decision is available in German here.)
Background. The Austrian publisher in question set up a functionality on its website whereby users are given the option to either: (i) consent to advertising cookies and receive full access to the website’s content; (ii) refuse consent and receive partial access to the website’s content; or (iii) pay for a subscription of 6 euros/month to receive full access to the website’s content and not be tracked by any advertising cookies, third-party scripts, or social media plug-ins (unless the user chooses to personally re-activate these features).
Complaint. The complainant argued that the website did not meet the requirements for voluntary consent under the GDPR because (i) the provision of the service was subject to the user’s consent to the processing of personal data, and (ii) the tracking of personal data was technically not necessary for the provision of the service, since the publisher also offered a paid version with no tracking. The complainant further argued that his right to oppose the tracking had been violated since, even after he refused to give consent, a non-essential cookie still operated and could not be disabled.
Austrian DPA’s Decision and Analysis. The Austrian DPA dismissed the complaint. In its decision, the DPA pointed out that media companies have relied on advertising as a source of revenue for decades, and in the context of online publishing this is often the only source of revenue. The DPA also took note of the fact that the publisher had developed a privacy-conscious product that offered a tracking-free subscription option for users. Notably, the DPA stated that:
“The requirement of voluntary consent could not lead to media companies having to provide their services free of charge, especially since online advertising without data-based control would not allow refinancing in the current market environment.”
The DPA further explained that involuntary consent occurs when a data subject is placed at a disadvantage. Referring to the Article 29 Working Party’s Guidelines on Consent, the DPA considered the criteria for “disadvantage” in this context, which may exist when there is a risk of deception, intimidation, coercion or significant negative consequences. The DPA found that the subscription option for 6 euros/month was not a disproportionately expensive alternative, and in any event, users are free to simply choose another online publisher. In the view of the DPA, neither of these possible outcomes constituted a “significant negative consequence.”
Finally, the DPA also addressed the complainant’s argument about the non-essential cookie script which continued to operate after consent was revoked. The DPA found this point was moot because the publisher had fixed this issue during the course of the DPA’s review of the case.
It has been a busy year for privacy and cybersecurity. Here is a look back at the highlights of 2018 and a preview of what 2019 may have in store in the United States, Europe, and China: