Earlier this month the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) released its Draft NISTIR 8267, Security Review of Consumer Home Internet of Things (IoT) Products, for public comment. NIST will accept public comments on the report through November 1, 2019.
On October 23, 2019, the European Commission (“Commission”) published its Report on the third annual review of the EU-U.S. Privacy Shield (“Privacy Shield”) (the Report is accompanied by a Staff Working Document). The Report “confirms that the U.S. continues to ensure an adequate level of protection for personal data transferred under the Privacy Shield” (see also the Commission’s Press Release). The Report welcomed a number of improvements following the second annual review, including efforts made by U.S. authorities to monitor compliance with the framework, as well as key appointments that have been made in the last year. The Commission in particular noted the appointment of Keith Krach to the position of Privacy Shield Ombudsperson on a permanent basis, filling a vacancy that had been noted in previous reviews. The Report also provided a number of recommendations for further improvement and monitoring.
Recognizing that, in its third year, Privacy Shield has “moved from the inception phase to a more operational phase,” the Report placed particular emphasis on the effectiveness of the “tools, mechanisms and procedures in practice.” Not only has the number of Privacy Shield certifications exceeded 5,000 companies — eclipsing in three years the number of companies that had registered under the Safe Harbor Framework in its nearly 15 years of existence — the Report also noted that “an increasing number of EU data subjects are making use of their rights under the Privacy Shield and that the relevant redress mechanisms function well.”
As with prior reviews, the Commission sought feedback from trade associations, NGOs, and certified companies, and addressed the functioning of (i) the framework’s commercial aspects, and (ii) U.S. authorities’ access to personal data.
On October 22, 2019, the Federal Trade Commission reached a proposed settlement with the developer of three so-called “stalking” apps that enabled purchasers to secretly monitor the mobile devices on which the apps were installed. Developer Retina-X Studios, LLC and its owner James N. Johns marketed the three apps—MobileSpy, PhoneSheriff, and TeenShield—as a means to monitor children and employees by sharing detailed information about these individuals’ smartphone activities, including their text messages and GPS locations. The FTC complaint alleges that the developer failed to ensure that the apps would be used for legitimate and lawful purposes, did not secure personal information collected from children and other users, and misrepresented the extent to which that information would be kept confidential.
While the FTC settlement represents its first case against developers of tracking apps, the complaint’s allegations rely on provisions of the FTC Act that are broadly applicable to companies that collect, store, and/or monitor users’ personal information, as well as the Children’s Online Privacy Protection Act (“COPPA”).
On October 17, Senator Ron Wyden introduced in the Senate a privacy bill that would expand the FTC’s authority to regulate data collection and use, allow consumers to opt out of data sharing, and create civil and criminal penalties for certain violations of the Act.
The Mind Your Own Business Act of 2019 is the latest iteration of Wyden’s discussion draft that he released last November. (We provided an overview of the draft bill here.) Although the two Wyden measures are largely similar, the new bill provides for additional enforcement mechanisms and levies taxes on companies whose executives violate reporting requirements.
On October 16, 2019, the body of German Supervisory Authorities known as the Datenschutzkonferenz (“DSK”) released a document proposing a model for calculating fines under the GDPR. The DSK indicated that this model is subject to change and will be superseded by any method put forward in guidance issued by the European Data Protection Board.
The document contains:
- a method to assign a value to the seriousness of an offense; and
- a method to calculate the amount of the fine in light of the seriousness of the offense.
Seriousness of the infringement. Based on the factors set out in Art. 83(2) of the GDPR, the DSK proposes to classify an offense as minor, medium, serious or very serious. The method assigns to each classification a range of values from which a Supervisory Authority can choose (for example, if an infringement is serious pursuant to Art. 83(5) or (6), the Supervisory Authority can assign a value of between 8 and 12). This number will then be used in step 4 of the calculation methodology described below.
Calculation of the fine. According to the DSK’s proposal, fines should then be calculated on the basis of the following 5 steps:
- a Supervisory Authority should start by reviewing the undertaking’s annual turnover in the preceding financial year to classify it according to its size as a micro (A), small (B), medium (C) or large (D) undertaking and assign it to a specific sub-group (for example, A.II covers micro undertakings with a turnover between €700,000 and €1.4 million);
- the Supervisory Authority should then determine the average annual turnover of the respective sub-group (in the above example, for an undertaking classified as A.II, the allocated average turnover would be € 1,050,000);
- then, the Supervisory Authority should divide the average annual turnover of the respective subgroup by 360 to determine the “basic economic value of the undertaking” (in the above example, the basic economic value is € 2,917);
- the “basic economic value” is then multiplied by the value of the seriousness of the infringement as described above;
- finally, the amount obtained through this multiplication is adjusted in light of “other circumstances not yet taken into account” (the DSK’s proposal is not more specific on this point).
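The arithmetic of steps two through four can be sketched in Python. A hedged note on the inputs: only the A.II sub-group figures (turnover between EUR 700,000 and EUR 1.4 million, average turnover of EUR 1,050,000) come from the proposal’s own example; the function name is illustrative, and the seriousness value of 10 is an arbitrary pick from the 8 to 12 range the proposal assigns to serious infringements under Art. 83(5) or (6).

```python
# A minimal sketch of the DSK's proposed fine calculation (steps 2-4).
# Step 1 (classifying the undertaking into a sub-group) and step 5
# (adjustment for "other circumstances not yet taken into account")
# are omitted; the proposal does not specify how step 5 works.

def dsk_fine(average_subgroup_turnover: float, seriousness_value: float) -> float:
    """Divide the sub-group's average annual turnover by 360 to obtain
    the 'basic economic value' of the undertaking (step 3), then
    multiply by the seriousness value chosen under Art. 83(2) (step 4)."""
    basic_economic_value = average_subgroup_turnover / 360
    return basic_economic_value * seriousness_value

# Worked example from the text: an A.II micro undertaking
# (average turnover EUR 1,050,000) with a "serious" infringement
# assigned an illustrative seriousness value of 10.
basic_value = 1_050_000 / 360
print(round(basic_value))              # 2917, the "basic economic value"
print(round(dsk_fine(1_050_000, 10)))  # 29167, before the step-5 adjustment
```

The division by 360 effectively converts annual turnover into a notional daily figure, which the seriousness multiplier then scales into the fine.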
Unfortunately, the DSK proposal does not address in detail the meaning of “undertaking” in Art. 83(4) and (5) when a company belongs to a corporate group and how the relevant annual turnover of an “undertaking” should be calculated. In this respect, the guidance refers to recital 150 and provides that “undertaking” has the meaning given to it under Articles 101 and 102 of the TFEU, i.e., “a functional meaning of undertaking”.
This is in line with what the DSK stated in separate guidance on GDPR sanctions (available here in German):
“(…) the DS-GVO provides a concept of undertaking that is broader than that of Art. 4(18) GDPR. The term “undertaking” in the context of enforcement proceedings is to be inferred from Recital 150 of the GDPR. According to this recital, the broad, functional concept of undertaking borrowed from antitrust law in accordance with Articles 101 and 102 of the Treaty on the Functioning of the European Union (TFEU) applies. The consequence of this is that parent companies and subsidiaries are regarded as an economic unit, so that the total turnover of the group of companies is taken as the basis for calculating the fine.”
The Council of EU Member States – one of the two main EU lawmaking bodies – recently released a new draft version of the ePrivacy Regulation (“EPR”). Negotiations on the regulation have been deadlocked for a while, but seem to be gathering new momentum under the Finnish Presidency. Below we highlight some selected topics that may be of interest to readers.
- Users will have to be reminded (probably every 12 months) of their right to withdraw their consent to the processing of electronic communications content or metadata, unless users request not to receive these reminders. This does not apply to consent for cookies or direct marketing by e-mail or SMS.
- Member States continue to reserve the right to implement data retention obligations, for example, for law enforcement purposes. This remains a controversial topic in light of past and pending CJEU case law.
- The consent requirements for cookies do not materially change, although the derogations are more clearly defined; they now include audience measurement and software updates, among others, under certain conditions. The draft makes clear that the consent must be a GDPR-standard consent, which is in line with the recent CJEU Planet49 decision, but the draft also explicitly indicates that consent can be obtained by “appropriate” technical settings of software.
- Recital 21 addresses the issue of cookie walls (i.e., subjecting a service to consent for cookies used for advertising purposes). The current draft suggests that this is indeed possible and that the required consent (users must “accept such use”) should not be considered an invalid (tied) consent under Art. 7(4) GDPR when the processing for advertising is “necessary” for the performance of the service. In other words, the acceptance is freely given. However, the tortured language of the recital demonstrates its political sensitivity; for example, the recital refers to “accept,” not “consent.”
- Direct marketing by e-mail or SMS for own products and services to existing customers would still be based on legitimate interest with a right to opt-out. However, Member States could set an expiration time on this, following which the relevant party would presumably have to seek an opt-in consent if it wants to continue sending advertising. This risks creating a patchwork of un-harmonized marketing rules across the EU, despite having an EU-wide regulation.
- Electronic communications metadata can be used for scientific research, without consent, under certain conditions. Interestingly, under the most recent version of the EPR, these conditions no longer require that the research be based on Union or Member State law (a contrario Art. 9(2)(j) GDPR). This is a welcome change, given that these laws do not exist in most cases.
On October 3, 2019, the United States and United Kingdom signed an agreement on cross-border law enforcement demands for data from service providers (“Agreement”). The Agreement is the first bilateral agreement to be entered under the Clarifying Lawful Overseas Use of Data (CLOUD) Act. It obligates each Party to remove barriers in their domestic laws so that U.S. and U.K. national security and law enforcement agencies may obtain certain electronic data directly from Communications Service Providers (“CSPs”) located in the jurisdiction of the other Party. The Agreement will go into effect 180 days after its transmission to Congress by the Attorney General, unless Congress disapproves by joint resolution.
On October 10, 2019, California Attorney General Xavier Becerra announced the release of proposed regulations implementing the California Consumer Privacy Act (CCPA).
A new ballot initiative would create the California Privacy Rights and Enforcement Act (“CPREA”) and would make several changes to the California Consumer Privacy Act (“CCPA”).
The CJEU’s Planet49 decision focuses on the second pre-ticked box used to obtain consent for cookies and, in particular, on whether it met the requirements for unambiguous and specific consent.
The CJEU decided that consent obtained using a pre-ticked box is not valid because it does not meet the requirement for an affirmative consent imposed by the ePrivacy Directive, the Data Protection Directive and, now, the GDPR. According to the CJEU, the use of a pre-ticked box makes it “practically impossible to clarify in an objective manner whether the user of a website has actually given his consent to the processing of his personal data (…),” and “[i]t cannot be ruled out that the user may not have read the information attached to the checkbox or that he may not have noticed this box before continuing his activity on the website he visited” (Para. 55).
On the specificity of the consent, the CJEU decided that the consent could not be obtained by actively clicking on the “participate” button, since from that action one cannot “assume that the user has given his effective consent to the storage of cookies” (Para. 59). This suggests that the CJEU would also consider implied consents (such as consents derived from a continued use of the service) to be unacceptable.
The CJEU expressly declined to decide on the “freely given” nature of the consent since this was not included in the questions submitted by the German Federal Court of Justice.
The CJEU was also asked to decide on whether the requirement to obtain consent for cookies applied only if these cookies were used to collect personal data. In this regard, the CJEU clarified that the requirement under the ePrivacy Directive to obtain consent applies “to ‘the storage of information’ and ‘access to information already stored’ without specifying that information or specifying that it must be personal data”. However, the CJEU noted that in the case at hand, the collected data was personal data because the cookies stored in the terminal equipment of a user assigned a number to each user which was linked to the registration data.
Finally, the court decided that, as part of the “comprehensive information” that must be provided to users, such users must be informed of the duration of the cookies and about whether third parties can access them. The court did not say that all the third parties must be individually identified.