Dutch Supervisory Authority Imposes GDPR Security Standard for Processing Broadly Defined Health Data

In early November, the Dutch Supervisory Authority published an injunction it had imposed in July against the public insurance body Uitvoeringsinstituut Werknemersverzekeringen (“UWV”).

The UWV allows employers to submit data about their employees for social security purposes.  This data includes the dates of employee absences due to general illness, as well as absences related to pregnancy, childbirth, and parental leave.  While the underlying illness itself is not disclosed, the Supervisory Authority held that the data must be qualified as health data because the mere fact that someone is ill is indicative of his or her health.

In addition, the Supervisory Authority held that the UWV violated the GDPR’s security standard by applying only single-factor authentication (e-mail address and password) on its portal.  According to the Authority, state-of-the-art security for a platform with this level of risk requires multi-factor authentication.  The Authority relies on Dutch guidelines for public authorities offering digital services and the Dutch NEN-7510 security standard for the health sector.
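
The mechanics of the UWV portal are not public, but the gap the Authority identified can be illustrated with a minimal, hypothetical sketch: a single-factor login checks only a password, while a second factor (here, an RFC 6238 time-based one-time code) additionally requires proof of possession of an enrolled device.  All names and parameters below are illustrative assumptions, not the UWV’s actual implementation.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def login(password, otp_code, *, stored_hash, salt, secret_b32, at=None):
    """Two-factor check: the password hash AND a fresh one-time code must both match."""
    password_ok = hmac.compare_digest(
        hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000), stored_hash)
    otp_ok = hmac.compare_digest(otp_code, totp(secret_b32, at=at))
    return password_ok and otp_ok
```

Dropping the `otp_ok` check reduces this to the single-factor scheme the Authority found inadequate; a stolen password alone would then grant access to health data.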

The UWV was ordered to conduct a new privacy impact assessment by October 1, 2018, and to implement appropriate security by October 31, 2019, with a penalty of €150,000 for each month of delay (up to a maximum of €900,000).  The long transition period for improving security is explained by delays in the roll-out of a standardized authentication tool for Dutch public bodies.

Illinois Supreme Court to Decide Statutory Standing Requirements Under the Illinois Biometric Information Privacy Act

On November 20, 2018, the Illinois Supreme Court heard oral arguments in Rosenbach v. Six Flags Entertainment Corporation et al., a case arising under the Illinois Biometric Information Privacy Act, 740 ILCS 14/1 et seq. (“BIPA”).  BIPA provides a private right of action for persons “aggrieved by a violation of [the] Act.”  The crux of the issue presented to the Illinois Supreme Court is the meaning of “aggrieved by” under BIPA–in other words, what harm is sufficient to satisfy statutory standing requirements underlying BIPA’s private right of action?


NTIA Publishes Stakeholder Comments on Consumer Privacy Proposal

Last week, the National Telecommunications and Information Administration (“NTIA”) released submissions it had received from the Federal Trade Commission (“FTC”) staff and many other parties on NTIA’s proposed framework for advancing consumer privacy while protecting innovation.  Although NTIA did not request comments on a possible federal privacy bill, most submissions took the opportunity to inform NTIA of what such a federal privacy bill should look like.


CNIL Imposes GDPR Consent Requirements in the Online Advertising Space

On November 9, 2018, the French Supervisory Authority for Data Protection (known as the “CNIL”) announced that it issued a formal warning (available here) ordering the company Vectaury to change its consent experience for customers and purge all data collected on the basis of invalid consent previously obtained.

Vectaury is an advertising network that buys online advertising space on behalf of its customers (advertisers).  The company also offers a software tool that advertisers can integrate into their apps to collect geolocation data and information on the device and browser of users.  The company analyses this data, compares it with certain geographic points of interest (e.g., physical stores) and creates profiles of users’ habits.  Based on these profiles, the company organizes targeted advertising campaigns on behalf of advertisers.  It also tracks users while they are in the physical stores of the advertisers in order to assess the effectiveness of advertising campaigns.

The consent mechanism offered by the apps provided a short notice explaining that the application collects the users’ browser history and geographic location for the purpose of targeted marketing.  It offered users three options: to accept, to refuse, or to customize their preferences.  According to the CNIL, the consent collected through the tool did not comply with three of the GDPR’s requirements for valid consent.

  • First, the CNIL found that the consent was not informed because the information provided was unclear, used complex terms, and was not easily accessible (particularly the list of the third-party entities receiving the data).

  • Second, the consent obtained at the time of the installation of the application was not sufficiently specific because it only gave users the option to consent or to refuse. Users were not asked to specifically consent to the processing of their geolocation data for targeted marketing purposes.

  • Third, the CNIL pointed out that the consent obtained through the tool was not based on an affirmative action. Users selecting “customize my preferences” were directed to a separate pop-up with pre-checked options.
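
The third finding — that a pre-checked box is not a “clear affirmative action” — can be captured in a small illustrative model.  This is a sketch of the legal rule, not any actual consent-management API: a purpose only counts as validly consented to if the user actively toggled the option, so a pre-checked default submitted untouched never qualifies.

```python
from dataclasses import dataclass

@dataclass
class ConsentChoice:
    purpose: str        # e.g., "geolocation for targeted marketing"
    granted: bool       # state of the checkbox when the form was submitted
    user_toggled: bool  # True only if the user actively changed this option

def valid_consents(choices):
    """Keep only purposes the user affirmatively opted into.

    A pre-checked box left untouched (granted=True, user_toggled=False)
    is not an affirmative action and therefore never counts."""
    return [c.purpose for c in choices if c.granted and c.user_toggled]
```

Under this model, Vectaury’s pre-checked pop-up would yield no valid consents at all for users who simply clicked through.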

During the CNIL’s investigation, Vectaury implemented the “Consent Management Platform” tool developed by the Interactive Advertising Bureau.  However, the CNIL found that the information provided and consent obtained by this tool also did not meet the requirements for consent set out by the GDPR.

This is yet another enforcement action by the CNIL against an online marketing company, and the high standard applied by the CNIL is something to be reckoned with.  Although Vectaury had a consent experience in place, allowed users to refuse consent, and even offered granular preferences, this was still not enough.  Interestingly, as on previous occasions, the CNIL does not appear to have investigated the advertisers that incorporated these tools into their apps.

Supreme Court to Hear Case Regarding Deference to Federal Agencies on Statutory Interpretation

Yesterday, the Supreme Court granted certiorari in Carlton & Harris Chiropractic, Inc. v. PDR Network, LLC, No. 17-1705.  The case began when Carlton & Harris sued PDR Network for alleged violations of the commercial fax provisions of the Telephone Consumer Protection Act (“TCPA”).  The Fourth Circuit ruled in Carlton & Harris’s favor, relying on an interpretation of the TCPA issued by the Federal Communications Commission (“FCC”).  The Supreme Court granted PDR Network’s petition to review that ruling, but limited its review to a single question: whether federal courts are bound to accept a federal agency’s interpretation of a statute such as the TCPA without considering the validity of that interpretation.  The case has important implications for administrative law that are not limited to the TCPA or to the FCC.

The facts in the case are typical of many TCPA fax cases.  Carlton & Harris, a chiropractic office, received a fax from PDR Network offering free copies of an e-book.  Carlton & Harris sued, alleging that the fax was an unsolicited advertisement transmitted without consent in violation of the TCPA.  PDR Network moved to dismiss the complaint, arguing that the fax could not constitute an unsolicited advertisement because it did not offer anything for sale.

In deciding whether the fax was an unsolicited advertisement, the district court considered whether it had to defer to a 2006 FCC rule in which the Commission interpreted the statutory term “unsolicited advertisement” to include faxes that promote free goods or services.  Carlton & Harris argued that the district court lacked jurisdiction to disagree with or set aside the FCC’s interpretation because of the Hobbs Act, 28 U.S.C. § 2342.  The Hobbs Act is a federal statute that vests “exclusive jurisdiction” with federal courts of appeal to determine the validity of rules and orders issued by the FCC, the Department of Agriculture, the Department of Transportation, and other federal agencies.  The district court held that the Hobbs Act did not prevent it from assessing the validity of the FCC’s 2006 TCPA rule, which the court determined was contrary to the TCPA’s unambiguous text, and thus did not defer to the FCC’s interpretation.  The district court then concluded that the defendant’s fax was not an unsolicited advertisement and dismissed the complaint.

Carlton & Harris appealed, and a divided Fourth Circuit panel reversed the district court’s dismissal order.  The Fourth Circuit held that the district court exceeded its jurisdiction because the Hobbs Act deprived the court of authority to review the FCC rule.  Instead, the majority held, the Hobbs Act requires district courts to defer to an agency’s interpretation (such as the FCC’s interpretation of the TCPA), regardless of whether that interpretation is faithful to the statutory text or suffers from other infirmities.  Under the Fourth Circuit’s view, if the defendant wanted to challenge the FCC’s 2006 TCPA rule, its only recourse was to seek judicial review before a circuit court within the 60-day period allotted for appeals under the Hobbs Act, 28 U.S.C. § 2344.  And because that time had lapsed, both the district court and the Fourth Circuit were bound by the FCC’s interpretation and lacked authority to independently interpret the statute.  After applying the FCC’s interpretation, the Fourth Circuit went on to conclude that PDR Network’s fax was an unsolicited advertisement.

The Supreme Court’s review is limited to the question of whether the Hobbs Act stripped the district court and the Fourth Circuit of authority to review the validity of the FCC’s 2006 TCPA rule, including whether that rule comports with the statutory text.  The separate question whether PDR Network’s fax was an unsolicited advertisement is not before the Court—although that issue could be reopened on remand if the Supreme Court rules in PDR Network’s favor.

Because the Hobbs Act applies to rules and orders issued by a broad range of federal agencies, the Supreme Court’s ruling in PDR Network will likely have a significant effect beyond the FCC and the TCPA.  Indeed, the Supreme Court’s decision to review PDR Network is the latest in a recent string of cases questioning whether courts should defer to legal interpretations by federal agencies; and several Justices have signaled that they plan to revisit the validity of major agency‑deference doctrines, including Chevron v. Natural Resources Defense Council, Inc., 467 U.S. 837 (1984) (granting deference to agencies’ statutory interpretations), and Auer v. Robbins, 519 U.S. 452 (1997) (granting deference to agencies’ interpretations of their own regulations).  The Court’s ruling in PDR Network therefore could open the door to more frequent challenges to agency rules and interpretations, particularly in the context of enforcement suits brought by private parties.

The Court is likely to hear oral argument in PDR Network in March 2019, and a ruling is expected by June 2019.

European Regulators Are Intensifying GDPR Enforcement

Earlier this year, in the run-up to the General Data Protection Regulation’s (“GDPR”) May 25, 2018 date of application, a major question for stakeholders was how zealously the GDPR would be enforced.  Now, as the GDPR approaches its six-month anniversary, an answer to that question is rapidly emerging: enforcement appears to be ramping up significantly.  In this post, we set out some of the most prominent regulatory enforcement developments so far, but bear in mind that other investigations are also underway.

  • In late October, the UK Supervisory Authority (“ICO”) served an enforcement notice on AggregateIQ Data Services (“AIQ”) requiring the erasure of all personal data held by AIQ relating to UK individuals.  The notice also specified that if AIQ did not comply, the ICO could impose the maximum GDPR penalty of €20 million or 4% of AIQ’s total annual worldwide turnover, whichever is higher.
  • In early October, the Irish Supervisory Authority (“DPC”) announced an investigation into Facebook for a potential data breach.
  • In mid-September, the Austrian Supervisory Authority (“DSB”) issued a fine of €4,800 under the GDPR to an entrepreneur who reportedly installed a CCTV camera that recorded a significant portion of public pavement beyond their business premises.  The DSB also recently disclosed that over 100 fine proceedings were underway, and that it had received over 700 complaints (the first, we understand, was received from the well-known privacy activist, Max Schrems).
  • In mid-July, the Portuguese Supervisory Authority (“CNPD”) fined a hospital €400,000 for breaching the GDPR, reportedly for failing to prevent hospital staff from using false profiles to access patient data.  (We provide more information on this fine, which may be under appeal, on our blog here.)
  • In mid-July, the Italian Supervisory Authority (“Garante”) served an enforcement notice on two companies (Faiella Nicola Srl and Visirun SpA) in relation to location monitoring systems used in company vehicles.  In particular, the Garante required the companies to take further steps to ensure compliance with the GDPR, including that they (i) provide monitored employees with an option to deactivate the monitoring system during break periods and outside working hours; (ii) affix an informational sticker to the window of each vehicle equipped with the system; and (iii) ensure that only a small number of employees are permitted to access the relevant location data.
  • In late June, the French Supervisory Authority (“CNIL”) issued warnings to two companies, Teemo and Fidzup, for issues connected with their provision of platforms to mobile apps that enabled targeted advertising through the use of location data.  (We provided more in-depth information on this development on our blog here.)

The CNIL Publishes Report On Blockchain and the GDPR

On November 6, 2018, the French data protection authority (the “CNIL”) published a report that discusses some of the questions raised by the use of blockchain technology and perceived tensions between it and foundational principles found in the General Data Protection Regulation (the “GDPR”).  As we noted in an earlier blog post on this topic, some pundits have claimed that certain features of blockchain technology, such as its reliance upon a de-centralised network and an immutable ledger, pose GDPR compliance challenges.  The CNIL has attempted to address some of these concerns, at least in a tentative manner, and further guidance from EU privacy regulators can be expected in due course.

De-centralised network

The CNIL acknowledges that EU data protection principles have been designed “in a world in which data management is centralised,” and where there is a clear controller of the data (“data controller”) and defined third parties who merely process the data (“data processors”).  Applying these concepts to a de-centralised network such as blockchain, where there are a multitude of actors, leads to a “more complex definition of their role.”  In brief, EU data privacy rules are the square peg to blockchain’s round hole.

Notwithstanding this, the CNIL considers that participants on a blockchain network, who have the ability to write on the chain and send data to be validated on the network, must be considered data controllers.  This is the case, for instance, where the participant is registering personal data on the blockchain and it is related to a professional or commercial activity.  By contrast, according to the CNIL, the miners, who validate the transactions on the blockchain network, can in certain cases be acting as data processors.  As a consequence, data processing agreements would need to be in place between the data controllers and the data processors on any blockchain network.

The CNIL further considers that where multiple participants decide to carry out processing activities via a blockchain network, they will most likely be considered “joint controllers,” unless they identify and designate their roles and responsibilities in advance.  Individuals who use the blockchain for personal purposes (e.g., individuals who access the network to buy and sell a virtual currency), however, would not be data controllers, as they can rely on the “purely personal or household activity” exception.

Wyden Releases Draft Privacy Bill Increasing FTC Authority, Providing for Civil Fines and Criminal Penalties

Senator Ron Wyden last week released a discussion draft of a federal privacy bill that would amend Section 5 of the Federal Trade Commission Act to expand the FTC’s authority, create significant civil fines, and enforce certain provisions through criminal penalties.

The draft Consumer Data Protection Act is among a growing number of proposals for federal privacy legislation in the United States.  (See our related coverage here and here.)  These federal proposals follow on the EU’s enactment of the General Data Protection Regulation (“GDPR”), which took effect in May, and the June enactment of the California Consumer Privacy Act (“CCPA”).  The Wyden measure has not yet been introduced in the Senate.

Below we highlight key aspects of the draft legislation.


Canadian Privacy Commissioner Releases Official Guidance as Data Breach Law Takes Effect

Canada’s new data breach requirements under the Personal Information Protection and Electronic Documents Act (“PIPEDA”) took effect on November 1. Official guidance released by the country’s Privacy Commissioner explains several of the law’s key provisions that will affect organizations: breach reporting and notification obligations, their triggers, and record retention.

Reporting & Notification Obligations

Under the new law, an organization must report a data breach involving personal information under its control, and notify affected individuals, if it reasonably determines that the breach creates a “real risk of significant harm” to an individual, regardless of the number of individuals affected. (The guidance states that a covered breach affecting only one individual would nonetheless require reporting and notification.) Importantly, it is the organization that controls the data that must report and notify individuals of the breach; the guidance clarifies that even when an organization has transferred data to a third-party processor, the organization remains ultimately responsible for reporting and notification. The guidance encourages organizations to mitigate their risk in the event their third-party processor suffers a breach by entering into sufficient contractual arrangements.
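
The reporting trigger described above can be reduced to a toy predicate. This is a sketch of the guidance’s logic, not an official test, and the “real risk of significant harm” judgment itself is a qualitative one that turns on factors such as the sensitivity of the information and the probability of misuse:

```python
def must_report_and_notify(data_under_org_control: bool,
                           real_risk_of_significant_harm: bool,
                           affected_individuals: int) -> bool:
    """PIPEDA breach-reporting trigger, per the Commissioner's guidance:
    the organization controlling the data reports and notifies whenever a
    breach creates a real risk of significant harm.  The threshold is the
    risk, not the headcount: a single affected individual suffices."""
    return (data_under_org_control
            and real_risk_of_significant_harm
            and affected_individuals >= 1)
```

Note that the predicate evaluates the controlling organization’s duty even if the breach physically occurred at its third-party processor.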

Notification to individuals must be given “as soon as feasible” after the organization has determined that a covered breach has occurred. The guidance states that the notification must be conspicuous, understandable, and, in most circumstances, given directly to the individual. It must include enough information to communicate the significance of the breach and allow those affected to take any steps possible to reduce their risk of harm. The regulations further specify the information a notification must include. In certain circumstances, organizations are also required to notify governmental institutions or other organizations of a covered breach; for example, an organization may be required to notify law enforcement if it believes doing so may reduce the risk of harm.


NIST Begins Developing a Voluntary Online Privacy Framework

The Department of Commerce’s National Institute of Standards and Technology (“NIST”) announced in early September its intention to create a Privacy Framework.  This Privacy Framework would provide voluntary guidelines that assist organizations in managing privacy risks.  The NIST announcement recognized that the Privacy Framework is timely because disruptive technologies, such as artificial intelligence and the internet of things, not only enhance convenience, growth, and productivity, but also require more complex networking environments and massive amounts of data.

