Dutch Supervisory Authority releases guidance on the interaction between the GDPR and PSD2

On October 18, 2018, the Dutch Supervisory Authority for data protection adopted guidance on the second Payment Services Directive (“PSD2”).  The PSD2 intends to open the financial services market to a wider range of innovative online services.  To that effect, it sets out rules for obtaining access to the financial information of bank customers.  Among other things, it provides that in most cases service providers’ access to this personal data is subject to consent.

The Supervisory Authority points out that the required consent is an additional protection imposed by the PSD2.  It is not a legal basis for the processing of personal data under the General Data Protection Regulation (“GDPR”).  In fact, under the GDPR the processing should not be based on consent, but rather on an alternative legal basis – namely, the performance of a contract.  Interestingly, while the regulator acknowledges that PSD2 consent is not GDPR consent, it applies the same standard to both.  As a result, according to the authority, the consent must be obtained separately from the main agreement (for example, in the form of a pop-up consent request), and customers must be able to withdraw their consent at any time.  Such a withdrawal would likely result in the termination of the agreement, since the provider would be unable to process any new data thereafter.

Italian court decides that a data protection officer does not have to be a certified ISO 27001 Auditor

On September 5, 2018, a first instance Administrative Court in Italy decided that a public company cannot reject an application for the position of data protection officer (“DPO”) on the basis that the applicant is not a certified ISO 27001 Auditor / Lead Auditor (decision available here).

ISO 27001 is an international information security standard. The standard sets out conditions that an individual must meet to become a certified ISO 27001 Auditor / Lead Auditor, such as attending dedicated courses and passing an exam.

The court noted the DPO requirements set out in the General Data Protection Regulation (“GDPR”), in particular that the “data protection officer shall be designated on the basis of professional qualities and, in particular, expert knowledge of data protection law and practices and the ability to fulfill [its] tasks” (Article 37(5) GDPR).

The court held that an ISO 27001 Auditor / Lead Auditor certification “does not (or does not fully) capture the specific qualities inherent to the task [of DPO], whose main function is not (…) to increase the levels of efficiency and security in the information management, but rather, the ability to safeguard the fundamental right of the individual to the protection of personal data (…)”.  The lack of this certification does not mean that an applicant cannot adequately fulfill the role of a DPO.

China Releases New Regulation on Cybersecurity Inspection

On September 30, 2018, China’s Ministry of Public Security (“MPS”) released the Regulation on the Internet Security Supervision and Inspection by Public Security Organs (the “Regulation”;《公安机关互联网安全监督检查规定》), which will take effect on November 1, 2018.

The Implications of the GDPR on Clinical Trials in Europe

On October 23, 2018, the European Federation of Pharmaceutical Industries and Associations, in cooperation with the Future of Privacy Forum and the Center for Information Policy Leadership, will organize a workshop entitled “Can GDPR Work for Health Research.”  The first session of the workshop will discuss the implications of the General Data Protection Regulation (“GDPR”) on clinical trials in the EU.  The second session is devoted to the further use of health data for scientific research.  Among other things, this session will discuss the relationship between the Clinical Trials Regulation (“CTR”) and the GDPR.

The CTR appears to subject further use of clinical trial data (i.e., any use outside the protocol) to consent.  In a note available here, we point out that such a reading is overly restrictive.  At the very least, the derogations in the GDPR for the use of health data for scientific research without consent should continue to apply.

IoT Update: The UK publishes a final version of its Code of Practice for Consumer IoT Security

By Grace Kim and Siobhan Kahmann

Following an informal consultation earlier this year – as covered by our previous IoT Update here – the UK’s Department for Digital, Culture, Media and Sport (“DCMS”) published the final version of its Code of Practice for Consumer IoT Security (“Code”) on October 14, 2018. The Code was developed by the DCMS in conjunction with the National Cyber Security Centre, and follows engagement with industry, consumer associations, and academia. Its aim is to provide all organizations involved in developing, manufacturing, and retailing consumer Internet of Things (“IoT”) products with guidelines on how to achieve a “secure by design” approach. Each of the thirteen guidelines is marked as primarily applying to one or more of the following categories: device manufacturers, IoT service providers, mobile application developers, and/or retailers.

The Code brings together what is widely considered good practice in IoT security. At the moment, participation in the Code is voluntary, but it aims to initiate and facilitate security improvements throughout the entire supply chain, as well as compliance with applicable data protection laws. The Code is supported by a supplementary mapping document and an open data JSON file that refer to the other main industry standards, recommendations, and guidance.  Ultimately, the Government’s ambition is for appropriate aspects of the Code to become legally enforceable, and it has commenced a mapping exercise to identify the impact of regulatory intervention and any necessary changes.

Senate Discusses a Federal Privacy Law with Privacy Experts: Examining Lessons From the European Union’s General Data Protection Regulation and the California Consumer Privacy Act

On October 10, the Senate Committee on Commerce, Science, and Transportation held its second hearing on data privacy, inviting advocates and experts to discuss a federal privacy law. The panelists included Andrea Jelinek, director of the European Data Protection Board; Alastair Mactaggart, chair of Californians for Consumer Privacy; Laura Moy, executive director of the Georgetown Law Center on Privacy and Technology; and Nuala O’Connor, president of the Center for Democracy and Technology. Consistent with the previous hearing on data privacy, the discussion focused on two issues: (1) potential components of a federal privacy bill, particularly data breach notification, preemption of state law, and the scope of consumer rights; and (2) enforcement authority under a new federal privacy regime.

First, the witnesses generally agreed on the main components to be included in a new federal privacy law.  The witnesses expressed the need for stronger data breach requirements, which was met with enthusiasm from Senators Hassan (D-NH) and Klobuchar (D-MN). Senator Klobuchar asked the witnesses how they would view a 72-hour notification requirement like the one in her proposed bill, the Social Media Privacy Protection and Consumer Rights Act of 2018 (also discussed in a previous Inside Privacy post), and the witnesses generally expressed agreement. Dr. Jelinek added that the General Data Protection Regulation (“GDPR”) requires companies to keep data only as long as it is needed, a requirement that could result in less data being at risk in the event of a breach. Professor Moy noted that the current U.S. regime misaligns data retention incentives because companies have strong financial motivations to keep data as long as possible. She noted that clear rules and effective enforcement are essential to limit the amount of data that can be compromised.

The witnesses generally agreed that a federal privacy law should not be weaker than state privacy laws. Mr. Mactaggart stressed that a federal law must be at least as protective as the California Consumer Privacy Act (“CCPA”). He emphasized that a federal law should be a “floor,” not a “ceiling,” meaning that states could institute additional privacy requirements above those required by the federal law. Ms. O’Connor stressed that a “patchwork of state laws” and a sectoral approach to protecting data based on its type (health data, financial data, children’s data) may have made sense a decade ago, but now leave a significant amount of personal information unprotected.

The witnesses also generally agreed on consumer rights related to data. Ms. O’Connor stated in her written testimony that a new federal privacy law should limit the collection and processing of certain types of data – such as precise location information, biometric information, healthcare information, and children’s information – to uses germane to the service requested by the user. Further, both Ms. O’Connor and Professor Moy emphasized that a new law should prohibit discrimination using data. As Mr. Mactaggart clarified in response to Senators’ questions, a non-discrimination provision would not prevent consumer loyalty programs; under the CCPA, however, any price differential between consumers who allow a company to collect their data and those who do not must not be coercive.

Second, the hearing discussion focused on the need for meaningful, effective enforcement. Ms. O’Connor and Professor Moy stressed the need for stronger enforcement in response to questions from Senators Markey, Klobuchar, and Schatz. They both recommended that the FTC be vested with greater authority, including rulemaking power and the ability to levy monetary fines. To support this recommendation, they explained that rulemaking power allows the FTC to remain agile as technology changes and new rules need to be developed. As Professor Moy phrased it, meaningful fines elevate privacy and data security issues to a position of importance in company strategy. In addition, Ms. O’Connor and Professor Moy both stressed that state attorneys general should also be given the power to enforce the federal privacy law. Not only can state attorneys general enforce smaller violations that do not necessarily rise to the attention of a national enforcer like the FTC, but they have also been successful at working to help businesses and communities understand their obligations, Professor Moy stated.

This hearing is expected to be one of an ongoing series of hearings on data privacy hosted by the Senate Committee on Commerce, Science, and Transportation.


New Jersey District Judge Dismisses All Counts Against Smart TVs

On September 26, 2018, New Jersey federal district judge Madeline Cox Arleo dismissed in its entirety an eight-count class action complaint against three smart TV makers: Samsung, LG, and Sony.  The plaintiffs alleged that the defendants’ smart TVs continuously monitored and tracked their viewing habits, recorded their voices, and transmitted that information to the defendants’ servers, after which it was shared with third-party advertisers and content providers.  The judge dismissed all counts:

Federal Law Claims: Plaintiffs made two federal law claims: one under the Video Privacy Protection Act (“VPPA”) and one under the Wiretap Act (which is part of the Electronic Communications Privacy Act, or “ECPA”).

  • VPPA: Under the VPPA, plaintiffs must allege that a Video Tape Service Provider (“VTSP”) “knowingly disclosed” “personally identifiable information” (“PII”) concerning a consumer of such provider.  The statute defines “PII” as “information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider,” and the Third Circuit construes the VPPA as prohibiting “disclosures of information that would, with little or no extra effort, permit an ordinary recipient to identify a particular person’s video-watching habits.”  See In re Nickelodeon Consumer Privacy Litigation (3d Cir. 2016).  Plaintiffs alleged that Defendants disclosed “extensive information about plaintiffs’ and consumers’ digital identities, namely, consumers’ video-viewing history, consumers’ computer addresses, and information about other devices connected to the same Wi-Fi network.”  The court held that, under the In re Nickelodeon standard, plaintiffs failed to allege how an “ordinary recipient” of the data at issue could use it to “identify a particular person” “with little or no extra effort.”
  • Wiretap Act: The Wiretap Act prohibits the “interception” of electronic communications, but it also provides that it is not unlawful for a person to intercept an electronic communication where that person is a party to the communication.  Plaintiffs alleged that defendants violated the Wiretap Act by intercepting electronic communications – specifically, communications that the defendants’ smart TVs transmitted to plaintiffs, and communications that plaintiffs sent to defendants’ servers.  Defendants argued that they had not violated the Wiretap Act because, among other reasons, they were parties to the alleged communications.  The court agreed with the defendants, finding that plaintiffs’ focus on whether defendants took plaintiffs’ and consumers’ “identifying information” in real time could not overcome the fact that any communications to the smart TV manufacturers would not violate the Wiretap Act.

Other Claims

Plaintiffs also alleged four contract-based claims and two fraud-based claims:

  • Contract-based claims: Plaintiffs’ contract-based claims were for (1) breach of contract, (2) breach of duty of good faith and fair dealing, (3) breach of express warranty, and (4) unjust enrichment.  Defendants argued that the first three claims failed because plaintiffs did not identify any actual contract or specific affirmation, promise, or guarantee made to them by the smart TV manufacturers.  In addition, defendants argued that plaintiffs failed to identify a loss sustained by the plaintiffs or a benefit received by defendants, and therefore failed to state a claim for unjust enrichment.  The court agreed and dismissed all four claims.
  • Fraud-based claims: Plaintiffs’ two fraud-based claims (unfair and deceptive tracking and transmission, and deceptive omissions) were brought under New Jersey’s Consumer Fraud Act.  The plaintiffs, however, were from New York and Florida, respectively, and the only connection they alleged between their claims and New Jersey was the defendant smart TV manufacturers’ allegedly “super-massive” presence in the state.  Because the Third Circuit has consistently maintained that a non-resident plaintiff cannot bring a Consumer Fraud Act claim where the sole connection to New Jersey is the defendants’ location, the court dismissed both fraud claims.

Covington represented Samsung in this case (White, et al. v. Samsung Electronics America, Inc., et al.).

FCC Seeks Comment on Ninth Circuit’s Expansive TCPA Interpretation in Marks

Yesterday, the FCC released a Public Notice seeking comment on a recent decision issued by the U.S. Court of Appeals for the Ninth Circuit in Marks v. Crunch San Diego, LLC, No. 14-56834 (Sept. 20, 2018).  The Public Notice, issued in the context of the FCC’s Telephone Consumer Protection Act (TCPA) reform proceeding, seeks comment on how the FCC should interpret the phrase “automatic telephone dialing system” (ATDS) as that term is used in the TCPA.  In seeking comment, the FCC noted the tension between Marks and the interpretation of that same statutory provision by the U.S. Court of Appeals for the D.C. Circuit in ACA Int’l v. FCC, 885 F.3d 687 (2018).  We previously discussed the ACA Int’l decision here.

In Marks, the Ninth Circuit examined the TCPA’s definition of an ATDS, which is defined in the statute as equipment that has the capacity “to store or produce telephone numbers to be called, using a random or sequential number generator.”  The court found that whether the clause “using a random or sequential number generator” applies to both storing and producing telephone numbers to be called is ambiguous, and it concluded that this clause applies only to “producing” telephone numbers to be called.  The Ninth Circuit therefore concluded that the definition of an ATDS includes equipment that has the capacity to automatically dial stored numbers—regardless of whether a random or sequential number generator is used.

The Ninth Circuit’s decision in Marks can be viewed as conflicting with the D.C. Circuit’s conclusion in ACA Int’l.  In that case, the D.C. Circuit vacated the FCC’s 2015 interpretation of the definition of an ATDS (which was similar to the Ninth Circuit’s) as unreasonably broad.

Comments responding to the FCC’s public notice are due October 17, 2018, with reply comments due October 24, 2018.

IoT and AI Update: California Legislature Passes Bills on Internet of Things, Artificial Intelligence, and Chatbots

The California legislature recently passed three bills meant to address rapidly developing technologies: the Internet of Things, artificial intelligence (AI), and chatbots.

Internet of Things. At the end of August, California became the first state to enact legislation requiring security features for Internet-connected devices. Senate Bill 327 requires that a manufacturer of a connected device equip the device with “reasonable security features” that are (1) appropriate to the nature and function of the device; (2) appropriate to the information it may collect, contain, or transmit; and (3) designed to protect the device and any information contained therein from unauthorized access, destruction, use, modification, or disclosure.

NTIA Requests Comments Regarding Federal Approach to Consumer Privacy

Last week, the National Telecommunications and Information Administration (NTIA) published a request for comments on how it should approach consumer privacy policy.  NTIA noted that federal action is needed because a growing number of countries and U.S. states have adopted distinct policy approaches with respect to consumer privacy, which risks a fragmented regulatory regime that will harm innovation.
