European Union

On July 17, 2020, the High-Level Expert Group on Artificial Intelligence set up by the European Commission (“AI HLEG”) published The Assessment List for Trustworthy Artificial Intelligence (“Assessment List”). The purpose of the Assessment List is to help companies identify the risks of AI systems they develop, deploy or procure, and implement appropriate measures to mitigate those risks.

The Assessment List is not mandatory, and there is not yet a self-certification scheme or other formal framework built around it that would enable companies to signal their adherence to it.  The AI HLEG notes that the Assessment List should be used flexibly: organizations can add or ignore elements as they see fit, taking into account the sector in which they operate.  As we discussed in our previous blog post here, the European Commission is currently developing policies and legislative proposals relating to trustworthy AI, and the Assessment List may influence the Commission’s thinking on how organizations should operationalize requirements in this area.
Continue Reading AI Update: EU High-Level Working Group Publishes Self Assessment for Trustworthy AI

On April 21, 2020, the Regulation on the Requirements and Reimbursement Process for Digital Health Applications (Digitale Gesundheitsanwendungen-Verordnung or “DiGAV”, available here) entered into force in Germany.  Among other provisions, the DiGAV includes specific IT security and privacy requirements.  Shortly after the law took effect, Germany’s Federal Medicines and Medical Devices Agency (“BfArM”) also released an extensive explanatory Guidance (Leitfaden, available here) to the DiGAV.

Independently, on April 15, 2020, the German Federal Office for IT Security (“BSI”) published a draft version of its guidance on “Security Requirements for Digital Health Applications” (BSI TR-03161) (available here).  The BSI is now seeking feedback from industry on this draft guidance before releasing a final version.

While the scope of application of the DiGAV and the BSI draft guidance may be limited, the documents can serve to provide useful insights and benchmarks for health applications generally.
Continue Reading German Federal Agencies Publish Privacy and IT Security Requirements for Digital Health Applications

On July 24, 2019, the European Parliament published a study entitled “Blockchain and the General Data Protection Regulation: Can distributed ledgers be squared with European data protection law?”  The study explores the tension between blockchain technology and compliance with the General Data Protection Regulation (the “GDPR”), the EU’s data protection law.  The study also explores how blockchain technology can be used as a tool to assist with GDPR compliance.  Finally, it recommends the adoption of certain policies to address the tension between blockchain and the GDPR, to ensure that “innovation is not stifled and remains responsible”.  This blog post highlights some of the key findings in the study and provides a summary of the recommended policy options.
Continue Reading European Parliament Publishes Study on Blockchain and the GDPR

On June 28, 2019, the French Supervisory Authority (CNIL) announced that it will issue new guidelines on the use of cookies for direct marketing purposes.  The guidelines will be issued in two phases.

First, during July 2019, the CNIL will update its guidance issued in 2013 on cookies.  According to the CNIL, the 2013 guidance

On November 9, 2018, the French Supervisory Authority for Data Protection (known as the “CNIL”) announced that it issued a formal warning (available here) ordering the company Vectaury to change its consent experience for customers and to purge all data previously collected on the basis of invalid consent.

Vectaury is an advertising network

On November 6, 2018, the French data protection authority (the “CNIL”) published a report that discusses some of the questions raised by the use of blockchain technology and perceived tensions between it and foundational principles found in the General Data Protection Regulation (the “GDPR”).  As we noted in an earlier blog post on this topic, some pundits have claimed that certain features of blockchain technology, such as its reliance upon a de-centralised network and an immutable ledger, pose GDPR compliance challenges.  The CNIL has attempted to address some of these concerns, at least in a tentative manner, and further guidance from EU privacy regulators can be expected in due course.

De-centralised network

The CNIL acknowledges that EU data protection principles have been designed “in a world in which data management is centralised,” and where there is a clear controller of the data (“data controller”) and defined third parties who merely process the data (“data processors”).  Applying these concepts to a de-centralised network such as blockchain, where there are a multitude of actors, leads to a “more complex definition of their role.”  In brief, EU data privacy rules are the square peg to blockchain’s round hole.

Notwithstanding this, the CNIL considers that participants on a blockchain network, who have the ability to write on the chain and send data to be validated on the network, must be considered data controllers.  This is the case, for instance, where the participant is registering personal data on the blockchain and it is related to a professional or commercial activity.  By contrast, according to the CNIL, the miners, who validate the transactions on the blockchain network, can in certain cases be acting as data processors.  As a consequence, data processing agreements would need to be in place between the data controllers and the data processors on any blockchain network.

The CNIL further considers that where there are multiple participants who decide to carry out processing activities via a blockchain network, they will most likely be considered “joint controllers,” unless they identify and designate their roles and responsibilities in advance.  Individuals who use the blockchain for personal use (e.g., individuals who access the network to buy and sell a virtual currency), however, would not be data controllers, as they can rely on the “purely personal or household activity” exception.
Continue Reading The CNIL Publishes Report On Blockchain and the GDPR

On October 23, 2018, the European Federation of Pharmaceutical Industries in cooperation with the Future of Privacy Forum and the Center for Information Policy Leadership will organize a workshop entitled, “Can GDPR Work for Health Research.”  In the first session, the workshop will discuss the implications of the General Data Protection Regulation (“GDPR”) on clinical

The European Commission has today published its Report on the first annual review of the EU-U.S. Privacy Shield (the Report is accompanied by a Staff Working Document, Infographic, and Q&A).  The Commission concludes that Privacy Shield continues to ensure an adequate level of protection for personal data transferred from the EU to Privacy Shield-certified companies in the United States.  With its conclusion, the Commission also makes a number of recommendations to further improve the Privacy Shield framework.  The Report follows a joint press statement by the U.S. Secretary of Commerce and EU Commissioner Jourová on September 21, 2017, closing the review and reaffirming that the “United States and the European Union share an interest in the [Privacy Shield] Framework’s success and remain committed to continued collaboration to ensure it functions as intended.”

Background

The EU-U.S. Privacy Shield is a framework that effects the lawful transfer of personal data from the EEA to Privacy Shield-certified companies in the U.S.  The Privacy Shield framework was unveiled by the EU and United States on July 12, 2016, and became operational on August 1, 2016.  To date, over 2,400 companies (including more than 100 EU-based companies) have certified, with 400 applications under review.

The Privacy Shield provides for an annual review and evaluation procedure intended to regularly verify that the findings of the Commission’s adequacy decision are still factually and legally justified.  Under the Privacy Shield, an “Annual Joint Review” is conducted by the U.S. Department of Commerce and the European Commission, with participation by the FTC, EU data protection authorities and representatives of the Article 29 Working Party, and “other departments and agencies involved in the implementation of the Privacy Shield,” including the U.S. Intelligence Community and the Privacy Shield Ombudsperson for matters pertaining to national security.  In preparation for the Review, the Commission also sought feedback from a number of trade associations, NGOs, and certified companies.  (See our earlier posts on the purpose of the first annual review here and here.)
Continue Reading EU Commission Concludes Privacy Shield “Adequate” in first Annual Review

On October 3, 2017, the Irish High Court referred Data Protection Commissioner v Facebook Ireland Limited [2016 No. 4809 P.] to the Court of Justice of the European Union (“CJEU”).  The case, commonly referred to as Schrems II, is based on a complaint by Max Schrems concerning the transfer of personal data by Facebook, from Ireland to the United States, using the EU Standard Contract Clauses (“SCCs”).

Background

The SCCs are a European Commission-approved mechanism to legally effect the transfer of personal data from the EEA to third (non-EEA) countries.  The SCCs provide for a contractual arrangement between an EEA-based data exporter and a non-EEA-based data importer of personal data, under which the data importer agrees to abide by EU privacy standards.
Continue Reading Validity of EU Standard Contractual Clauses Referred to CJEU