China’s Ministry of Public Security Issues New Personal Information Protection Guideline

On April 19, 2019, China’s Ministry of Public Security (“MPS”) released the final version of its Guideline for Internet Personal Information Security Protection (互联网个人信息安全保护指南) (the “Guideline”).  A previous version of the Guideline was released for public comments on November 30, 2018.

Under China’s Cybersecurity Law (the “CSL”), MPS is the key regulator tasked with protecting cybersecurity and combating cybercrime.  Following the issuance of the draft Regulations on Cybersecurity Multi-level Protection Scheme (the “Draft MLPS Regulation”, discussed in our previous post available here) and the Regulation on the Internet Security Supervision and Inspection by Public Security Agencies (also discussed in a previous post, available here) last year, the release of this new Guideline represents the latest effort by MPS to implement the CSL.

The stated goal of the Guideline is to “protect cybersecurity and individuals’ legitimate interests” and to “effectively prevent cybercrime involving personal information.”  Although not issued as a legally binding administrative regulation, this Guideline sets out the best practices recommended by MPS and will likely serve as an important reference for cybersecurity inspections that will be carried out by the agency and its local counterparts (i.e., local public security bureaus, “PSBs”).

To a large extent, this Guideline overlaps with China’s national standard on personal information protection, GB/T 35273-2017 Information Security Technology – Personal Information Security Specification (the “Standard”), which took effect on May 1, 2018.  The Guideline refers to the Standard as its “indispensable” reference, although at this stage, it is unclear how this Guideline will interact with other existing regulations and national standards.  Furthermore, this new Guideline provides more prescriptive requirements relating to a company’s cybersecurity infrastructure, both in terms of organizational support and the technical measures to be implemented.

This post summarizes key requirements of the Guideline.


EDPB Begins Consultation on New Guidelines on Use of the “Performance of a Contract” GDPR Legal Basis by Online Services

On 9 April 2019, the European Data Protection Board (“EDPB”) adopted new guidelines “on the processing of personal data under Article 6(1)(b) GDPR in the context of the provision of online services to data subjects.”

In general, the GDPR requires that processing of personal data be justified under a legal basis in Article 6 GDPR.  One such legal basis is Article 6(1)(b), which covers data processing that is “necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract.”  The new EDPB guidelines consider the meaning of this basis, and in particular whether it can be used as the basis for data processing by online services for purposes such as service improvement, fraud prevention, targeted advertising, and service personalization.

In particular, the guidelines clarify the EDPB’s view that:

  • Targeted advertising, even when it “supports” an online service by funding that service, is “separate from the objective purpose of the contract between the user and the service provider,” and therefore is “not necessary for the performance of the contract at issue.”
  • Service improvement through the collection of usage information, telemetry, and user engagement data, “in most cases…cannot” be regarded as within the scope of Article 6(1)(b).
  • Service personalization can potentially fall within the scope of Article 6(1)(b), where that personalization is an “essential or expected” part of the service.
  • Fraud prevention generally cannot fall within Article 6(1)(b).

In addition, the EDPB touches on a range of other points in relation to interpretation of Article 6(1)(b) under the GDPR.  Perhaps most notably:

  • The EDPB argues the term “necessary,” as used in Article 6(1)(b), must be interpreted in line with data protection law objectives.  Accordingly, the EDPB takes the view that processing which is “useful but not objectively necessary for performing the contractual service or for taking relevant pre-contractual steps at the request of the data subject, even if it is necessary for the controller’s other business purposes,” is not “necessary.”
  • The EDPB repeatedly emphasizes that Article 6(1)(b) can only cover processing purposes that are “clearly and specifically identified.”  The EDPB goes on to explain that purposes such as “improving users’ experience,” “marketing purposes,” “IT-security,” and “future research,” are all insufficiently specific.
  • The EDPB does allow, however, that Article 6(1)(b) can apply to incidental data processing related to the performance of a contract where processing can be “reasonably foreseen and necessary within a normal contractual relationship.”  This includes, for instance, processing necessary to send reminders to data subjects about outstanding payments, processing related to warranties, or processing needed to bring “a contract back in conformity after smaller incidents and issues.”
  • The EDPB clarifies that, for the purpose of Article 6(1)(b), contracts do not need to be expressed to be governed by the laws of EEA Member States.
  • The EDPB is clear that, in general, when contracts terminate, controllers should stop processing data previously processed based on Article 6(1)(b) relating to that contract.  The EDPB further states that further processing of such data after contract termination could be “unfair” except where it is based on the consent of the data subject (or required under applicable EU or Member State law).

The guidelines are now open for consultation until May 25, 2019.

European Commission Issues Updated Q&A on Interplay between the GDPR and the Clinical Trials Regulation

On April 10, 2019, the European Commission’s Directorate-General for Health and Food Safety issued a revised Q&A analyzing the interplay between the EU Clinical Trials Regulation (“CTR”) and the EU General Data Protection Regulation (“GDPR”).  The revised Q&A takes into account the opinion of the European Data Protection Board (“EDPB”) issued on January 23, 2019, on the same topic (which we discuss in our blog post here).  Below, we summarize the main takeaways of the Commission’s updated Q&A.

Legal basis for processing health data

Helpfully, the Q&A addresses the appropriate legal basis under the GDPR for the processing of clinical trial data, an issue on which Member States appear to have adopted divergent approaches in recent months.  The Q&A, like the EDPB opinion, distinguishes between two different processing purposes associated with clinical trials and attributes different legal bases to each:

  1. processing for patient safety purposes, such as safety reporting, archiving and inspections, which is required by the CTR (and thus can be based on Articles 6(1)(c) and 9(2)(i) of the GDPR), and for which no consent is required.
  2. processing for scientific research purposes, which “cannot be derived from a legal obligation,” such as one arising under the CTR. In this case, data controllers may consider a number of different legal bases, depending on the nature of the clinical trial.  The Commission notes that the processing can potentially serve a public interest, be based on a legitimate interest or be based on participant consent (each time in combination with a legal basis in Article 9 when special data, such as health or genetic data, are processed).

While the Q&A is generally helpful, the Commission noticeably refrains from endorsing any particular legal basis for processing data for scientific research purposes, leaving it up to the sponsor and research institutions to decide.  The Q&A also fails to highlight that, with the exception of consent, the remaining legal bases under Article 9 of the GDPR mentioned in the Q&A must be grounded in Union or Member State law (with the CTR apparently excluded as a possibility – see (2.) above).  In practice, consent is likely to be the only available option in many cases, owing to an absence of such laws.

As regards consent, the Commission’s Q&A provides that a trial subject’s consent to participate in a trial must be distinguished from consent to the processing of his or her personal data, a theme that also appears in the EDPB guidance.  Thus, a trial participant could, in theory, withdraw consent to the former, but not the latter.  However, if the processing of data is based on a trial subject’s consent and he or she later withdraws that consent, the controller is expected to stop processing the data and delete it, unless it has another legal basis to continue processing the data (e.g., for safety purposes).  Curiously, the Q&A fails to discuss the GDPR’s scientific research exemption to the deletion right under Article 17(3)(d) – i.e., the right to erasure does not apply if the data are used for scientific research and complying with the erasure request would render impossible or seriously impair the achievement of the research aims.

Further use of research data

In relation to further use of clinical trial data, the Commission Q&A appears to acknowledge that the CTR’s limitations on further use of such data (requiring consent for data used outside the scope of the trial protocol – see here) are waived where one of the alternative legal bases in the GDPR applies.  In short, consent does not appear to be the only available legal basis for the further use of clinical trial data.

Further, the Q&A highlights the fact that secondary use of clinical trial data for scientific research purposes is by default compatible with its original use, in accordance with Article 5(1)(b) of the GDPR.  As a result, it should not be necessary to obtain a new consent in order to engage in additional secondary research.  In the event that the secondary research is nevertheless based on consent, the Q&A repeats the EDPB’s cautionary language about reliance upon overly broad consent (notwithstanding GDPR recital text supporting broad consent in the research context).  This restrictive interpretation of the consent doctrine, which we discuss in more detail here, limits its utility and conflicts with the GDPR’s other research-friendly provisions.

Ultimately, readers may be forgiven for being confused by references to broad consent in the GDPR, when the Commission states in the Q&A that “the obligations with regard to the requirement of specific consent still apply.”  In fact, the Q&A explains that consent for further, secondary use must be separated from the original consent, likely involving a “separate sheet” for the collection of the consent, effectively ensuring that the original consent could not be a “broad” consent.  The Commission’s suggestion, however, raises the question of why anyone would seek to rely upon consent, in light of the Commission’s earlier concession that the further use of clinical trial data for scientific research is compatible with its original use.

Miscellaneous observations

Finally, the Q&A contains some additional notable remarks, including that research sponsors established outside the EU and performing clinical trials in the EU are subject to the GDPR, on the basis that they are “monitoring” EU data subjects (i.e., trial participants) or offering services in the EU, and that the GDPR’s transfer restrictions also apply to transfers of clinical trial data.  The Commission document also makes clear that pre-GDPR informed consent forms used in ongoing trials should be updated and furnished to trial subjects in order to meet the GDPR’s augmented transparency requirements, but leaves open when obtaining fresh consent from trial subjects would be necessary.  In this respect, the Q&A does not provide any more insights than appear in the EDPB’s existing guidance.

Department of Justice Releases White Paper on CLOUD Act

On Wednesday, the U.S. Department of Justice released a white paper and FAQ on the Clarifying Lawful Overseas Use of Data (“CLOUD”) Act, which was enacted in March 2018 and creates a new framework for government access to data held by technology companies worldwide.  The paper, titled “Promoting Public Safety, Privacy, and the Rule of Law Around the World: The Purpose and Impact of the CLOUD Act,” addresses the scope and purpose of the CLOUD Act and responds to 29 frequently asked questions about the Act.


Association of German Supervisory Authorities issues paper on broad consent for research

On April 3, 2019, the Association of German Supervisory Authorities (“Datenschutzkonferenz” or “DSK”) issued a paper (available here in German) on the interpretation of “broad consent” for scientific research in Recital 33 of the GDPR and the interplay with the definition of consent and the principle of purpose limitation.

According to the DSK, broad consent should only be used in exceptional circumstances when it is not possible to establish at the outset the expected scope of the research.  Moreover, the DSK suggests that a broad consent can be remedied at a later stage of the research by narrowing down its scope once that scope becomes clearer – i.e., by deliberately not using the flexibility obtained.  The use of broad consent also does not relieve parties from their obligation to put in place mechanisms to limit the authorized use of data and to prevent the uncontrolled expansion of research use.

In those cases where broad consent is “absolutely necessary”, the DSK sets out a list of recommended safeguards.  These safeguards should compensate on three fronts for the weak nature of broad consent: ensuring heightened transparency of the processing, reinforcing the trust of the data subjects in the processing and guaranteeing the protection of the personal data. The safeguards include:

  • documenting why specific consent is not possible;
  • establishing an internet page informing data subjects on a continuous basis about the research project and future research projects involving their personal data;
  • obtaining the consent of the ethics committee for further processing for research purposes;
  • verifying if dynamic consent is an option;
  • not transferring personal data to countries that do not provide an adequate level of protection of personal data; and
  • applying specific encryption and pseudonymization techniques.

Finally, according to the DSK, controllers should keep a record of their decision to rely on broad consent and of the safeguards they implement, and submit these documents, together with a description of the research project, to the competent bodies responsible for examining the ethical and data protection compatibility of the research project.

Comment:

The DSK opinion is concerning.  To a large extent it repeats the Article 29 Working Party’s previous guidelines on consent.  However, it demonstrates again that Supervisory Authorities find it hard to come to terms with the GDPR’s favorable provisions for scientific research.  The way in which the DSK interprets Recital 33 risks voiding it of any meaning and utility.

This reluctant attitude of the authorities is unnecessary.  Recital 33 of the GDPR can be read in a way that dovetails nicely with other provisions of the GDPR that reflect the lawmaker’s policy decision to create a scientific research-friendly framework.  In fact, allowing broad consent to be relied on for scientific research can be seen as an extension of the exception to the purpose limitation principle in Art. 5(1)(b) of the GDPR.  It is quite astonishing to observe how the Supervisory Authorities can write a dedicated paper on scientific research without making any reference to this exception.

Article 5(1)(b) of the GDPR provides that the use of personal data for scientific research is by default compatible with the original purposes for which the data was collected.  The purpose limitation principle and Art. 6(4) of the GDPR simply do not apply when personal data is used for scientific research.  Recital 50 of the GDPR provides that when processing for compatible purposes, “no legal basis separate from that which allowed the collection of personal data is required.”

Obtaining broad consent for scientific research is consistent with these provisions of the GDPR.  A broad consent reflects the fact that the individual must accept at the outset that personal data may be used for other scientific research (as such use is compatible) – that is the baseline position discussed above.  What’s the point of obtaining a (likely incomplete) narrow consent if subsequent further use for scientific research is compatible anyway?  Somewhat provocatively, one could argue that a broad consent for scientific research is the only consent that is fair to data subjects because it informs data subjects of the lawmaker’s policy decision reflected in the GDPR – a policy decision to permit personal data to be used for scientific research, subject to suitable safeguards set out in various provisions of the GDPR.

EU High-Level Expert Group Publishes Ethics Guidelines for Trustworthy AI

On April 8, 2019, the EU High-Level Expert Group on Artificial Intelligence (the “AI HLEG”) published its “Ethics Guidelines for Trustworthy AI” (the “guidance”).  This follows a stakeholder consultation on its draft guidelines published in December 2018 (the “draft guidance”) (see our previous blog post for more information on the draft guidance).  The guidance retains many of the same core elements of the draft guidance, but provides a more streamlined conceptual framework and elaborates further on some of the more nuanced aspects, such as on interaction with existing legislation and reconciling the tension between competing ethical requirements.

According to the European Commission’s Communication accompanying the guidance, the Commission will launch a piloting phase starting in June 2019 to collect more detailed feedback from stakeholders on how the guidance can be implemented, with a focus in particular on the assessment list set out in Chapter III.  The Commission plans to evaluate the workability and feasibility of the guidance by the end of 2019, and the AI HLEG will review and update the guidance in early 2020 based on the evaluation of feedback received during the piloting phase.

EU Commission Issues Recommendation on Cybersecurity in the Energy Sector

The European Commission (“Commission”) has published a Recommendation on cybersecurity in the energy sector (“Recommendation”).  The Recommendation builds on recent EU legislation in this area, including the NIS Directive and the EU Cybersecurity Act (see our posts here and here).  It sets out guidance for achieving a higher level of cybersecurity, taking into account the specific characteristics of the energy sector, including the use of legacy technology and interdependent systems across borders.


Reaching for the CLOUD

This article originally appeared in Global Data Review on March 29, 2019

Last year, the US passed legislation expanding the geographic reach of certain legal process, including search warrants, issued to technology providers seeking customer data. Under the Clarifying Lawful Overseas Use of Data (CLOUD) Act, warrants issued by US courts can force certain types of providers to disclose customer data stored anywhere in the world.

Notably, the CLOUD Act does not affect only US technology providers. The legislation covers all providers of defined technology services, so long as they are subject to US jurisdiction and in possession, custody or control of the data sought.  This article describes the CLOUD Act, addresses scenarios in which technology providers based outside the US may be subject to the legislation, and identifies mechanisms for challenging legal process issued under the Act.


Council of Europe issues recommendation on health-related data

On March 28, 2019, the Council of Europe* issued a new Recommendation on the protection of health-related data.  The Recommendation calls on all Council of Europe member states to take steps to ensure that the principles for processing health-related data (in both the public and private sector) set out in the Appendix of the Recommendation are reflected in their law and practice.

This Recommendation is likely to be of interest to both public sector and private sector organizations that are seeking to use health-related data in innovative ways, including developing digital health solutions that involve genetic data, scientific research, data sharing or mobile health applications.

The Recommendation builds on Convention 108, an international treaty opened for signature in 1981 and the first legally binding international instrument on the protection of individuals’ privacy.  Convention 108 has recently been updated to align it with the GDPR (see the consolidated text of the modernized Convention 108+), but contains less granular obligations than the GDPR.  The Recommendation complements the modernized Convention 108+ by introducing specific definitions (such as “health-related data” and “genetic data”) and specific principles for processing health data.

Most of the principles on processing health data set out in the Recommendation reiterate the position under the EU General Data Protection Regulation (“GDPR”) and relevant guidance issued by European data protection authorities and the European Data Protection Board (the “EDPB”, the successor to the “Article 29 Working Party”).  The Recommendation does, however, provide some specific guidance on processing health-related data that is more detailed than, and in some respects goes beyond, the requirements of the GDPR, as described below:

  • Genetic data. The Recommendation provides that genetic data should only be collected subject to appropriate safeguards where it is either prescribed by law, or on the basis of consent (except where such consent is excluded by law).  Genetic data used for preventative health care, diagnosis or treatment of patients or scientific research should only be used for those purposes, or to enable the individuals concerned by the results of the genetic tests to take an informed decision on these matters.  The use of genetic data in the employment context, for insurance purposes, and in judicial procedures or investigations is specifically called out as an area where member states should consider adopting laws that provide appropriate safeguards.
  • Sharing health-related data for secondary purposes.  In relation to sharing health-related data for purposes other than providing and administering health care, the Recommendation states that only recipients who are authorized by law should have access to health-related data, with no mention of patients’ consent as a way of legitimizing such access. This position is potentially more restrictive than the current approach under the GDPR, where third parties not involved in providing health care to patients (such as research or academic institutions or commercial companies) may receive health-related data as long as they do so in compliance with the GDPR.  It remains to be seen whether national laws implementing this Recommendation will treat third parties that lawfully receive health-related data in compliance with the GDPR (for example, with patients’ consent) as meeting this “authorization” requirement.  The Recommendation also states that recipients of health-related data must be subject to the rules of confidentiality incumbent upon a healthcare professional (or equivalent) unless other safeguards are provided by law.
  • Scientific research.  The Recommendation takes a contextual approach to scientific research, providing that the need to process health-related data for scientific research should be weighed against the risks to the data subject (and to their biological family if genetic data is involved). Unlike the GDPR, the Recommendation does not automatically qualify scientific research as being compatible with the original purposes for which the data was collected.  As a general principle, health-related data should only be processed for research purposes where the data subject has consented, unless the law provides that health-related data can be processed without consent.  Individuals should also be provided transparent and comprehensible information about the research project.  The Recommendation adds that the conditions in which health-related data are processed for scientific research must be assessed, where necessary, by a competent independent body, such as an ethics committee, and such research projects should be subject to safeguards set out in law.  Fundamentally, the three-part requirement of consent/law, notice and safeguards for using health-related data for research is the same as under the GDPR.  However, in some respects the Recommendation appears to call for a strengthened regime for scientific research using health-related data that goes further than the GDPR.
  • Digital health. Several principles in the Recommendation are clearly relevant for digital health applications, particularly those involving artificial intelligence, machine learning and mobile devices.  The Recommendation provides that systems storing health-related data should be “auditable”, meaning that it should be possible to trace any access to, modification of, and actions carried out on the information system, so that the author can be identified.  The Recommendation also encourages the adoption of “reference frameworks” – coordinated sets of rules and state-of-the-art processes, adapted to practice and covering interoperability and security – which should apply to information systems hosting or processing health-related data.  The Recommendation also specifically mentions professionals who are not directly involved in providing individual patient health care, but who may have access to health-related data to ensure the “smooth operation of information systems” (such as cloud systems, perhaps).  Such professionals must have full regard for professional secrecy and comply with security requirements laid down by law to guarantee the confidentiality and security of the data.  In relation to mobile devices, the Recommendation makes it clear that information collected on mobile devices can constitute health-related data and should therefore receive the same legal protections as other health-related data.
  • Individuals’ rights. The Recommendation provides that individuals should have the right to be informed about, and exercise control over, their health-related data and genetic data, in line with the GDPR.  However, there are three areas of deviation: (1) individuals should have the right not to be informed of medical diagnoses or the results of genetic tests, as they may have their own reasons for not wishing to know, subject to limited exceptions where they must be informed by law; (2) when individuals withdraw from a scientific research project, they should be informed that their health-related data processed in the context of that research will be destroyed or anonymized in a manner that does not compromise the scientific validity of the research – which appears to be more nuanced than recent guidance from the EDPB; and (3) individuals should have the right to be informed of the reasoning that underlies data processing involving health-related data where the results of such processing are applied to them, particularly if profiling is involved.  This third right is similar to the one in the GDPR (Article 15(1)(h)) but applies more broadly, to processing other than solely automated decision-making with significant effects (as described in Article 22 of the GDPR).

To the extent that the GDPR does not already impose the same obligations as in the principles of the Recommendation, the Recommendation is not binding on any private sector or public sector organizations.  The member states of the Council of Europe or the European Union, however, are expected to use the Recommendation as guidance when adopting national laws that deal with health data.  These principles also provide some insight into how European data protection authorities are likely to interpret the provisions in the GDPR that apply to health-related data and genetic data, and the direction of future guidance and legislation on the topic.

* The Council of Europe is an international organization, distinct from the European Union, founded in 1949 to promote democracy and protect human rights and the rule of law in Europe.  The Council of Europe consists of 47 member states, which include all 28 EU Member States.  Recommendations issued by the Council of Europe are not binding unless the EU or the national governments of member states implement them in legislation, but the EU often builds on Council of Europe standards when drawing up its own legislation.

Polish Supervisory Authority issues GDPR fine for data scraping without informing individuals

On March 26, 2019, the Polish Supervisory Authority (“SA”) issued a fine of around €220,000 against a company that processed contact data obtained from publicly available sources without informing the individuals concerned (decision in Polish here and English summary here). Article 14 of the GDPR requires data controllers that do not obtain personal data directly from the individuals concerned to provide these individuals with information about how their data is processed within a reasonable time after obtaining the data (at most one month).

The company scraped contact data from public registries, such as the Polish Central Electronic Register and Information on Economic Activity, to prepare trade reports, contact lists and “to provide other business and management consulting services” to its clients. The company’s systems contained around 7.6 million records with personal data of natural persons (including sole traders and persons engaged in an economic activity).

In April 2018, the company sent an email to all the individuals for whom it had an email address (around 680,000 individuals) with information about how it processes their personal data. The company also published on its website a data protection policy containing similar information. However, the company did not provide the information by SMS or physical post to those individuals for whom it only had a phone number or postal address, respectively (about 6.5 million individuals).

In its defense, the company asserted that: (i) the data constitutes publicly available information; (ii) the processing only involved very limited data (contact details only); (iii) the risk to the rights and freedoms of the individuals was low; (iv) the company employs high security standards to protect the personal data; and (v) providing the information by post to the individuals for whom it does not have an email address would have a serious impact on the company’s business. According to the company, the cost of sending the registered mail alone would amount to more than €7.8 million, not counting the human resource and other costs (printing, preparing for shipment and dispatch, paper, toner, envelopes, stamps, handling returns, etc.).  On this basis, the company argued that providing the information by post would constitute a “disproportionate effort”, triggering the derogation in Article 14(5)(b) of the GDPR.
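As a rough sense check (our own back-of-the-envelope calculation using only the totals quoted above, not a figure from the decision, and saying nothing about the actual Polish registered-mail tariff), the company’s estimate implies a postage cost of roughly €1.20 per individual contacted by post:

\[
\frac{\text{€}7.8\ \text{million}}{6.5\ \text{million individuals}} = \text{€}1.20\ \text{per registered letter}
\]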

In this case, the SA decided that the mere provision of the information through a website privacy policy did not suffice, as it was neither “impossible” nor a “disproportionate effort” for the company to contact the individuals whose telephone number or postal address it held. However, the SA recognized that, where the company lacked the contact details of the individuals and would have to search for this data in other sources, this would constitute a “disproportionate effort” for the company.

The company was found to have intentionally violated Article 14 of the GDPR, motivated by a desire to avoid the additional costs associated with informing the individuals about the processing of their data. In addition to the fine, the company was ordered to inform, within three months of the decision, the individuals whose contact data it held.
