German Supervisory Authorities Issue Guidance on Data Subject Rights

Guidance on how to identify data subjects

On July 1, 2019, the Bavarian Supervisory Authority for the public sector (“SA”) published guidance on how to verify the identity of data subjects exercising their data protection rights under the GDPR. The guidance is directed at public bodies, but is also helpful for private entities.

According to the guidance, the controller may only request the provision of additional information if it has “reasonable doubts” about the data subject’s identity. For example, if the data subject asks the controller to contact him/her using contact details other than those used previously, or if the form or wording of the request appears unusual, then the controller may request additional information.

In these cases, the controller should use “reasonable measures” to verify the identity of the data subject (Recital 64 GDPR). According to the SA, the measures to be used will depend on the nature of the data processed. In line with the principle of data minimization, the controller should consider the following two factors: (i) the information that it requires to identify data subjects, and (ii) the risks associated with providing that information to the wrong person.

For example, for “special categories of data” (Art. 9 GDPR), the controller should take greater precautions to verify the identity of the data subject than for more “common” personal data, because the risks associated with providing the information to the wrong person are higher in the first case than in the second.

The guidance provides the following examples of measures that can help the controller verify the data subject’s identity:

  • checking the contact details given by the requesting data subject and matching them with any existing contact details already available;
  • if the data subject uses a new email address to send his/her request for access, asking the data subject to confirm the request via the email address previously used;
  • when the data subject and controller are parties to a long-term contractual relationship, asking the data subject to provide information generated in the course of the contractual relationship which is known to both;
  • if the information is highly sensitive, asking the data subject to visit the controller’s office and produce an identity card (a copy of the identity card should only be taken in exceptional cases and in compliance with Section 20(2) of the Identity Card Act (Personalausweisgesetz) and Sec. 18(2) of the Passport Act (Passgesetz)).

According to the SA, once the additional information is no longer needed to identify an individual, it should be deleted.

If a request for access is declined because the controller cannot ascertain the identity of the data subject, the controller must document its identification efforts.


Guidance on how to interpret the right of access

In its most recent annual report, the Hessian Supervisory Authority (“Hessian SA”) commented on the scope of the right of access under Art. 15 GDPR (see page 75 et seq. of the report).

According to the Hessian SA, the controller must always provide the data subject with a copy of the personal data, even if the data subject does not explicitly request one. Additionally, in principle, the controller must provide an explanation of the contents of the copy.

According to the Hessian SA, data subjects do not have the right to obtain a “copy” in the literal sense of a “photocopy” or “data set”. In GDPR terms, “copy” has the meaning of “a summary of the personal data structured in a meaningful way”. For example, where a company uses a human resources information system, providing access may consist of providing an excerpt of the profile of the data subject. Where a company uses a document management or registration system, providing access may consist of listing the stored documents or file numbers relating to the data subject.

Generally, a data subject cannot ask for copies of all documents concerning him or her, but he/she may be entitled to receive copies of individual documents or email correspondence in certain situations. This is the case, for example, where it is absolutely necessary to enable the data subject to check the legality of the personal data processing. If the controller processes large amounts of data about the data subject, it can ask the data subject to be more specific about the data he/she wants to access.

The SA reminds controllers that the right of access has exceptions both under Art. 15(4) GDPR and under the German law implementing the GDPR (cf. sec. 27(2), 28(2), 29(1), 2nd sentence and 34 German Federal Data Protection Act (BDSG)).

Under the GDPR, controllers can refuse to comply with, or charge a reasonable fee for, requests that are manifestly unfounded or excessive. The controller bears the burden of proof on this point.

If a controller is of the opinion that it has reasonable grounds not to provide access, it must inform the data subject. This information must include the reason for the refusal so that the data subject can verify and/or challenge the controller’s interpretation of the law.

Dark Patterns: What They Are and What You Should Know About Them

You may have heard the phrase “dark patterns” as shorthand for various user interfaces designed to influence users’ decisions. They can range from the perfectly innocent to the unethical, and even illegal. Whatever the form, dark patterns have recently drawn attention from the mainstream press.

Dark patterns are coming out from the shadows. And when that happens, class action lawyers can’t be far behind.


ICO Updates Guidance on Cookies and Similar Technologies

Back in 2013, we published a blog post entitled, “European Regulators and the Eternal Cookie Debate” about what constitutes “consent” for purposes of complying with the EU’s cookie rules.  The debate continues…  Yesterday, the ICO published new guidance on the use of cookies and a related “myth-busting” blog post.  Some of the “new” guidance really just repeats existing guidance, but other aspects may require organizations to review their current practices.  We summarize key points below, including in relation to when sites need to obtain consent, how to obtain consent, and when the rules apply to non-EU sites.

This all comes hot on the heels of the ICO updating its own mechanism for obtaining consent to cookies on its website last week (we set out the mechanism below).  The updated ICO guidance also follows the CNIL’s recent statement that it will issue new guidelines on cookies in two phases in the next 6 months: an update over the summer to amend its current guidance and rule out the use of implied consent to place cookies on users’ devices; and a consultation at the end of the year followed by new guidelines on how to obtain consent for the use of cookies (see our summary here).  It seems likely that some or all of this national guidance may have to be revised yet again when the proposed ePrivacy Regulation finally is agreed, although discussions on the proposal continue with no end currently in sight.

Summary of key points

To recap, under current law, consent is almost always required unless cookies are “strictly necessary,” i.e., essential (as opposed to reasonably necessary) to provide the service requested by the user.

  • “Strictly necessary” is considered from the user’s perspective.  Mirroring prior guidance at EU level, the ICO repeats that cookies “that are simply helpful or convenient, but not essential ─ or that are only essential for your own purposes ─ will still require consent.”  The guidance provides examples of activities that are likely to meet the “strictly necessary” exemption as well as examples that are not likely to meet it and thus trigger the need to obtain consent.

The guidance goes on to provide more detailed information on what types of cookies are likely to be exempt from the consent requirement, including first-party session cookies for authentication (but not persistent login cookies), session cookies for load balancing, and first-party cookies for some security purposes.  This more detailed guidance will be of interest to clients in specific industries, including fraud prevention services that rely on device fingerprinting techniques.

  • Cookies used for online ads or web analytics require consent.  The ICO describes cookies used for the purposes of online advertising or web analytics as non-essential; they therefore require prior consent to the GDPR standard.  Readers should note that this includes first-party cookies (the guidance clearly states, “Consent is necessary for first-party analytics cookies, even though they might not appear to be as intrusive as others that might track a user across multiple sites or devices.”).  Mirroring prior ICO guidance, it goes on to suggest that enforcement in relation to first-party analytics cookies is unlikely to be a priority.

Somewhat controversially in the context of the long-running debate over adtech and online business models, the ICO states the following as a “fact” in its myth-busting blog: “While we recognise that analytics can provide you with useful information, they are not part of the functionality that the user requests when they use your online service – for example, if you didn’t have analytics running, the user could still be able to access your service. This is why analytics cookies aren’t strictly necessary and so require consent.”

  • Online advertising cookies require consent.  To quote the guidance in full, this includes “all third-party cookies used in online advertising, including for purposes such as frequency capping, ad affiliation, click fraud detection, market research, product improvement, debugging and any other purpose.”
  •  Social media plugins sometimes require consent ─ it depends on the user and what the plugins are used for.  This bit of the guidance is more nuanced.  In summary, consent is required:
    • to set cookies in connection with social media plugins for non-logged in users of that social media platform, i.e., users who have logged out or users that are not members of that network;
    • for plugins or other technology that tracks users (members or non-members of the network) for other purposes such as advertising, behavioural monitoring, or analytics; and
    • for any use of web beacons, tracking pixels, JavaScript code or similar technologies from a social media platform or any other third party.

Consent is not required, however, if a user of that network is logged into that network when using your service and the plugins are used to interact with the network.

  • Implied consent is not valid.  Unsurprisingly, the guidance and ICO blog make clear that, because the GDPR standard of consent is much higher than under previous legislation, implied consent is no longer acceptable in relation to non-essential cookies.  This is consistent with the recent Advocate General opinion in the Planet49 case ─ see our blog here.  Accordingly, for non-essential cookies, users must take a clear and positive action to consent; pre-ticked boxes or sliders defaulted to “on” cannot be used.
  • So how to obtain consent?  The guidance explores different ways to obtain consent, including via message boxes such as banners and pop-ups.  It warns that consent would be invalid if (i) message boxes are hard to read or interact with when using a mobile device, or (ii) users do not click on any of the options available and go straight through to another part of your site without engaging with the consent box.  The guidance also states that wording such as “By continuing to use our website, you consent to our use of cookies” followed by an “OK” or “Accept” button does not result in a valid consent (because the website has decided non-essential cookies will be set and only seeks the user’s agreement afterwards with an option to continue rather than a genuine free choice).  Similarly, a consent mechanism that emphasizes “agree” or “allow” over “reject” or “block” represents a non-compliant approach as the site is influencing users towards the “accept” option.  A consent mechanism that doesn’t allow a user to make a choice would also be deemed to be non-compliant, even where the controls are located in a “more information” section.
  • Timing.  The timing of obtaining consent and collecting cookies has been an issue (at least in practice) for many years.  The ICO states that non-essential cookies must not be set on landing pages before a site obtains the user’s consent.  This is consistent with EU guidance from 2013.
  • Consent to cookie walls is unlikely to be valid ─ but let’s talk.  Cookie walls require website users to consent to the placing of tracking cookies or similar technologies before allowing them access to the website.  The ICO states that consent to cookie walls is unlikely to be valid.  This is broadly consistent with guidance and decisions of the Dutch and Austrian Supervisory Authorities in recent months (see our posts here and here).  The gist is that consent obtained in this way is not “freely given” (as required under GDPR) because withholding consent has negative consequences for the user (i.e., the user is barred from accessing the website).  Instead, websites should offer users a genuine choice to accept or reject cookies, together with an alternative means of accessing the site (e.g., payment).  Deploying perhaps characteristic British understatement, the ICO recognizes that there are “some differing opinions as well as practical considerations around the use of partial cookie walls” and intends to seek further submissions and opinions on this issue from interested parties.
  • Consent for cookies under ePrivacy means consent for processing under GDPR.  The overlap and relationship between the GDPR (that governs processing of personal data) and the ePrivacy rules (that set out requirements on cookies) has prompted several compliance challenges, not helped by the delay in updating the ePrivacy rules.  A common issue has been whether an organization may rely on one of the legal bases for processing data under the GDPR other than consent (such as legitimate interests) when that data is acquired as a result of dropping a cookie (for which consent is required).  The ICO guidance, consistent with recent statements and positions of other regulators (including the recent EDPB opinion on the interplay between the two sets of rules), suggests the answer is “no.”  For example, the guidance states: “if you have obtained consent in compliance with PECR [the UK implementation of the current ePrivacy rules], then in practice consent is also the most appropriate lawful basis under the GDPR. Trying to apply another lawful basis such as legitimate interests when you already have GDPR-compliant consent would be an entirely unnecessary exercise, and would cause confusion for your users.”
  • What about sites outside of the EU?  One feature of the current ePrivacy rules that has caused some head-scratching over the years is that, unlike the former Data Protection Directive 95/46/EC or the GDPR, they don’t contain an express applicable law test.  The guidance states (eventually ─ it’s on page 44) that the territorial rules under the GDPR apply when cookies involve processing personal data.  The upshot is that the rules do not automatically apply merely because a site is available to users in the EEA.  Instead, a site would have to offer goods or services to EEA users (e.g., an ecommerce site that allows users to purchase products from anywhere in the world and offers prices in local currency) or monitor their behaviour.  The guidance states that an online news outlet based outside the EEA but accessible to individuals within the EEA “may not be in scope of the GDPR, depending on its circumstances” (e.g., is the content directed at individuals within the outlet’s own country rather than individuals in the EEA?  Has it taken measures to prevent EEA users from accessing the site?  etc.)

In addition to the above points, the updated document provides guidance on how to comply with the rules, including recommendations on how to conduct a cookie audit and how to keep records of user preferences.

ICO mechanism

Finally, some readers may be interested to see the steps that the ICO took last week to update its own mechanism for providing information and collecting consent.  This now involves a cookie side-banner, with third-party (Google) analytics cookies set to “off” by default, used in conjunction with the ICO’s cookie policy, together with a permanent bottom-corner “C” icon on the site that provides access to cookie controls.

German Bundestag approves 2nd German Data Protection Adaptation Act (“2nd DSAnpUG”): Summary of significant changes to German data protection laws

On 28 June 2019, the German Bundestag passed the 2nd DSAnpUG, which will, amongst other things, further adapt the German Federal Data Protection Act (“BDSG”), the German Federal Registration Act (“BMG”), the German Act on the Federal Office for Security in Information Technology (“BSI-Act”) and the Act on the Establishment of a Federal Institute for Digital Radio of Authorities and Organizations with Security Responsibilities (“BDBOS-Act”) to the provisions of the General Data Protection Regulation (“GDPR”). This post summarizes the most important changes to these laws:

Of particular practical importance is the amendment of § 38 (1) BDSG: the threshold at which data controllers and data processors must appoint a data protection officer increases from 10 to 20 persons permanently engaged in the automated processing of personal data.

In the context of data processing in an employment relationship, the current version of § 26 BDSG provides a specific legal basis that, inter alia, requires consent granted by employees to be in written form. The 2nd DSAnpUG relaxes this requirement and also permits consent to be given in electronic form.

Also, § 22 BDSG, which deals with the processing of special categories of personal data, will be amended to introduce a further legal basis for such processing. In the future, non-public bodies will also be allowed to process special categories of personal data if this is “absolutely necessary for reasons of substantial public interest”, a criterion that will be subject to interpretation by the competent courts.

The 2nd DSAnpUG also includes a new legal basis in § 86 BDSG for the processing of personal data for the purposes of state awards and honours. According to § 86 BDSG, both public and non-public bodies may process personal data, including special categories of personal data – without the data subject’s knowledge – in order to prepare and implement state procedures for awards and honours. The provision also allows for other exceptions, such as an exemption from the duty to provide information about the modalities of data processing under Article 13 GDPR.

Apart from that, the currently applicable § 9 BDSG, which governs the competences of the Federal Commissioner for Data Protection and Freedom of Information, will be revised. In the future, companies providing commercial telecommunications services will be subject to uniform supervision by the Federal Commissioner insofar as they process data of natural or legal persons for the professional provision of telecommunications services and such supervision does not already arise from § 115 (4) of the German Telecommunications Act (“TKG”).

The BMG will be amended so that address traders can no longer use data from the civil register for advertising or address-trading purposes, even with the consent of the data subject concerned. The current version of § 44 (3) BMG explicitly allows data subjects to consent to such data processing.

Further, the BSI-Act shall be amended in a way that will limit certain data subject rights in the context of specific data processing activities. Amongst other things, the obligation to inform data subjects pursuant to Art. 13 GDPR shall not apply if the provision of information would jeopardize the proper fulfilment of the obligations falling within the competence of the Federal Office for Security in Information Technology. Limitations of data subject rights also include certain limitations of the right to object, the right to rectification as well as the right of access to personal data by the data subject.

Finally, a new § 19 (4) will be added to the BDBOS-Act: the processing of traffic data in the context of digital radio (the guiding principle of digital radio being a uniform, powerful radio network for all German authorities and organizations with security tasks) receives its own legal basis, and the Federal Institute for Digital Radio of Authorities and Organizations with Security Responsibilities may store that traffic data for up to 75 days.

In total, the 2nd DSAnpUG amends 154 specialized German laws. Most of the changes adapt definitions and legal bases for data processing, as well as rules on the rights of data subjects. The 2nd DSAnpUG is subject to approval by the German Federal Council (Bundesrat), which is expected in the near future, and will enter into force on the day following its promulgation in the Federal Law Gazette.

Two new developments from the EU High-Level Working Group on AI: launch of pilot phase of Ethics Guidelines and publication of Policy and Investment Recommendations for Trustworthy AI

On June 26, 2019, the EU High-Level Expert Group on Artificial Intelligence (AI HLEG) announced two important developments: (1) the launch of the pilot phase of the assessment list in its Ethics Guidelines for Trustworthy AI (the “Ethics Guidelines”); and (2) the publication of its Policy and Investment Recommendations for Trustworthy AI (the “Recommendations”).

The AI HLEG is an independent expert group established by the European Commission in June 2018.  The Recommendations are the second deliverable of the AI HLEG; the first was the Group’s Ethics Guidelines of April 2019, which defined the contours of “Trustworthy AI” (see our previous blog post here).  The Recommendations are addressed to policymakers and call for 33 actions to ensure the EU, together with its Member States, enable, develop, and build “Trustworthy AI” – that is, AI systems and technologies that reflect the AI HLEG’s now-established ethics guidelines.  Neither the Ethics Guidelines nor the Recommendations are binding, but together they provide significant insight into how the EU or Member States might regulate AI in the future.

Throughout the remainder of 2019, the AI HLEG will undertake a number of sectoral analyses of “enabling AI ecosystems” — i.e., networks of companies, research institutions and policymakers — to identify the concrete actions that will be most impactful in those sectors where AI can play a strategic role.


French Supervisory Authority will issue new guidelines on cookies

On June 28, 2019, the French Supervisory Authority (CNIL) announced that it will issue new guidelines on the use of cookies for direct marketing purposes.  It will issue these guidelines in two phases.

First, during July 2019, the CNIL will update its guidance issued in 2013 on cookies.  According to the CNIL, the 2013 guidance is outdated because it refers to implied consent (i.e., consent through the continued use of the website) as an acceptable mode of obtaining consent to place cookies.  The new guidance will rule out the use of implied consent to place cookies on users’ devices.  However, the CNIL will not enforce the new rules for a period of twelve months.

In the second phase, the CNIL will consult with relevant stakeholders in order to create new guidelines on how to obtain consent for the use of cookies.  The CNIL intends to publish draft guidelines in December 2019 or January 2020 for public consultation.  Once the new guidelines are adopted, the CNIL will grant companies a 6-month transition period to implement the new guidelines.

According to the CNIL, it can no longer wait for the approval of the ePrivacy Regulation, which is not expected to be finalized in the short term.

Maine Enacts Broadband Privacy Law

Earlier this month, Maine’s legislature enacted a new statute granting broad privacy rights to internet users in the state. Hailed as “the strictest consumer privacy protections in the nation,” the statute imposes some of the toughest burdens in the country on regulated entities to protect the data of their consumers.

The statute applies only to broadband internet service providers (ISPs), defined as any “mass-market retail service by wire or radio that provides the capability to transmit data to and receive data from all or substantially all Internet endpoints.” According to the sponsor of the original bill, state Senator Shenna Bellows, the statute is intended to target companies with mass amounts of consumer data, such as Verizon and Xfinity. It excludes large technology companies such as Google and Facebook, which consumers can still avoid if they choose to do so. Sen. Bellows noted that the prioritization of ISPs was due to the fact that “you can use the internet without using Facebook, [but y]ou can’t use the internet without using your internet service provider.” She has stated that she does intend to introduce more general privacy legislation in the future.

ICO’s Call for Input on Bias and Discrimination in AI systems

On June 25, 2019, as part of its continuing work on the AI Auditing Framework, the UK Information Commissioner’s Office (ICO) published a blog setting out its views on human bias and discrimination in AI systems. The ICO has also called for input on specific questions relating to human bias and discrimination, set out below.

The ICO explains in its blog how flaws in training data can result in algorithms that perpetuate or magnify unfair biases. The ICO identifies three broad approaches to mitigate this risk in machine learning models:

  1. Anti-classification: making sure that algorithms do not make judgments based on protected characteristics such as sex, race or age, or on proxies for protected characteristics (e.g., occupation or post code).
  2. Outcome and error parity: comparing how the model treats different groups. Outcome parity means all groups should have equal rates of positive and negative outcomes. Error parity means all groups should have equal error rates (such as false positives or negatives). A model is fair if it achieves outcome parity and error parity across members of different protected groups.
  3. Equal calibration: comparing the model’s estimate of the likelihood of an event with the actual frequency of that event for different groups. A model is fair if it is equally calibrated between members of different protected groups.
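To make the outcome-parity and error-parity checks concrete, they can be sketched in a few lines of Python. This is a minimal, purely illustrative example ─ the function name, toy data, and binary-outcome setup are our own, not part of the ICO's guidance:

```python
def group_rates(y_pred, y_true, group):
    """Compute the positive-outcome rate and error rate for each protected group.

    y_pred: predicted binary outcomes (1 = positive decision)
    y_true: actual binary outcomes
    group:  protected-group label for each individual
    """
    stats = {}
    for g in set(group):
        # Indices of individuals belonging to group g
        idx = [i for i, gi in enumerate(group) if gi == g]
        n = len(idx)
        positives = sum(y_pred[i] for i in idx)
        errors = sum(y_pred[i] != y_true[i] for i in idx)
        stats[g] = {
            "positive_rate": positives / n,  # compared for outcome parity
            "error_rate": errors / n,        # compared for error parity
        }
    return stats

# Toy example with two groups, A and B
y_pred = [1, 0, 1, 1, 0, 1, 0, 0]
y_true = [1, 0, 1, 0, 0, 1, 1, 0]
group  = ["A", "A", "A", "A", "B", "B", "B", "B"]

stats = group_rates(y_pred, y_true, group)
```

In this toy data the model satisfies error parity (both groups have an error rate of 0.25) but fails outcome parity (group A receives positive outcomes at a rate of 0.75 versus 0.25 for group B), illustrating why the ICO treats the two checks as distinct.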

The guidance stresses the importance of appropriate governance measures to manage the risks of discrimination in AI systems. Organizations may take different approaches depending on the purpose of the algorithm, but they should document the approach adopted from start to finish. The ICO also recommends that organizations adopt clear, effective policies and practices for collecting representative training data to reduce discrimination risk; that organizations’ governing bodies should be involved in approving anti-discrimination approaches; and that organizations continually monitor algorithms by testing them regularly to identify unfair biases. Organizations should also consider using a diverse team when implementing AI systems, which can provide additional perspectives that may help to spot areas of potential discrimination.

The ICO seeks input from industry stakeholders on two questions:

  • If your organisation is already applying measures to detect and prevent discrimination in AI, what measures are you using or have you considered using?
  • In some cases, if an organisation wishes to test the performance of their ML model on different protected groups, it may need access to test data containing labels for protected characteristics. In these cases, what are the best practices for balancing non-discrimination and privacy requirements?

The ICO also continues to seek input from industry on the development of an auditing framework for AI; organizations should contact the ICO if they wish to provide feedback.

UK Government’s Guide to Using AI in the Public Sector

On June 10, 2019, the UK Government’s Digital Service and the Office for Artificial Intelligence released guidance on using artificial intelligence in the public sector (the “Guidance”).  The Guidance offers practical advice to public sector organizations implementing artificial intelligence (AI) solutions.

The Guidance will be of interest to companies that provide AI solutions to UK public sector organizations, as it will influence what kinds of AI projects public sector organizations will be interested in pursuing, and the processes that they will go through to implement AI systems.  Because the UK’s National Health Service (NHS) is a public sector organization, this Guidance is also likely to be relevant to digital health service providers that are seeking to provide AI technologies to NHS organizations.

The Guidance consists of three sections: (1) understanding AI; (2) assessing, planning and managing AI; and (3) using AI ethically and safely, as summarized below.  The Guidance also links to summaries of examples where AI systems have been used in the public sector and elsewhere.


Privacy Shield Ombudsperson Confirmed by the Senate

On June 20, 2019, Keith Krach was confirmed by the U.S. Senate to become the Trump administration’s first permanent Privacy Shield Ombudsperson at the State Department.  The role of the Privacy Shield Ombudsperson is to act as an additional redress avenue for all data subjects whose data is transferred from the EU or Switzerland to the U.S. under the EU-U.S. and Swiss-U.S. Privacy Shield Frameworks, respectively.

As Ombudsperson, Krach will be responsible for dealing with complaints and requests from individuals in the EU and Switzerland, including in relation to U.S. national security access to data transmitted from the EU or Switzerland to the U.S.  The Ombudsperson works with other Government officials and independent oversight bodies to review and respond to requests.  Krach’s role as Ombudsperson forms part of his duties as the Under Secretary for Economic Growth, Energy and the Environment.  The Under Secretary is independent from the intelligence services and reports directly to the Secretary of State.

The formal approval of a permanent Privacy Shield Ombudsperson will be welcomed at EU level.  As we have previously reported, the European Data Protection Board praised the appointment of a permanent Ombudsperson in its January report regarding the second annual review of the Privacy Shield.  In addition, the Commission has emphasized that the Ombudsperson is “an important mechanism that ensures complaints concerning access to personal data by U.S. authorities are addressed.”  This appointment comes at a time when both the EU-U.S. Privacy Shield and the Standard Contractual Clauses are under scrutiny in the European courts.