On August 9, 2019, the U.S. Department of Commerce’s National Institute of Standards and Technology (“NIST”) submitted its plan for federal engagement in the development of artificial intelligence standards. The plan was developed in response to the Executive Order signed by President Trump earlier this year, which required NIST to “issue a plan for Federal engagement in the development of technical standards and related tools in support of reliable, robust, and trustworthy systems that use AI technologies.” The final plan incorporates comments from over 40 organizations that commented on a draft released in July.
On July 24, 2019, the European Parliament published a study entitled “Blockchain and the General Data Protection Regulation: Can distributed ledgers be squared with European data protection law?” The study explores the tension between blockchain technology and compliance with the General Data Protection Regulation (the “GDPR”), the EU’s data protection law. The study also explores how blockchain technology can be used as a tool to assist with GDPR compliance. Finally, it recommends the adoption of certain policies to address the tension between blockchain and the GDPR, to ensure that “innovation is not stifled and remains responsible”. This blog post highlights some of the key findings in the study and provides a summary of the recommended policy options.
At the Black Hat conference in Las Vegas last week, a security researcher presented his research on using access rights available under the GDPR for identity theft purposes (slides available here; whitepaper available here). Specifically, the researcher “attempted to steal as much information as possible” about his fiancée by submitting GDPR access requests in her name to more than 150 companies based in the U.S. and UK. The researcher reported that 24 percent of the companies surveyed ultimately provided personal information in response to the bogus requests.
While the researcher’s study focused on the GDPR, the results are indicative of concerns applicable more broadly to other privacy laws that grant access rights to individuals, including the forthcoming California Consumer Privacy Act (“CCPA”). This could be particularly problematic in a CCPA context given that the statute defines personal information to include information associated with a consumer’s “household.”
The whitepaper associated with the researcher’s study suggests a number of potential steps that various stakeholders could take to remediate the risk of unauthorized disclosure of personal information in response to access requests. For instance, the whitepaper suggests that legislators and regulators could reduce these risks by “assuring businesses that rejecting a suspicious right of access request in good faith will not later result in prosecution if it turns out that the request originated from a legitimate but suspiciously-behaving data subject.”
In a previous post, this blog reported on German guidance on the scope of the right of access under Art. 15 of the GDPR, and in particular on the right to receive a copy. The Supervisory Authority of the German state of Hesse stated that the term “copy” in Art. 15 of the GDPR should not be understood literally, but rather in the sense of a “summary”.
This somewhat relaxed interpretation appears to conflict with an earlier decision of the Labor Appeals Court of Stuttgart which ordered an employer to provide actual copies of all information held by the company regarding an employee’s performance and behavior to that employee.
More recently, the Appeal Court of Cologne held that a customer of an insurance company is entitled to access all personal data pertaining to him and processed by the company, including any internal notes regarding conversations between company employees and the customer. The company argued that compiling the information was impracticable given the large volume of customer data it processes. The court was unimpressed, stating that the company was obliged to adapt its IT systems to the requirements of the GDPR. The court did not explicitly rule on the customer’s right to also receive a copy of his personal data.
These first court decisions on Art. 15 of the GDPR confirm that the right of access is becoming a powerful tool in litigation. Germany’s code of civil procedure does not provide for a general right to discovery. The right of access could make up for this and significantly affect outcomes in civil and labor law cases.
On July 25, New York Governor Andrew Cuomo signed two data security and breach notification bills into law. The first bill, the “Stop Hacks and Improve Electronic Data Security Act” or “SHIELD Act,” will impose specific data security requirements on businesses that own or license private information of New York residents, in addition to amending New York’s data breach notification statute to broaden the circumstances under which notification may be required. The second bill, meanwhile, will require consumer reporting agencies to offer identity theft prevention and mitigation services. Both bills are described in further detail below.
On July 25, 2019, the UK’s Information Commissioner’s Office (“ICO”) published a blog on the trade-offs between different data protection principles when using Artificial Intelligence (“AI”). The ICO recognizes that AI systems must comply with several data protection principles and requirements, which at times may pull organizations in different directions. The blog identifies notable trade-offs that may arise, provides some practical tips for resolving these trade-offs, and offers worked examples on visualizing and mathematically minimizing trade-offs.
The ICO invites organizations with experience of considering these complex issues to provide their views. This recent blog post on trade-offs is part of the ICO’s ongoing Call for Input on developing a new framework for auditing AI. See also our earlier blog on the ICO’s call for input on bias and discrimination in AI systems here.
On July 16, 2019, the UK’s Information Commissioner’s Office (“ICO”) released a new draft Data sharing code of practice (“draft Code”), which provides practical guidance for organizations on how to share personal data in a manner that complies with data protection laws. The draft Code focuses on the sharing of personal data between controllers, with a section referring to other ICO guidance on engaging processors. The draft Code reiterates a number of legal requirements from the GDPR and the UK Data Protection Act 2018, while also including good practice recommendations to encourage compliance. The draft Code is currently open for public consultation until September 9, 2019, and once finalized, it will replace the existing Data sharing code of practice (“existing Code”).
On July 29, 2019, the Court of Justice of the European Union (“CJEU”) handed down its judgment in the Fashion ID case (Case C-40/17). The CJEU found that when a website operator embeds Facebook’s “Like” button on its website, Facebook and the website operator become joint controllers. The case clarifies the relationship between website operators and social networking sites whose plug-ins are embedded into websites for user tracking and online marketing purposes. The ruling is expected to influence the contractual terms that companies will need to have in place when embedding such social plug-ins into their websites, and may also have ramifications for adtech practices more generally.
The Fashion ID case arose out of a 2015 complaint made by a German consumer protection association, Verbraucherzentrale NRW, against an online clothes retailer, Fashion ID, which embedded Facebook’s “Like” button on its website. Facebook’s “Like” button is a social plug-in that allows website users to click the “Like” button to show on their Facebook profile that they “like” a certain product or service. Websites use this plug-in to optimize their advertising on Facebook so that targeted ads can be shown to people who “like” their products.
Websites with the “Like” button collect information (e.g., IP addresses and browser string data) not only about the people who click the “Like” button, but also about other website users who do not click it, including those who do not have a Facebook account. This data is then transferred to Facebook.
The complaint filed by Verbraucherzentrale NRW alleged that Fashion ID’s use of the Facebook “Like” button breached EU data protection law because Fashion ID failed to appropriately inform users and obtain their consent to transfer personal data to Facebook. The complainant sought a court injunction ordering Fashion ID to stop using the plug-in.
The Oberlandesgericht Düsseldorf (Higher Regional Court, Düsseldorf, Germany) referred the matter to the CJEU, asking a number of questions seeking clarification of several provisions of the Data Protection Directive 95/46/EC (which continue to have relevance under the EU’s General Data Protection Regulation), most notably:
- Can Member State laws implementing the Data Protection Directive allow consumer protection organisations to lodge data protection claims on behalf of affected individuals?
The CJEU decided that the provisions of the Data Protection Directive on “judicial remedies, liability and sanctions” give Member States the freedom to determine the “appropriate means” to ensure their application, which could extend to allowing consumer protection organizations to act on behalf of individuals whose data privacy rights have been infringed. The CJEU also noted that this redress mechanism is now explicitly provided for under Art. 80 of the GDPR.
- Is the website operator (i.e., Fashion ID) a “joint controller” in relation to the data that Facebook collects about its users?
Significantly, the CJEU decided that Fashion ID and Facebook are “joint controllers” in relation to Facebook’s collection and sharing of personal data. According to the CJEU, by embedding the plug-in on its website, Fashion ID is “influencing” the collection and sharing of data and is “at least tacitly” consenting to it. The CJEU decided that Fashion ID’s responsibility is most apparent in situations where users do not have an account with Facebook, but their data is nonetheless shared with Facebook as a result of accessing Fashion ID’s website. The CJEU also determined that Fashion ID’s lack of access to the data is irrelevant when assessing “joint controllership” (consistent with earlier CJEU cases C-210/16 and C-25/17).
However, the CJEU clarified that although the term “controller” should be given a broad interpretation, an organization cannot be held responsible for upstream or downstream processing operations in the processing chain for which it does not determine the purpose or the means of processing. In this regard, the CJEU held that Facebook (not Fashion ID) is the controller for the processing that takes place after the personal data related to the “Like” plug-in has been transferred to Facebook.
- Can Fashion ID and Facebook rely on their legitimate interests to collect and share personal data?
The CJEU did not give a clear answer to this question, but merely stated that both Fashion ID and Facebook would need to establish a legitimate interest, if they were intending to rely on this legal basis.
- Who has responsibility to (i) provide notice to users about how the data is collected and used and (ii) to collect consent from the users?
The CJEU decided that it is the website operator’s responsibility to provide notice to users and to obtain their consent. However, the website operator only needs to inform users and obtain their consent for processing operations for which it is a “joint controller”.
This ruling mirrors the court’s findings in the Wirtschaftsakademie case (Case C-210/16), where the CJEU found that Wirtschaftsakademie, which offers educational services through a fan page hosted on Facebook, was a joint controller with Facebook for the processing of website usage data collected through the “Facebook Insights” tool. The CJEU’s reasoning in both cases provides useful guidance on how the court identifies “controllers” and “joint controllers” in data sharing relationships. The CJEU’s findings suggest that companies using third-party tools (e.g., cookies, plug-ins and other website analytics tools) to increase their online visibility may need to ramp up their disclosures to website users and strengthen the contractual terms they have in place with their advertising partners.
On July 22, 2019, the Italian supervisory authority for data protection (“Garante”) issued a decision involving the so-called “right to be forgotten”. The Garante’s decision explores the boundaries of this right in a case in which Internet users could access an article by using a professional position as a search term, whereas it was not possible to access the article merely by using the individual’s name as a search term.
More specifically, the case before the Garante involved a professional, namely the president of a cooperative, who requested that Google remove a link to online content about him accessible by Internet users. The content was accessible not by entering the individual’s name as a search term, but rather by entering his position as president of the cooperative (a cooperative being an association that serves its members’ social or economic needs or other general interests).
On July 24, 2019, the European Commission (“the Commission”) published a report appraising Europe’s progress in implementing the General Data Protection Regulation (“GDPR”) as a central component of its revamped data protection framework. In its report, the Commission highlights certain achievements resulting from implementation efforts, calls attention to issues that require further action, and describes several ongoing and planned initiatives. The report is a follow-up to a prior report issued in January 2018, and was informed to a great extent by the ongoing work of the Multi-stakeholder Group, composed of civil society and business representatives, academics and practitioners, which supports the application of the GDPR. The report will contribute to the Commission’s formal two-year review of the GDPR, to take place in May 2020.