UK Government Publishes Initial Consultation Response on the Online Harms White Paper

On February 12, 2020, the UK Home Office and Department for Digital, Culture, Media & Sport published the Government’s Initial Consultation Response (“Response”) to feedback received through a public consultation on its Online Harms White Paper (“OHWP”).  The OHWP, published in April 2019, proposed a comprehensive regulatory regime that would impose a “duty of care” on online services, requiring them to moderate a wide spectrum of harmful content and activity on their platforms, including child sexual abuse material, terrorist content, hate crimes, and harassment.

While the Response does not indicate when the Government expects to introduce proposed legislation, it provides clearer direction on a number of aspects of the proposed regulatory framework set out in the OHWP.

Seventh Circuit Adopts Narrow Interpretation of TCPA Autodialer Definition, Deepening Circuit Split

The Seventh Circuit has issued a unanimous decision in Gadelhak v. AT&T Services, adopting a narrow interpretation of a key definitional term in the Telephone Consumer Protection Act (TCPA).  This decision is in line with a recent ruling from the Eleventh Circuit (which we analyzed here) but departs from the Ninth Circuit’s approach—deepening a circuit split that increases the possibility the Supreme Court will review the issue.

European Commission Presents Strategies for Data and AI (Part 1 of 4)

On 19 February 2020, the European Commission presented its long-awaited strategies for data and AI.  These follow Commission President Ursula von der Leyen’s commitment upon taking office to put forward legislative proposals for a “coordinated European approach to the human and ethical implications of AI” within the new Commission’s first 100 days.  Although the papers published this week do not set out a comprehensive EU legal framework for AI, they do give a clear indication of the Commission’s key priorities and anticipated next steps.

The Commission strategies are set out in four separate papers—two on AI, and one each on Europe’s digital future and the data economy.  Read together, the papers make clear that the Commission seeks to position the EU as a digital leader, both in trustworthy AI and in the wider data economy.


California Introduces Bill to Regulate Automated Decision Systems

On February 14, 2020, California State Assembly Member Ed Chau introduced the Automated Decision Systems Accountability Act of 2020, which would require any business in California that provides a person with a program or device that uses an “automated decision system” (“ADS”) to establish processes to “continually test for biases during the development and usage of the ADS” and to conduct an impact assessment on that program or device.

ADS is defined broadly as “a computational process, including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that makes a decision or facilitates human decision making, that impacts persons.”  The required ADS impact assessments would study the various aspects of the ADS and its development process, “including, but not limited to, the design and training data of the ADS, for impacts on accuracy, fairness, bias, discrimination, privacy, and security.”  At minimum, the assessments must include “[a] detailed description of the ADS, its design, training provided on its use, its data, and its purpose” and “[a]n assessment of the relative benefits and costs of the ADS in light of its purpose,” with certain factors such as data minimization and risk mitigation required in the cost-benefit analysis.

The provider of the ADS also must determine whether the ADS “has a disproportionate adverse impact on a protected class,” examine whether it serves “reasonable objectives and furthers a legitimate interest,” and consider alternatives or reasonable modifications that could be incorporated “to limit adverse consequences on protected classes.”
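
The bill does not prescribe how the required bias testing or impact assessments must be carried out.  Purely as an illustration of what an automated bias check might look like in practice, the following minimal Python sketch computes group selection rates and flags any group falling below the conventional “four-fifths” threshold; the metric, the threshold, and the record format are assumptions for illustration, not requirements of the bill.

```python
# Illustrative only: a simple disparate impact check of ADS decisions.
# The four-fifths threshold and the record format are assumptions, not
# requirements of the Automated Decision Systems Accountability Act.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, approved) pairs, e.g. ("group_a", True)."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        if approved:
            approvals[group] += 1
    return {g: approvals[g] / totals[g] for g in totals}

def disparate_impact_flags(decisions, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` times the
    highest group's selection rate (the conventional four-fifths rule)."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: (rate / best) < threshold for g, rate in rates.items()}

if __name__ == "__main__":
    sample = ([("group_a", True)] * 80 + [("group_a", False)] * 20
              + [("group_b", True)] * 50 + [("group_b", False)] * 50)
    print(disparate_impact_flags(sample))
    # {'group_a': False, 'group_b': True} -- group_b falls below 80% of group_a's rate
```

Running such a check “continually,” as the bill contemplates, would simply mean re-computing these rates as new decisions accumulate during development and in production use.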

Sen. Kirsten Gillibrand Proposes New Data Protection Agency

On February 12, 2020, Senator Kirsten Gillibrand (D-NY) announced a plan to create a new Data Protection Agency through her proposed legislation, the Data Protection Act of 2020 (S.3300).

Under the proposal, the new agency would replace the Federal Trade Commission (FTC) as the “privacy cop on the beat.”  As such, the FTC’s current authority in the privacy space, including its ability to draft guidelines, conduct studies, and issue implementing regulations for certain federal privacy laws, would be transferred to the new agency.

Unlike the Online Privacy Act, a bill introduced by Representatives Anna Eshoo (D-CA-18) and Zoe Lofgren (D-CA-19) that also would create a new privacy agency, Sen. Gillibrand’s bill would not create a new omnibus federal privacy law.  Instead, it is focused on the creation of the Data Protection Agency and its rulemaking authority.  However, various aspects of the new agency’s authority provide valuable insights into what privacy regulation at the federal level might look like under the bill.

Centre for Data Ethics and Innovation Publishes Final Report on “Online Targeting”

On February 4, 2020, the United Kingdom’s Centre for Data Ethics and Innovation (“CDEI”) published its final report on “online targeting” (the “Report”), examining practices used to monitor a person’s online behaviour and subsequently customize their experience.  In October 2018, the UK government tasked the CDEI, an expert committee that advises the government on how to maximize the benefits of new technologies, with exploring how data is used to shape people’s online experiences.  The Report sets out the CDEI’s findings and recommendations.

French Supervisory Authority Publishes Guidance for Website and App Developers

On January 27, 2020, the French Supervisory Authority (“CNIL”) issued guidance for developers of websites and applications that sets out the main principles of the General Data Protection Regulation (“GDPR”), explains their application in the online environment, and gives practical tips to help developers respect users’ privacy when deploying websites and apps.

The guidance consists of 17 recommendations, each covering a key principle supported by additional advice and examples.  Below, we list all 17 of these recommendations and provide a brief summary of the CNIL’s advice related to each.


German Federal Commissioner for Data Protection and Freedom of Information Launches Public Consultation on Anonymization

On February 10, 2020, Germany’s Federal Commissioner for Data Protection and Freedom of Information (BfDI) launched its first public consultation procedure.  The consultation invites comments on a BfDI position paper that addresses the anonymization of personal data under the General Data Protection Regulation (GDPR), with a particular focus on the telecommunications sector (for example, the anonymization of location data in mobile networks).

The position paper points out that the processing of anonymized data is not regulated by the GDPR, although the GDPR does not make clear under what circumstances data can be considered fully “anonymous”.  Moreover, the steps necessary to anonymize personal data may constitute a form of “processing” that, in and of itself, requires a legal basis under the GDPR.  Hence, the public consultation addresses the following questions:

  • What are the requirements for personal data to be anonymized?
  • Does anonymization constitute processing of personal data that requires a legal basis?
  • If so, what legal basis can be used for anonymization efforts?

The draft position paper proposes the following answers:

  • For personal data to be anonymized, the link to a person must be removed in such a way that re-identification is practically impossible – i.e., the link to the individual could be restored only with a disproportionate expenditure of time, cost, and manpower. The controller remains responsible for continuously monitoring the validity of its anonymization efforts.
  • Anonymization, including through aggregation of data, is a form of processing of personal data that does indeed require a legal basis.
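
The position paper does not prescribe a particular anonymization technique.  Purely as an illustrative sketch of one common approach, aggregation with small-count suppression, the following Python example collapses individual location records into per-cell counts and drops groups too small to be safely released; the record format and the threshold of 10 are assumptions for illustration.

```python
# Illustrative only: aggregation-based anonymization of location records.
# Individual (cell_id, hour) observations are collapsed into counts, and
# small groups are suppressed so that no individual can practically be
# singled out. The record format and threshold are assumptions.
from collections import Counter

def aggregate_locations(records, min_count=10):
    """records: iterable of (cell_id, hour) tuples, one per device observation.
    Returns counts per (cell_id, hour), omitting groups below min_count."""
    counts = Counter((cell, hour) for cell, hour in records)
    return {key: n for key, n in counts.items() if n >= min_count}

# Example: only sufficiently large groups survive aggregation.
records = ([("cell_42", 9)] * 25 + [("cell_42", 10)] * 3 + [("cell_7", 9)] * 12)
print(aggregate_locations(records))
# {('cell_42', 9): 25, ('cell_7', 9): 12} -- the group of three is suppressed
```

Whether a given aggregation in fact makes re-identification “practically impossible” remains a case-by-case assessment under the standard proposed in the position paper.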

The paper also sets out a number of possible legal bases for such anonymization efforts, in particular:

  • Art. 6(4) GDPR (i.e., processing of personal data for a new purpose that is compatible with the original purpose for which they were collected) is one option. For example, it could be relied on where customer data collected under Art. 6(1)(b) GDPR (performance of an agreement), which do not include any “particularly sensitive” data, are anonymized for the purpose of optimizing services.
  • Under the German Telecommunications Act, location data can be used to provide value-added services (Dienste mit Zusatznutzen – i.e., location-based services) if the user consents to this, or if the data have been anonymized.
  • Anonymization could also be based on Art. 6(1)(c) GDPR (compliance with a legal obligation) in conjunction with Art. 17 GDPR (right to erasure), because the legal obligation to erase personal data can be met by anonymizing the data. This also applies to traffic data (Verkehrsdaten) collected pursuant to sec. 96(1), 2nd sentence, of the German Telecommunications Act.

Interested stakeholders may submit comments via email to konsultation@bfdi.bund.de until March 9, 2020.

Cyberspace Administration of China Releases Notice on the Protection of Personal Information in the Fight Against Coronavirus

In response to the recent outbreak of the novel coronavirus (“2019-nCoV”), a wide range of Chinese regulators, including many levels of local government (down to the neighborhood committee level) and local public security bureaus (“PSBs”), have been actively collecting personal information to monitor and potentially mitigate the spread of the outbreak.  For example, the Shenzhen PSB has issued a notice requiring residents of, and visitors to, Shenzhen to scan a QR code and fill in personal information, such as their contact details, addresses, travel information, and health status.  The Shanghai Municipal People’s Government also issued a similar notice requiring residents returning to Shanghai from out-of-town trips, as well as visitors, to report a similar set of personal information.

In practice, numerous other third-party entities, including airports, train stations, employers, and landlords, may also collect extensive personal information from travelers or visitors to a particular location or area because of their own reporting obligations.  For instance, visitors to office buildings may be obliged to report their health status to the landlord or building management.  Employers are also required to closely monitor the health status of employees if they apply to the local government to re-open their offices or factories.

With information collection for public health purposes now widespread, data breaches and misuse of data have become major public concerns.  For example, it has been reported that travelers from Wuhan to other cities within China have been victims of data breaches after submitting their personal information to transportation entities and local regulators.  A document entitled “List of Individuals Returning to Ningdu From Wuhan” was leaked to various WeChat groups in January 2020 and contained the personal information, including telephone numbers, national identification numbers, and home addresses, of approximately four to five hundred data subjects.  Similar incidents have occurred across China, and the sources of the leaks remain uncertain.

California AG Releases New Draft CCPA Regulations

The California Attorney General has released both clean and redlined versions of proposed modifications to the draft implementing regulations for the California Consumer Privacy Act (“CCPA”). Below is a high-level overview of some key changes:

  1. Service Providers. The modified draft prohibits a service provider from processing the personal information it receives from a business except in the following five circumstances: (1) performing the services specified in its contract with the business that provided the personal information; (2) engaging a different service provider as a subcontractor; (3) using the data internally to build or improve the quality of its services, to the extent that use does not include building or modifying household or consumer profiles or cleaning or augmenting data acquired from another source; (4) detecting data security incidents or protecting against fraudulent or illegal activity; or (5) processing in accordance with certain exemptions to the CCPA. The draft also eliminates the requirement that a service provider receiving a rights request directly from a consumer instruct the consumer to submit the request to the business; instead, service providers are permitted (but not required) to respond directly.
  2. Obligations around “selling” data. The modified draft fills a placeholder contained in the previous draft with an example “do not sell” button, showing the two forms in which the button may appear.


    The new draft also eliminates the controversial requirement (which was not in the statute) that a business pass through opt-out-of-sale requests to all parties to which the business sold a consumer’s personal information in the 90 days before the consumer exercised his or her right. However, the modified draft also contains a new requirement that businesses comply with a consumer’s opt-out request within 15 business days. Furthermore, if the business sells personal information to a third party after the consumer submitted his or her request, but before the business complied with it, the modified draft regulations require the business to notify that third party of the consumer’s exercise of the opt-out right and direct it not to sell the personal information. The modified draft regulations also allow businesses that do not collect information directly from consumers to include, in their registration under the state’s new data broker law, a link to a privacy policy with instructions on how to submit an opt-out request. Finally, the modified draft clarifies that any privacy control developed to submit opt-out-of-sale requests must clearly communicate the user’s intent to opt out of sales and shall not be designed with any pre-selected settings.  (Importantly, the draft regulations focus on user-enabled privacy settings that control the “sale” of personal information, which are, by definition, distinct from “do not track” settings that control the collection of personally identifiable information about an individual consumer’s online activities over time and across third-party websites or online services.)  A hypothetical sketch of how a business might honor such an opt-out signal, separately from a DNT setting, appears after this list.
  3. Access and Deletion Rights. The modified draft regulations make optional the previous iteration’s requirement that deletion requests be submitted through a two-step process. They also clarify additional circumstances in which a business need not search for information in response to a request to know. The new version incorporates the statutory amendment addressing when a toll-free telephone number is not required and explains that authorized agents must be registered to conduct business in California.
  4. Mobile. The modified draft regulations also explicitly address mobile technology. For example, if a business provides an application that collects personal information from a consumer’s device in an unexpected manner, the business must provide a just-in-time notice within the application.
  5. Household. The modified draft regulations also change the requirements around household information. They permit businesses to respond to access and deletion requests related to household information from non-account-holding consumers only if all consumers of the household jointly make the request, the business individually verifies each of them, and the business verifies that each person making the request is a current member of the household. If a child under the age of 13 is a member of the household, a business must obtain verifiable parental consent under the regulations before complying with such a request.
  6. Notice. The modified draft regulations specify that online notices must follow generally recognized industry standards to be accessible to consumers with disabilities. They also now more clearly emphasize that businesses have flexibility in the specific formatting of their notices. The new version requires more explicit notices in the employment context.
  7. Scope of Personal Information. The modified draft regulations explain that whether information is “personal information” depends on whether it is maintained in a manner in which it is reasonably capable of being associated with a particular consumer. The modified draft regulations then explicitly note that if a business collects an IP address but does not link the IP address to a consumer or household, then that IP address is not “personal information” under the statute.
  8. Minors. The modified draft regulations clarify that a business has to develop documented procedures to collect consent for the sale of minors’ personal information only if the business sells that personal information.
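
Returning to the user-enabled privacy controls discussed in item 2 above: the modified draft regulations do not define a technical signal for communicating an opt-out-of-sale preference. Purely as a hypothetical sketch, assuming such a preference arrives as a request header (the header name and data model below are invented for illustration), server-side handling that honors the signal while keeping it distinct from a “do not track” setting might look like the following.

```python
# Hypothetical sketch: honoring a user-enabled opt-out-of-sale signal while
# treating it separately from a "do not track" (DNT) setting.
# The header name "X-Opt-Out-Of-Sale" and the record format are invented
# for illustration; the draft regulations do not define a technical signal.

OPT_OUT_OF_SALE_HEADER = "X-Opt-Out-Of-Sale"  # hypothetical
DNT_HEADER = "DNT"

def apply_privacy_signals(headers: dict, consumer_record: dict) -> dict:
    """Update a consumer record based on signals in the request headers.

    An opt-out-of-sale signal excludes the consumer from "sale" data flows;
    a DNT signal only limits cross-site tracking and is handled separately.
    Absent a signal, nothing changes -- i.e., no pre-selected settings.
    """
    if headers.get(OPT_OUT_OF_SALE_HEADER) == "1":
        consumer_record["excluded_from_sale"] = True
    if headers.get(DNT_HEADER) == "1":
        consumer_record["limit_cross_site_tracking"] = True
    return consumer_record

# Example usage
record = {"consumer_id": "abc123"}
print(apply_privacy_signals({"X-Opt-Out-Of-Sale": "1"}, record))
# {'consumer_id': 'abc123', 'excluded_from_sale': True}
```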