Centre for Data Ethics and Innovation Publishes Final Report on “Online Targeting”

On February 4, 2020, the United Kingdom’s Centre for Data Ethics and Innovation (“CDEI”) published its final report on “online targeting” (the “Report”), examining practices used to monitor a person’s online behavior and subsequently customize their experience.  In October 2018, the UK government appointed the CDEI, an expert committee that advises the UK government on how to maximize the benefits of new technologies, to explore how data is used in shaping people’s online experiences.  The Report sets out the CDEI’s findings and recommendations.

French Supervisory Authority Publishes Guidance for Website and App Developers

On January 27, 2020, the French Supervisory Authority (“CNIL”) issued guidance for developers of websites and applications that sets out the main principles of the General Data Protection Regulation (“GDPR”), expounds on their application in the online environment, and gives practical tips to help developers respect users’ privacy when deploying websites and apps.

The guidance consists of 17 recommendations, each covering a key principle supported by additional advice and examples.  Below, we list all 17 of these recommendations and provide a brief summary of the CNIL’s advice related to each.


German Federal Commissioner for Data Protection and Freedom of Information Launches Public Consultation on Anonymization

On February 10, 2020, Germany’s Federal Commissioner for Data Protection and Freedom of Information (BfDI) launched its first public consultation procedure.  The consultation invites comments on a BfDI position paper addressing the anonymization of personal data under the General Data Protection Regulation (GDPR), with a particular focus on the telecommunications sector (for example, the anonymization of location data in mobile networks).

The position paper points out that the processing of anonymized data is not regulated by the GDPR, although the GDPR does not make clear under what circumstances data can be considered fully “anonymous”.  Moreover, the steps necessary to anonymize personal data may themselves constitute a form of “processing” that requires a legal basis under the GDPR.  Accordingly, the public consultation addresses the following questions:

  • What are the requirements for personal data to be anonymized?
  • Does anonymization constitute processing of personal data that requires a legal basis?
  • If so, what legal basis can be used for anonymization efforts?

The draft position paper proposes the following answers:

  • For personal data to be anonymized, the link to a person must be removed in such a way that re-identification is practically impossible – i.e., the link to the individual can only be restored with a disproportionate expenditure of time, cost, and manpower. The controller remains responsible for continuously monitoring the validity of the anonymization efforts.
  • Anonymization, including through aggregation of data, is a form of processing of personal data that does indeed require a legal basis. (A minimal sketch of aggregation-based anonymization appears below.)
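
The position paper does not prescribe any particular technique, but the “disproportionate effort” criterion is commonly approached by aggregating data and suppressing small groups. The following is a minimal Python sketch of that idea for hypothetical mobile-network location records; the field names, bucketing, and the k threshold are illustrative assumptions, not values taken from the BfDI paper:

```python
K_THRESHOLD = 10  # assumed minimum number of distinct individuals per published group


def anonymize_location_counts(records, k=K_THRESHOLD):
    """records: iterable of (user_id, cell_area, hour) tuples.

    Returns aggregate counts per (cell_area, hour) group, dropping any
    group with fewer than k distinct users, since small groups are the
    typical re-identification risk.
    """
    users_per_group = {}
    for user_id, cell_area, hour in records:
        users_per_group.setdefault((cell_area, hour), set()).add(user_id)

    return {
        group: len(users)
        for group, users in users_per_group.items()
        if len(users) >= k  # suppress groups small enough to enable re-identification
    }


if __name__ == "__main__":
    sample = [("u1", "cell-42", 9), ("u2", "cell-42", 9), ("u3", "cell-7", 9)]
    # With k=2, only the (cell-42, 9) group survives: {('cell-42', 9): 2}
    print(anonymize_location_counts(sample, k=2))
```

Consistent with the point above, whether such aggregation in fact makes re-identification “practically impossible” is something the controller would need to assess, and re-assess, on an ongoing basis.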

The paper also sets out a number of possible legal bases for such anonymization efforts, in particular:

  • Art. 6(4) GDPR (i.e., processing of personal data for a new purpose that is compatible with the original purpose for which they were collected) is one option. For example, it could be relied on where customer data that were collected under Art. 6(1)(b) GDPR (performance of a contract), and that do not include any “particularly sensitive” data, are anonymized for the purpose of optimizing services.
  • Under the German Telecommunications Act, location data can be used to provide value-added services (Dienste mit Zusatznutzen – i.e., location-based services) if the user consents to this, or if the data have been anonymized.
  • Anonymization could also be based on Art. 6(1)(c) GDPR (compliance with a legal obligation) in conjunction with Art. 17 GDPR (right to erasure), because the legal obligation to erase personal data can be met by anonymizing the data. This also applies to traffic data (Verkehrsdaten) collected pursuant to sec. 96(1), 2nd sentence, of the German Telecommunications Act.

Interested stakeholders may submit comments via email to konsultation@bfdi.bund.de until March 9, 2020.

Cyberspace Administration of China Releases Notice on the Protection of Personal Information in the Fight Against Coronavirus

In response to the recent outbreak of the novel coronavirus (“2019-nCoV”), a wide range of Chinese regulators, including many levels of local government (down to the neighborhood committee level) and local public security bureaus (“PSBs”), have been actively collecting personal information to monitor and potentially mitigate the spread of the outbreak.  For example, the Shenzhen PSB has issued a notice requiring residents of, and visitors to, Shenzhen to scan a QR code and fill in personal information, such as their contact details, addresses, travel information, and health status.  The Shanghai Municipal People’s Government issued a similar notice requiring residents returning to Shanghai from out-of-town trips, as well as visitors, to report a similar set of personal information.

In practice, numerous additional third-party entities, including airports, train stations, employers, and landlords, may also collect extensive personal information from travelers or visitors to a particular location or area because of their own reporting obligations.  For instance, visitors to office buildings may be obliged to report their health status to the landlord or building management.  Employers, too, are required to closely monitor the health status of their employees if they apply to the local government to re-open their offices or factories.

With the widespread collection of information for public health purposes, data breaches and the misuse of data have become a major public concern.  For example, it has been reported that travelers from Wuhan to other cities within China have been victims of data breaches after submitting their personal information to transportation entities and local regulators.  A document entitled “List of Individuals Returning to Ningdu From Wuhan” was leaked to various WeChat groups in January 2020 and contained the personal information, including telephone numbers, national identification numbers, and home addresses, of approximately four to five hundred data subjects.  Similar incidents have occurred across China, and the sources of the leaks remain uncertain.

California AG Releases New Draft CCPA Regulations

The California Attorney General has released both clean and redlined versions of proposed modifications to the draft implementing regulations for the California Consumer Privacy Act (“CCPA”). Below is a high-level overview of some key changes:

  1. Service Providers. The modified draft restricts a service provider from processing the personal information it receives from a business except in the following five circumstances: (1) performing the services specified in the contract with the business that provided the personal information; (2) engaging a different service provider as a subcontractor; (3) using the data internally to build or improve the quality of its services (to the extent that use does not include building or modifying household or consumer profiles, or cleaning or augmenting data acquired from another source); (4) detecting data security incidents or protecting against fraudulent or illegal activity; or (5) processing in accordance with certain exemptions to the CCPA. The draft also eliminates the requirement that service providers that receive requests to exercise rights directly from consumers instruct those consumers to submit their requests to the business, instead permitting (without requiring) service providers to respond directly.
  2. Obligations around “selling” data. The modified draft fills a placeholder contained in the last draft with an example “do not sell” button (the new draft includes an image of the two forms in which the button may appear).

    The new draft also eliminates the controversial requirement (which was not in the statute) for a business to pass through opt-out-of-sale requests to all parties to which the business sold a consumer’s personal information in the 90 days before the consumer exercised his or her right. However, the modified draft contains a new requirement that businesses comply with a consumer’s opt-out request within 15 business days. Furthermore, if the business sells personal information to a third party after the consumer submitted his or her request, but before the business complied with it, the modified draft regulations require the business to notify those third parties of the consumer’s exercise of the opt-out right and direct them not to sell the personal information. The modified draft regulations also allow businesses that do not collect information directly from consumers to include, in their registration under the state’s new data broker law, a link to a privacy policy with instructions on how to submit an opt-out request. Finally, the modified draft clarifies that any privacy control developed to submit opt-out-of-sale requests must clearly communicate the user’s intent to opt out of sales and must not be designed with any pre-selected settings.  (Importantly, the draft regulations focus on user-enabled privacy settings that control the “sale” of personal information, which are, by definition, distinct from “do not track” settings that control the collection of personally identifiable information about an individual consumer’s online activities over time and across third-party websites or online services.)
  3. Access and Deletion Rights. The modified draft regulations make permissive the previous iteration’s requirement that deletion requests be submitted through a two-step process. They also clarify additional circumstances in which a business need not search for information in response to a request to know. The new version incorporates the statutory amendment specifying when a toll-free telephone number is not necessary and explains that authorized agents must be registered to conduct business in California.
  4. Mobile. The modified draft regulations also explicitly address mobile technology. For example, if a business provides an application that collects personal information from a consumer’s device in an unexpected manner, the application has to provide just-in-time notice.
  5. Household. The modified draft regulations also change the requirements around household information. They permit businesses to respond to access and deletion requests related to household information from non-account-holding consumers only if all consumers of the household jointly request access, the business individually verifies each of them, and the business verifies that each member making the request is a current member of the household. If there is a child younger than 13 in the household, a business must obtain verifiable parental consent under the regulations before complying with a request.
  6. Notice. The modified draft regulations specify that online notices must follow generally recognized industry standards to be accessible to consumers with disabilities. They also now more clearly emphasize that businesses have flexibility in the specific formatting of their notices. The new version requires more explicit notices in the employment context.
  7. Scope of Personal Information. The modified draft regulations explain that whether information is “personal information” depends on whether it is maintained in a manner in which it is reasonably capable of being associated with a particular consumer. The modified draft regulations then explicitly note that if a business collects an IP address but does not link the IP address to a consumer or household, then that IP address is not “personal information” under the statute. (A minimal sketch of one way such linking can be avoided appears after this list.)
  8. Minors. The modified draft regulations clarify that a business has to develop documented procedures to collect consent for the sale of minors’ personal information only if the business sells that personal information.
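
For illustration only – the regulations do not prescribe or describe any technique – one common engineering approach to keeping IP addresses from being maintained in a form “reasonably capable of being associated” with a particular consumer or household is to truncate them before storage. Below is a minimal Python sketch; the chosen prefix lengths are assumptions, and whether truncated data falls outside “personal information” remains a legal judgment:

```python
import ipaddress


def truncate_ip(ip_str):
    """Coarsen an IP address before storage so it is not retained in a
    form linkable to a particular consumer or household.

    Illustrative sketch only: zeroes the host portion by keeping a /24
    prefix for IPv4 (or a /48 prefix for IPv6). The resulting address is
    shared by many devices rather than identifying one.
    """
    ip = ipaddress.ip_address(ip_str)
    prefix = 24 if ip.version == 4 else 48
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)


# Example usage:
assert truncate_ip("203.0.113.77") == "203.0.113.0"
print(truncate_ip("2001:db8:85a3::8a2e:370:7334"))  # -> 2001:db8:85a3::
```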

Kids’ Privacy Bill Allowing for Private Suits Introduced in House

On January 30, Rep. Kathy Castor (D-FL) introduced the Protecting the Information of our Vulnerable Children and Youth (“PRIVCY”) Act, a bill that would significantly overhaul the Children’s Online Privacy Protection Act (“COPPA”).

Currently, COPPA applies only to personal information collected from children under 13 years old.  The PRIVCY Act would greatly expand COPPA’s scope by making any personal information – including biometric, geolocation, and inferred information, whether collected from the child or not – subject to the law’s requirements.  It would also bring a new group of “young consumers” – individuals aged 12 to 18 – under the law’s umbrella.  The PRIVCY Act would obligate online sites and services that have actual or constructive knowledge that they “process” personal information about children or young consumers to provide notice to, and obtain consent from, those children’s parents or those young consumers themselves.  The bill also provides for rights to access, correction, and deletion of children’s and young consumers’ personal information, and it imposes limits on operators’ ability to disclose personal information to third parties.

Additionally, the bill would completely repeal COPPA’s safe harbor provision, which enables covered operators to rely on a safe harbor if their privacy practices have been certified by FTC-approved organizations.  Currently, seven safe harbor organizations have been approved by the FTC.

CCPA Cited for the First Time in Litigation

In a complaint filed on Monday involving an alleged data breach, Barnes v. Hanna Andersson, the California Consumer Privacy Act (CCPA) – the state’s comprehensive privacy law that went into effect on January 1, 2020 – was cited for what appears to be the first time in a lawsuit.  Importantly, however, the plaintiff in this case has not asserted a claim under the CCPA or alleged a violation of the CCPA as a predicate for a claim under the California Unfair Competition Law (UCL).  This may be because, according to the complaint, the alleged data breach occurred between September and November 2019, before the CCPA went into effect.

DoD Announces the Release of CMMC Version 1.0

Last Friday, the Department of Defense announced the release of Version 1.0 of its Cybersecurity Maturity Model Certification (“CMMC”), which sets forth the cybersecurity requirements that contractors and suppliers must meet to participate in the Department’s supply chain.  A new post on Covington’s Inside Government Contracts blog discusses the release of Version 1.0 of the CMMC and accompanying remarks from Department of Defense officials.

Germany Publishes Draft Regulation on the Reimbursement of Digital Health Applications

Germany recently enacted a law that enables statutory health insurance schemes to reimburse costs related to the use of digital health applications (“health apps”), but the law requires the Federal Ministry of Health to first develop the reimbursement process for such apps.  Accordingly, on January 15, 2020, the German government published a draft regulation setting out the procedure for examining the eligibility of health apps for insurance reimbursement, as well as the requirements that such health apps must fulfill.

Notably, the draft regulation and its Annex 1 include a number of data protection and data security requirements that health app developers must comply with if their health apps are to benefit from the reimbursement scheme.

According to the draft regulation, developers must:

  • implement appropriate data protection and security measures, taking into account the state of the art, the categories of personal data processed and the risk level;
  • carry out a Data Protection Impact Assessment;
  • obtain the explicit consent of the patient to process their health data (Art. 9(2)(a) GDPR);
  • not disclose data outside the European Economic Area to countries that do not provide an adequate level of protection of personal data pursuant to an adequacy decision of the European Commission (transfers on the basis of standard contractual clauses or BCRs are apparently not allowed);
  • impose an obligation of confidentiality on all persons under their authority who have access to the personal data of the user; and
  • ensure the portability of the personal data.

The patient’s data may be used by the developer of the health app only:

  • for the intended use of the health app and for the reimbursement procedure;
  • to prove the benefit of the application (in the framework of specific procedures regulated under Book V of the Social Security Code);
  • to comply with legal obligations imposed by the EU Medical Devices Regulation 2017/745 and the German Medical Devices Implementation Act; and
  • to ensure, on an ongoing basis, the technical functionality and user-friendliness of the health app.

The health app must be free of advertising, and the patient’s data must not be used for any advertising purposes whatsoever.

Developers must fill out a detailed checklist (Annex 1 of the draft regulation) explaining how they comply with the above requirements when applying for registration with the Federal Institute for Drugs and Medical Devices (BfArM).

Updates to the draft regulation and the procedure to register a health app for reimbursement will be published on a dedicated page of the BfArM’s website.

Eleventh Circuit Rejects Expansive Interpretation of TCPA Autodialer Definition, Creating Split with Ninth Circuit

The Eleventh Circuit has issued a decision in Glasser v. Hilton Grand Vacations Company that rejects an expansive interpretation of a key definitional term in the Telephone Consumer Protection Act (TCPA)—an interpretation that has been embraced by the Ninth Circuit.  The decision therefore creates a circuit split that could increase the possibility the Supreme Court will review the issue, which has spawned numerous TCPA lawsuits in recent years.

The TCPA imposes consent requirements on phone calls and text messages that are sent using an automatic telephone dialing system (ATDS).  Over a partial dissent, the Eleventh Circuit held that dialing equipment falls within this definition only if it uses randomly or sequentially generated numbers and does not require human intervention.  This holding differs from the Ninth Circuit’s holding in Marks v. Crunch San Diego, LLC, 904 F.3d 1041 (9th Cir. 2018), which embraced a more expansive view of the ATDS definition.

The TCPA defines an ATDS as “equipment which has the capacity—(A) to store or produce telephone numbers to be called, using a random or sequential number generator; and (B) to dial such numbers.”  47 U.S.C. § 227(a)(1).  The statute (with a few exceptions not relevant here) prohibits using an ATDS to place calls or text messages to mobile numbers without the recipient’s prior express consent.

In Glasser, two plaintiffs alleged that they each received more than a dozen unsolicited phone calls to their cell phones over the course of a year from a timeshare marketer and a loan servicer, respectively.  The defendants admitted to placing the calls but argued that they did not violate the TCPA because they had not used an ATDS.  The district court agreed with the timeshare marketer that its dialing equipment was not an ATDS because it required human intervention.  But the lower court found that the loan servicer’s dialing equipment did qualify as an ATDS because it could automatically dial numbers from a stored list.  An appeal to the Eleventh Circuit followed.

The Eleventh Circuit noted that the dispute boiled down to the following question: does the ATDS definition include dialing equipment that can store telephone numbers and dial them even if a random or sequential number generator is not used (e.g., because the numbers come from a targeted list)?
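
For concreteness, the hypothetical sketch below contrasts the two designs the court was distinguishing: equipment that generates the numbers it dials randomly or sequentially, and equipment that dials from a stored, targeted list. Under Glasser, only the first category qualifies as an ATDS; under Marks, the second could as well. All names and numbers here are invented for illustration:

```python
import random


def sequential_numbers(start, count):
    """Produce numbers with a sequential number generator -- the kind of
    equipment the Eleventh Circuit held the ATDS definition covers."""
    return [str(start + i) for i in range(count)]


def random_numbers(count, area_code="813"):
    """Produce numbers with a random number generator (same category)."""
    return [area_code + f"{random.randrange(10**7):07d}" for _ in range(count)]


def stored_list_numbers(customer_db):
    """Dial from a stored, targeted list (e.g., existing customers).
    Under Glasser this equipment is NOT an ATDS; under the Ninth
    Circuit's Marks decision it could be."""
    return [record["phone"] for record in customer_db]


customers = [{"name": "A. Smith", "phone": "8135550100"}]
print(sequential_numbers(8135550100, 3))  # consecutive numbers
print(random_numbers(2))                  # randomly generated numbers
print(stored_list_numbers(customers))     # targeted list, no generator
```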

In a 2-1 decision, the court held that an ATDS must use a random or sequential number generator to store or dial numbers.  It determined that this interpretation was more faithful to the statutory text and Congress’s purpose in enacting the TCPA.  In reaching this conclusion, the court provided a cogent recounting of the FCC’s progressive expansion of the scope of the ATDS definition, an expansion rejected in ACA International v. FCC, 885 F.3d 687 (D.C. Cir. 2018).  The court noted that the ACA International decision (in which Covington was involved) had “wiped the slate clean.”  The opinion describes how, at the time Congress enacted the TCPA, the law’s aim appeared to be to limit the activities of marketers who used random or sequential dialers, a practice that was viewed as a particular nuisance.  Over time, marketers shifted their practices to use autodialers that dial lists of targeted – rather than random – numbers.  The FCC responded by expanding its interpretation of the ATDS definition.  But the Eleventh Circuit found that this expansive interpretation exceeds both the text and the legislative history of the TCPA.

The court acknowledged the conflict between its decision and the Ninth Circuit’s expansive interpretation in Marks, which the majority stated “looks more like ‘surgery.’”  One judge on the Glasser panel dissented from this portion of the opinion, agreeing with the Ninth Circuit’s view.  This disagreement among the Courts of Appeals over the proper interpretation of a federal statute increases the odds that the Supreme Court will be asked in the foreseeable future to resolve what constitutes an ATDS.

Separately, the Eleventh Circuit also held that dialing equipment that requires human intervention does not fall within the ATDS definition.  Specifically, the court found that the defendant’s dialing equipment did not “automatically” dial numbers because human employees developed the parameters regarding whom to contact and had to click a “make call” button in order to place a call.  The court found that these facts distinguished this case from Marks.  Notably, this part of the decision was unanimous, perhaps suggesting a growing judicial acceptance of this view, which multiple district courts also have adopted.
