Last week, Senators Amy Klobuchar (D-MN) and Lisa Murkowski (R-AK) introduced the Protecting Personal Health Data Act (S. 1842), which would direct the Department of Health and Human Services (“HHS”) to establish new privacy and security rules for technologies that collect personal health data, such as wearable fitness trackers, social-media sites focused on health data or conditions, and direct-to-consumer genetic testing services, among other technologies. Specifically, the legislation would direct the HHS Secretary to issue regulations on the privacy and security of health-related consumer devices, services, applications, and software. These regulations would also cover a new category of personal health data that is not otherwise protected health information under HIPAA.
Today, Susan Cassidy, Ashden Fein, Moriah Daugherty, and Melinda Lewis posted an article on Inside Government Contracts about the June 19, 2019 announcement by the National Institute of Standards and Technology (“NIST”) of the long-awaited update to Special Publication (“SP”) 800-171 Rev. 1, Protecting Controlled Unclassified Information in Nonfederal Systems and Organizations.
The update includes three separate but related documents: SP 800-171 Rev. 2, which makes minor editorial updates to SP 800-171 Rev. 1; SP 800-171B, Protecting Controlled Unclassified Information in Nonfederal Systems and Organizations: Enhanced Security Requirements for Critical Programs and High Value Assets, which contains recommended enhanced security requirements designed to protect against advanced persistent threats (“APTs”); and the Request for Comments on Draft NIST Special Publication (SP) 800-171B, Protecting Controlled Unclassified Information in Nonfederal Systems and Organizations – Enhanced Security Requirements for Critical Programs and High Value Assets, which includes a DoD cost estimate for complying with the requirements of SP 800-171B.
The article can be read here.
On May 27, 2019, the Thai government published the Personal Data Protection Act B.E. 2562 (2019) (the “PDPA”) in its official gazette. The law is now in effect, and companies have a one-year period, until May 27, 2020, to bring their practices into compliance.
Notably, the PDPA adopts a broad definition of “personal data” (essentially, any information that directly or indirectly identifies an individual) and an extraterritorial scope that extends its obligations to organizations outside of Thailand that either (i) offer products and services to individuals in Thailand, or (ii) monitor the behavior of individuals in Thailand. The PDPA also adopts the concepts of “controller” and “processor” consistent with various other privacy regimes.
The PDPA requires, among other things, organizations to:
- have a legal basis to collect and use personal information (in some cases requiring consent);
- respect heightened requirements for sensitive personal data;
- implement appropriate security measures and notify data breaches; and
- facilitate the exercise of rights of individuals relating to their personal data.
Organizations that meet certain criteria may also be required to appoint a data protection officer (“DPO”) and/or a local representative in Thailand.
The PDPA establishes the Personal Data Protection Committee (“PDPC”), which will enforce the law and publish guidance to help organizations ensure compliant practices. Violations of the PDPA may result in administrative fines, civil damages (including punitive damages), and the possibility for criminal prosecution.
Additional legislation will be published in the near future to further specify certain requirements of the PDPA, as well as to align national legislation appropriately.
On 30 May 2019, the United Kingdom’s ICO released a report, “GDPR: One Year On” (the “Report”), discussing the GDPR’s impact and the lessons learned in the year since its implementation. The Report provides valuable insight into the ICO’s enforcement practices, EU-wide cooperation, support functions, innovative practices, and further growth plans, and will likely prove useful in mapping out the direction the ICO will take during the coming year and beyond.
As policymakers weigh the implications of artificial intelligence (“AI”) and the Internet of Things (“IoT”), members of Congress have introduced a handful of measures focusing on Government support for and adoption of these emerging technologies.
In May, Senators Deb Fischer (R-NE), Brian Schatz (D-HI), Cory Gardner (R-CO), and Cory Booker (D-NJ) reintroduced the Developing and Growing the Internet of Things (“DIGIT”) Act. An earlier version of the legislation passed the Senate last year, but stalled in the House.
As reintroduced, the DIGIT Act would convene a working group of federal entities that would consult with private sector stakeholders to provide Congress with recommendations to encourage the growth of Internet of Things (“IoT”) technologies. Specifically, and among other measures, the bill would require the working group to:
- identify governmental activities that inhibit or could inhibit the growth of IoT;
- consider policies or programs that encourage and improve coordination among federal agencies relevant to IoT;
- examine how federal agencies can benefit from IoT, the IoT technologies currently used by agencies, and how prepared agencies are to adopt new IoT technologies; and
- consider any additional security measures federal agencies may need to take to safely and securely use IoT and enhance the resiliency of federal systems against cyber threats to IoT.
The working group would include governmental entities, which would be directed to consult with non-governmental stakeholders, including industry representatives from non-technology companies in the transportation, energy, agriculture, and health care sectors. The DIGIT Act would also create a steering group of private entities to advise the working group. The working group would be required to submit a report to Congress within 18 months of the Act’s enactment.
The DIGIT Act would also require the Federal Communications Commission (“FCC”) to study and provide a report to Congress on the spectrum needs to support an IoT ecosystem.
Two other new federal bills would also support new uses of AI technologies. The AI in Government Act of 2019 (H.R. 2575), sponsored by Rep. Jerry McNerney (D-CA-9), would create an AI Center of Excellence to advise and promote efforts to develop innovative uses of AI by the federal Government. In the Senate, the Artificial Intelligence Initiative Act (S. 1558), sponsored by Sen. Martin Heinrich (D-NM), would establish a coordinated federal initiative to accelerate research and development of AI.
On June 13, 2019, the Cyberspace Administration of China (“CAC”) issued the draft Measures on Security Assessment of the Cross-border Transfer of Personal Information (“Draft Measures”) for public comment. (The official Chinese version of the Draft Measures is available here, and an unofficial English translation is available here.) The comment period ends on July 13, 2019.
The issuance of the Draft Measures marks another major development in the implementation of China’s Cybersecurity Law (“CSL”) over the past month. The Draft Measures aim to create a cross-border data transfer mechanism that would govern all transfers of personal information conducted by network operators (defined as “owners and managers of networks, as well as network service providers”).
CAC released two earlier versions of its draft Measures on Security Assessment of Cross-border Transfer of Personal Information and Important Data in 2017, which imposed security assessment obligations on network operators when they transfer personal information or important data outside of China (see Covington’s previous alert here). The latest and long-anticipated Draft Measures focus only on the cross-border transfer of personal information (the cross-border transfer of important data will be subject to a separate approval mechanism introduced by the draft Measures for Data Security Management released by CAC on May 28, 2019) and also set out new requirements that bear resemblance to the Standard Contractual Clauses under the EU’s General Data Protection Regulation (“GDPR”).
We discuss the key requirements of the Draft Measures in greater detail below.
On May 29, 2019, the Governor of Nevada signed into law Senate Bill 220 (“SB 220”), an act relating to Internet privacy and amending Nevada’s existing law requiring websites and online services to post a privacy notice. In short, Nevada’s law will require operators of Internet websites and online services to follow a consumer’s direction not to sell his or her personal data. The Nevada law differs from the California Consumer Privacy Act (“CCPA”) enacted last year in notable ways, and could signal the coming of a patchwork of fifty-plus different data privacy standards across the country, much like the state data breach notification laws.
Unlike the CCPA (which applies to both online and offline business operations), SB 220 applies only to operators of Internet websites and online services, and defines “operators” as persons who (1) own or operate an Internet website or online service for commercial purposes; (2) collect and maintain covered information from consumers who reside in Nevada and use or visit the Internet website or online service; and (3) engage in any activity that constitutes a sufficient nexus with Nevada to satisfy the requirements of the United States Constitution. Such activity includes purposefully directing activities toward Nevada, consummating a transaction with Nevada or a Nevada resident, or purposefully taking advantage of the privilege of conducting activity in Nevada. SB 220 does not apply to the following entities: an entity that is regulated by the Gramm-Leach-Bliley Act or the Health Insurance Portability and Accountability Act; a service provider to an operator; or a manufacturer of a motor vehicle or a person who services a motor vehicle who processes covered information that is either (1) retrieved from a motor vehicle in connection with a technology or service related to the motor vehicle, or (2) provided by a consumer in connection with a subscription or registration for a technology or service related to the motor vehicle.
On June 3, 2019, the UK Information Commissioner’s Office (“ICO”), released an Interim Report on a collaboration project with The Alan Turing Institute (“Institute”) called “Project ExplAIn.” The purpose of this project, according to the ICO, is to develop “practical guidance” for organizations on complying with UK data protection law when using artificial intelligence (“AI”) decision-making systems; in particular, to explain the impact AI decisions may have on individuals. This Interim Report may be of particular relevance to organizations considering how to meet transparency obligations when deploying AI systems that make automated decisions that fall within the scope of Article 22 of the GDPR.
On May 31, 2019, the Cyberspace Administration of China (“CAC”) released the draft Regulation on the Protection of Children’s Personal Information Online (“Draft Regulation”) for public comment. (An official Chinese version is available here and an unofficial English translation of the Draft Regulation is available here.) The comment period ends on June 30, 2019.
As mentioned in our last blog post (available here), CAC issued the draft Measures for Data Security Management (“Draft Measures”) just last week, which set out the general regulatory framework that will govern the collection and use of personal information by network operators (broadly defined as “owners and managers of networks, as well as network service providers”). The release of this new Draft Regulation demonstrates CAC’s intention to set out more stringent requirements for network operators that collect, store, use, transfer, or disclose the personal information of minors under 14 years old. We discuss the key requirements of the Draft Regulation in greater detail below.
On May 28, 2019, the Cyberspace Administration of China (“CAC”) released the draft Measures for Data Security Management (“Draft Measures”) for public comment. (An official Chinese version of the Draft Measures is available here and an unofficial English translation is available here.) The comment period ends on June 28, 2019.
The release of these Draft Measures demonstrates China’s continuing efforts to implement the data protection requirements imposed by China’s Cybersecurity Law (“CSL”). For example, under Article 41 of the CSL, network operators must notify individuals of the purposes, methods, and scope of the information collection and use, and obtain their consent before collecting or using individuals’ personal information. Furthermore, under Articles 42 and 43 of the CSL, network operators must not disclose, tamper with, or damage citizens’ personal information that they have collected, and they are obligated to delete unlawfully collected information and correct incorrect information.
To implement the CSL, the CAC and the Standardization Administration of China issued a national standard for personal information protection (“Standard”) on January 2, 2018, which took effect on May 1, 2018 (see our previous blog post about that Standard here). A draft amendment to the Standard (“Draft Amendment”) was released for public comment on February 1, 2019 (see our previous blog post about the Draft Amendment here). The new Draft Measures incorporate some of the personal information protection requirements specified in the Standard and the Draft Amendment, and also introduce a number of new requirements for the protection of “important data,” a term mentioned in Articles 21 and 37 of the CSL but not defined there.