This month, the U.S. Department of Health and Human Services (“HHS”) issued guidance waiving enforcement of certain provisions of the Health Insurance Portability and Accountability Act (“HIPAA”) in response to the COVID-19 nationwide public health emergency.
On March 12, 2020, Washington’s state legislature passed SB 6280, a bill that will regulate state and local government agencies’ use of facial recognition services (“FRSs”). The bill aims to create a legal framework by which agencies may use FRSs to the benefit of society (for example, by assisting agencies in locating missing or deceased persons), but prohibits uses that “threaten our democratic freedoms and put our civil liberties at risk.”
In order to combat the proliferation of COVID-19, several EU Member States have strongly recommended or required that employees engage in teleworking, rather than attend work as normal. In this context, the European Union Agency for Cybersecurity (“ENISA”), on March 15, 2020, issued its “top tips for cybersecurity when working remotely”. Some data protection Supervisory Authorities also have issued guidance on this topic.
Below we provide a list of available guidance issued by ENISA and the EU data protection Supervisory Authorities and a summary of the security measures they recommend.
Guidance on cybersecurity when working remotely
- Austrian Supervisory Authority
- Croatian Supervisory Authority
- Czech Republic Supervisory Authority
- Danish Supervisory Authority
- Danish Center for Cybersecurity
- Dutch Supervisory Authority
- French Supervisory Authority
- German Federal Office for IT
- German Supervisory Authority of Bavaria
- German Supervisory Authority of Brandenburg
- German Independent Supervisory Authority of Schleswig-Holstein
- Icelandic Supervisory Authority
- Irish Supervisory Authority
- Liechtenstein Supervisory Authority
- Norwegian Supervisory Authority
- Polish Supervisory Authority
- Spanish Supervisory Authority
Recommended security measures
- Comply with your employer’s IT security policies.
- Protect devices and documents containing personal data from unauthorized access (e.g., store documents in locked drawers and lock doors to work spaces, if possible).
- Lock the screen of devices before leaving them unattended.
- Take care not to lose hardware (e.g., USB sticks) and documents.
- Where possible, use the file system offered by your employer and do not save documents locally. When storing data locally, make sure that the device or files are encrypted.
- Beware of using free cloud storage or email services, which may use data for marketing purposes, sell data, or lack appropriate protections.
- Use business email accounts wherever possible. If that is not possible, make sure that the content and attachments are properly encrypted.
- Before sending an email, verify that the recipient is the intended one.
- Avoid communicating on social media with colleagues about sensitive business issues.
- Be careful when using (video) chat services for conversations in which you discuss sensitive data. Preferably, use any available secure means of communication (e.g., phone).
- Take confidential calls in closed rooms without unauthorized individuals being present.
- Use effective access controls (such as multi-factor authentication and strong passwords) and, where available, encryption.
- Do not use insecure Wi-Fi connections.
- If possible, use an encrypted VPN to connect to company servers.
- Use updated software (including anti-virus software).
- Back-up data regularly to prevent data loss.
- Report any security incident as soon as possible to the employer.
- Dispose of data securely (e.g., shred it in small pieces).
- Beware of phishing emails.
- Do not install software on your computer from unknown sources.
On March 21, 2020, the data security requirements of the New York SHIELD Act became effective. The Act, which amends New York’s General Business Law, represents an expansion of New York’s existing cybersecurity and data breach notification laws. Its two main impacts on businesses are:
- expanding data breach notification requirements under New York law; and
- requiring businesses to maintain “reasonable safeguards” to protect the “private information” of New York residents.
The Act’s expanded data breach notification requirements went into effect on October 23, 2019, as discussed in our prior blog post. For more information on the “reasonable safeguards” requirement that is now in effect, see our client alert on this topic.
Yesterday, the Federal Communications Commission (“FCC”), on its own motion, released a Declaratory Ruling confirming that the COVID-19 pandemic constitutes an “emergency” under the Telephone Consumer Protection Act (“TCPA”). As a consequence, hospitals, health care providers, state and local health officials, and other government officials may lawfully communicate through automated or prerecorded calls (which include text messages) information about the coronavirus and mitigation measures to mobile telephone numbers and certain other numbers (such as those of first responders) without “prior express consent.”
By way of background, absent “prior express consent,” the TCPA prohibits the transmission of an automated or prerecorded call to any mobile telephone number. However, this prohibition is subject to an “emergency purposes” exception. The TCPA does not define what constitutes “emergency purposes,” but the FCC’s rules construe the term to mean “calls made necessary in any situation affecting the health and safety of consumers.”
In response to the drastic increase of U.S. employees working remotely, the U.S. Federal Trade Commission (“FTC”) and the U.S. National Institute of Standards and Technology (“NIST”) have both issued guidance for employers and employees on best practices for teleworking securely. In addition, the Cybersecurity and Infrastructure Security Agency (“CISA”) has provided advice on identifying essential workers, including IT and cybersecurity personnel, in critical infrastructure sectors that should maintain normal work schedules if possible. Each set of guidance is discussed in further detail below.
As scientists work around the clock to gain insights into the coronavirus and how to fight it, public and private-sector stakeholders are in discussions to promote the rapid exchange of scientific data. During these discussions, the GDPR acronym inevitably rears its head and casts doubt over what is lawful. The GDPR and national data protection laws can, and often do, complicate the matter of sharing personal data, and health data in particular. We provide some general pointers below to help demystify the GDPR and explain its impact.
- It may be self-evident, but it is still worth noting, that the GDPR does not apply to data about the virus itself, including the genetic sequencing of the virus. The GDPR imposes no restrictions on the sharing of this data.
- The GDPR does, however, apply to the personal data of any living individual, including those unfortunate enough to host the virus. The scope of the GDPR is broad. It typically applies to data that has been pseudonymized or coded (e.g., line data where the responsible physician or institute replaced the name of the patient with a code).
- The GDPR does not apply to anonymous data, such as aggregated data sets. For example, data sets consisting of “virus genetic sequence and other data related to the virus + age group of the patient (e.g., 60-64) + gender” should generally be outside the scope of the GDPR, as it is not reasonably likely that they can be associated with the underlying individuals.
However, if the above information were to originate from a named hospital with only one infected patient in this age group, the data could then be personal data, as re-attribution to the person would probably not require much effort. Despite EU data protection laws having been in place for over two decades now, the boundary between personal data and anonymous data is often frustratingly unclear.
- Even if the data sets contain personal data relating to patients, this does not mean that the data cannot be used or shared, and the GDPR contains numerous provisions allowing for this, especially where it involves scientific research. For example:
- the GDPR allows lawfully collected data (e.g., health care data) to be re-used for scientific research, without consent, provided appropriate safeguards are in place, such as key-coding (Art. 5(1)(b) & 89(1) GDPR);
- if the data are not obtained directly from the individual, the GDPR also relaxes the normal transparency requirements. Where patients cannot be informed individually (e.g., because it is impossible or represents a disproportionate burden), it suffices to make information about the research publicly available, for example on a website (Art. 14(5)(b) GDPR). Hospitals that obtain data directly from patients should inform them about the possible use of their data for scientific research, for example, by means of leaflets or notices on patient intake forms (Art. 13(3) GDPR);
- an individual’s right to object to scientific research involving his/her data is restricted; the person in question would have to demonstrate cause for opposing it, and, in any case, the right does not apply where there are strong public interests that will be served by the research (Art. 21(6) GDPR); and
- an individual’s right to request erasure of their data is similarly restricted (Art. 17(1)(c) and 17(3)(d) GDPR).
- Despite these derogations designed to promote research endeavors, the fact remains that the GDPR, in combination with national laws, is a very complex topic to navigate. Among other things, the GDPR allows Member States to maintain stricter rules in the area of health data. This means that the derogations mentioned above may not always apply or may not apply in the same way across the EU. Oftentimes, it will not be the GDPR that restricts the sharing of health data, but rather the stricter and/or ill-adapted national rules that deviate from the GDPR.
While the legal landscape is undoubtedly complex, data privacy regulators are aware of the critical need to exchange data to advance important research aims. As noted in our recent blog post, they do not believe that data protection laws have been an impediment to “national approaches to sharing public health messages; of using the latest technology to facilitate safe and speedy consultations and diagnoses; and of creating linkages between public data systems to facilitate identification of the spread of the virus”. Meanwhile, and in line with this thinking, the European Medicines Agency has called on researchers to pool research and collaborate to combat COVID-19.
Earlier this month, the Governor of Vermont signed into law S.B. 110, which will amend the state’s data breach notification law and create a new student privacy law focused on operators of educational technology services. Notably, the amendments to the state’s data breach notification law will expand the categories of personally identifiable information (“PII”) that may trigger notification obligations to individuals and regulators in the event of a breach to include online account credentials, health and medical information, and biometric and genetic data, among others. The student privacy law will place certain restrictions on how student data can be collected, used, and disclosed by operators of online educational technology services. The new requirements, which will enter into force on July 1, 2020, are discussed in more detail below.
Cardi B might like it, but the Federal Trade Commission (“FTC”) did not. On March 5, 2020, the agency sent Cardi B and other high-profile influencers warning letters alleging that the influencers made inadequate disclosures in their endorsements of Teami tea. The letters followed on the heels of the FTC’s proposed order against Teami, LLC for allegedly making deceptive claims about weight loss and other health benefits in their advertisements and failing to adequately instruct influencers about how to comply with the law when endorsing Teami products.
On March 17, 2020, the Executive Committee of the Global Privacy Assembly (“GPA”) issued a statement on data protection in the context of the COVID-19 pandemic. The GPA is an entity representing data protection and privacy regulators around the globe, formerly known as the International Conference of Data Protection and Privacy Commissioners (“ICDPPC”).
The GPA recognizes the unprecedented challenges being faced to address the spread of COVID-19, and acknowledges that data protection requirements do not stand in the way of tackling such challenges.
According to the GPA, the data protection principles enshrined in virtually all countries with data privacy laws “enable the use of data in the public interest and still provide the protections the public expects” in the context of the COVID-19 outbreak.
Moreover, the statement notes how “health data is considered sensitive across many jurisdictions,” but so far this has not constituted an impediment to “national approaches to sharing public health messages; of using the latest technology to facilitate safe and speedy consultations and diagnoses; and of creating linkages between public data systems to facilitate identification of the spread of the virus”.
The GPA also made clear that it supports efforts by public authorities and healthcare professionals to communicate directly with people, and by scientific and government bodies to better coordinate policy responses nationally and globally and to fight the spread of the virus in an effective manner.
The GPA’s statement shows how data privacy regulators are trying to coordinate their guidance at a global level and address the privacy concerns linked with the adoption of measures against the spread of the virus. Covington continues to track EU and global regulatory statements and guidance concerning COVID-19, and readers are invited to view Covington’s COVID-19 resources, available here.