Covington Hosts First Webinar on Connected and Automated Vehicles

On February 27, 2019, Covington hosted its first webinar in a series on connected and automated vehicles (“CAVs”).  During the webinar, which is available here, Covington’s regulatory and public policy experts covered the current state of play in U.S. law and regulations relating to CAVs.  In particular, Covington’s experts focused on relevant developments in: (1) federal public policy; (2) federal regulatory agencies; (3) state public policy; (4) autonomous aviation; and (5) national security.

Highlights from each of these areas are presented below.

Republicans, Democrats Offer Different Views on Preemption During Senate Privacy Hearing

At a February 27, 2019 hearing on “Privacy Principles for a Federal Data Privacy Framework in the United States,” Republican and Democratic members of the Senate Commerce, Science, & Transportation Committee offered different perspectives on whether new federal privacy legislation should preempt state privacy laws.

House Subcommittee Holds Initial Hearing On Potential New Privacy Bill

On February 26, 2019, a key House subcommittee held a hearing to explore the possible contours of new federal privacy legislation.  At the hearing, Rep. Jan Schakowsky (D-IL)—who chairs the Energy & Commerce Committee’s Subcommittee on Consumer Protection and Commerce—said the hearing on “Protecting Consumer Privacy in the Era of Big Data” was only the first of “several hearings” that she would organize on consumer privacy.

GAO Report Calls for Federal Privacy Law

This month, the Government Accountability Office (“GAO”) released a report recommending that Congress consider enacting a federal internet privacy law.  The 56-page independent report was requested by the House Energy and Commerce Committee, which has scheduled a hearing on data privacy for February 26, during which it plans to discuss the GAO’s findings.  The Senate Commerce Committee is scheduled to hold a similar hearing on February 27.

According to the GAO, “Congress should consider developing comprehensive legislation on Internet privacy that would enhance consumer protections and provide flexibility to address a rapidly evolving Internet environment.”  The GAO stressed the importance of striking an appropriate balance between the benefits of data collection and addressing consumer concerns.

All-Time Record Year for HIPAA Enforcement

The U.S. Department of Health and Human Services Office for Civil Rights (“OCR”) announced that 2018 was an all-time record year for Health Insurance Portability and Accountability Act (“HIPAA”) enforcement activity.  Enforcement actions in 2018 resulted in the assessment of $28.7 million in civil money penalties.  Enforcement activity focused primarily on breaches of electronic protected health information (“ePHI”).

Under 45 C.F.R. § 164.308, a covered entity must conduct “accurate and thorough assessment[s] of the potential risks and vulnerabilities . . . of [ePHI].”

The final settlement of the year occurred in December 2018, when Cottage Health agreed to pay $3 million to OCR and to adopt a corrective action plan to remedy violations of the HIPAA Rules.  The alleged violations pertained to December 2013 and December 2015 compromises of unsecured ePHI implicating the data of over 62,500 individuals.  The breached ePHI included patient names, addresses, dates of birth, Social Security numbers, diagnoses, conditions, lab results, and other treatment information.  OCR concluded that Cottage Health failed to conduct risk assessments and failed to implement security measures to reduce vulnerabilities.

In September 2018, OCR settled with Advanced Care Hospitalists (“ACH”), a contractor physician group, for $500,000 after ACH reported that ACH patient information was viewable on a medical billing service’s website.  The OCR investigation revealed that ACH lacked the required business associate agreement with the billing service provider, had not conducted a risk assessment, and had not implemented security measures or HIPAA policies or procedures before 2014.

And in October 2018, Anthem, Inc. paid $16 million (the largest HIPAA penalty ever assessed by OCR) after the largest health data breach in history.  Anthem discovered that malicious actors had infiltrated its system through spear phishing emails and then carried out undetected, continuous, and targeted attacks on its network to extract data.

Another enforcement theme in 2018 was the physical theft or loss of PHI and devices containing ePHI.  In January 2018, OCR settled with Filefax, Inc., a medical records maintenance, storage, and delivery services provider, after finding that Filefax left PHI in an unlocked truck in the Filefax parking lot and permitted unauthorized individuals to remove PHI.  Additionally, in June 2018, an Administrative Law Judge ruled in favor of OCR and required the University of Texas MD Anderson Cancer Center to pay $4.3 million in civil penalties for HIPAA violations following the theft of an unencrypted laptop from an employee’s residence and the loss of two USB thumb drives.

OCR’s record-breaking enforcement activities in 2018 serve as a reminder to covered entities and business associates to conduct frequent and meaningful assessment of the security of any PHI they hold, to swiftly remediate any vulnerabilities discovered, and to carefully document the assessment, remediation, and general HIPAA policies and procedures.

This blog post is part of our ongoing coverage of HIPAA issues.

Covington to Host Webinar on Connected and Automated Vehicles

One week from today, Covington will host its first webinar in a series on connected and automated vehicles (“CAVs”). The webinar will take place on February 27 from 12 to 1 p.m. Eastern Time. During the webinar, Covington’s regulatory and legislative experts will cover developments in U.S. law and regulations relating to CAVs. Those topics include:

  • Federal regulation affecting CAVs, with a focus on the National Highway Traffic Safety Administration (“NHTSA”), the Federal Aviation Administration (“FAA”), the Federal Communications Commission (“FCC”), and review by the Committee on Foreign Investment in the United States (“CFIUS”).
  • Where Congress stands on CAV legislation, including the AV START Act, the SELF DRIVE Act, and infrastructure legislation.
  • State-level legislative, regulatory, and policy developments, including a closer look at California’s regulations.
  • Updates and trends specific to the autonomous aviation industry.
  • Foreign investment and export controls impacting CAVs.

Our speakers are:

  • Holly Fechner (Legislative/Public Policy, Former Senate Policy Director)
  • Brian Smith (Public Policy/Aviation, Former White House Counsel’s Office, Former Special Assistant, Department of Labor)
  • Sarah Wilson (Product Liability/Consumer Safety, Former Federal Claims Judge, Former White House Senior Counsel)
  • Jake Levine (State Regulation/Public Policy, Former Senior Counsel to CA State Senator Fran Pavley, Former White House Policy Advisor)
  • Jonathan Wakely (CFIUS/International Trade, Former CIA Political Analyst)

You can register for the webinar here.

Please check back here for details on our next webinar in this series: Leveraging AV Data in a Connected World.

The Court of Justice of the European Union reiterates broad application of the EU Data Protection Law’s journalism exception to online platforms

On January 14, 2019, the Court of Justice of the European Union (“CJEU”) decided that video recordings of police officers in the exercise of their duties, and the uploading of such videos to YouTube, may constitute “journalistic activities” within the meaning of the journalism exception of the EU Data Protection Directive (“Directive”) (decision available here).

The claimant in the present case recorded police officers on active duty in their police station and uploaded the video to YouTube.  The Latvian Supervisory Authority ordered the claimant to remove the video, claiming that both the recording and the online disclosure violated Latvian data protection law.  Eventually, the Latvian Supreme Court referred the matter to the CJEU.

The CJEU decided that the storage of such video on a recording device and the subsequent online disclosure of the video on a social platform constitute data processing activities that fall within the scope of the Directive.

The CJEU also decided that the online disclosure of the video may fall under the definition of “journalistic activities” established by the CJEU in its Satamedia decision of 2008, which broadly defined the concept of “journalistic activities” to include “disclosure to the public of information, opinions or ideas, irrespective of the medium which is used to transmit them.”  According to the CJEU, the fact that the claimant in this case is not a professional journalist does not necessarily exclude him from benefiting from the journalism exception.  Furthermore, it was unimportant to the court that the recording depicted wrongdoing by police officers.

The court did limit the journalism exception by indicating that it only applies if (i) the video is solely published for journalistic purposes (and not other purposes such as voyeuristic purposes) and (ii) the limitation to the right to privacy of the police officers was necessary to uphold the claimant’s right to freedom of expression.

The GDPR contains a similar exception for the use of personal data for journalistic purposes, so the outcome of this decision would likely be quite similar under the GDPR.

EDPB releases information notes in the event of a “No-deal Brexit”

On February 12, 2019, the European Data Protection Board (“EDPB”) published two information notes to highlight the impact of a so-called “No-deal Brexit” on data transfers under the EU General Data Protection Regulation (“GDPR”), as well as the impact on organizations that have selected the UK Information Commissioner (“ICO”) as their “lead supervisory authority” for their “Binding Corporate Rules” (“BCRs”).

In the “No-deal” scenario, the United Kingdom would leave the European Union on March 29, 2019 without having agreed the terms of its departure, a contingency that appears increasingly likely as attempts by the UK Government to secure consensus in the UK Parliament on the Withdrawal Agreement continue to falter.

Information note on data transfers under the GDPR in the event of a “No-deal Brexit”

In its first note, the EDPB reminds organizations that in the event of a “No-deal Brexit”, transfers of personal data from the EU to the UK will need to rely on one of the traditional data transfer mechanisms under the GDPR, at least until such time as the UK receives a formal adequacy determination from the EU.  The EDPB use the note to walk through the various options, including standard (or ad hoc) data protection clauses, “Binding Corporate Rules” and derogations, noting that derogations should be a last resort.  Public authorities, unlike private enterprises, can avail themselves of additional options, including administrative, bilateral or multilateral agreements, where those are legally binding and enforceable, as well as “administrative arrangements” meeting certain requirements.  The EDPB is not the only EU regulatory body concerning itself with the prospect of a No-deal Brexit: the ICO itself has released extensive guidance to help organizations plan ahead for such a contingency, discussing data transfer considerations among other topics.

Information note on BCRs for companies which have the ICO as BCR “lead supervisory authority”

In its second note, the EDPB recommends that organizations take certain measures in the event that the ICO can no longer serve as a “lead,” one consequence of a “No-deal Brexit.”  As background, BCRs are a mechanism by which organizations may lawfully transfer personal data from the EU to affiliates outside the EU, provided those affiliates agree to comply with a set of privacy principles and rules now codified under Article 47 of the GDPR.

Organizations seeking to adopt BCRs must submit to a process that begins with designating a “lead supervisory authority,” identified on the basis of particular criteria, and then proceeds to negotiations with that authority (potentially aided by additional authorities) over the content of the BCRs.  Once the BCR terms are agreed, a “consistency mechanism” is triggered, whereby the EDPB will issue an opinion on the BCRs within a stipulated period of time.  If the opinion is favorable, the BCRs will be approved for use in the EU.

Meanwhile, organizations that secured approval for their BCRs under the EU’s pre-GDPR regime have been updating their BCRs over the past year to bring them into compliance with recent regulatory guidance papers, notably WP 256 (Working Document setting up a table with the elements and principles to be found in Binding Corporate Rules).  The Article 29 Working Party, now superseded by the EDPB, issued its guidance to align these legacy BCRs with the changes brought about by the GDPR.  Further complicating matters, a number of organizations whose BCRs are supervised by the ICO as lead authority have been motivated by the prospect of a “No-deal Brexit” to transition their BCRs to another EU lead authority, presenting their BCRs to regulators in other EU Member States, with Ireland a clear favorite.

What are the EDPB’s recommendations?

In its information note, the EDPB observe that in the event of a “No-deal Brexit,” the ICO can no longer serve as a “lead,” or even backstop reviewer, for EU BCRs.  The EDPB distinguish between two scenarios:  where an ICO-led BCR application is pending and where an ICO-led application has been approved.

In the first scenario, organizations that have submitted their BCRs to the ICO for review, but have not yet completed the review process, will need to identify a new “lead supervisory authority,” applying criteria set forth in Working Document 263, adopted by the Article 29 Working Party in April 2018.  This includes assessing:

  • where the organization maintains its European headquarters or which EU affiliate has delegated its data protection responsibilities;
  • which EU affiliates could oversee and enforce the BCRs or issue decisions in relation to EU data processing; or
  • which affiliates are involved in data transfers from the EU.

This “new” lead authority will then assume responsibility for the organization’s BCRs, “initiate a new procedure” with the organization, and ultimately submit the BCRs to the EDPB under the GDPR’s “consistency mechanism.”  If, however, the BCR application is already before the EDPB when a “No-deal Brexit” occurs, the organization still will need to designate a new “lead authority” to replace the ICO.  This authority then will “resubmit” the application to the EDPB, seemingly resetting the EDPB’s 8-week deadline for evaluating the application.

Finally, where organizations already have BCRs that were approved by the ICO under the pre-GDPR regime, the EDPB cryptically states that they will need to identify a new lead supervisory authority in order to maintain the effectiveness of their BCRs as a data transfer mechanism.  This undoubtedly will spur many organizations to proceed apace in transitioning their BCRs to a new “lead” authority ahead of March 29, 2019.

Defense Department Releases Artificial Intelligence Strategy

(This article was originally published in Global Policy Watch.)

On February 12, 2019, the Department of Defense (“DoD”) released a summary and supplementary fact sheet of its artificial intelligence strategy (“AI Strategy”).  The AI Strategy has been a couple of years in the making, as the Trump administration has scrutinized the relative investments and advancements in artificial intelligence by the United States, its allies and partners, and potential strategic competitors such as China and Russia.  The animating concern was articulated in the Trump administration’s National Defense Strategy (“NDS”): strategic competitors such as China and Russia have made investments in technological modernization, including artificial intelligence, and in conventional military capability that are eroding the U.S. military advantage and changing how we think about conventional deterrence.  As the NDS states, given “[t]he reemergence of long-term strategic competition” and the “rapid dispersion of technologies” such as “advanced computing, ‘big data’ analytics, [and] artificial intelligence,” mastering these technologies will be necessary to “ensure we will be able to fight and win the wars of the future.”

The AI Strategy offers that “[t]he United States, together with its allies and partners, must adopt AI to maintain its strategic position, prevail on future battlefields, and safeguard [a free and open international] order. We will also seek to develop and use AI technologies in ways that advance security, peace, and stability in the long run. We will lead in the responsible use and development of AI by articulating our vision and guiding principles for using AI in a lawful and ethical manner.”

DoD will implement the AI Strategy through five main lines of effort:

  • Delivering AI-enabled capabilities that address key missions
  • Scaling AI’s impact across DoD through a common foundation that enables decentralized development and experimentation
  • Cultivating a leading AI workforce
  • Engaging with commercial, academic, and international allies and partners
  • Leading in military ethics and AI safety

The AI Strategy emphasizes that “[f]ailure to adopt AI will result in legacy systems irrelevant to the defense of our people, eroding cohesion among allies and partners, reduced access to markets that will contribute to a decline in our prosperity and standard of living, and growing challenges to societies that have been built upon individual freedoms.”

The Joint Artificial Intelligence Center (“JAIC”), which was established in June 2018, is led by Lt. Gen. Jack Shanahan and reports to the DoD Chief Information Officer Dana Deasy.  It is designated as the principal implementer and integrator of the AI Strategy. Specifically, the JAIC will coordinate activities that align with DoD’s strategic approach, such as: (1) rapidly delivering AI-enabled capabilities; (2) establishing a common foundation for scaling AI’s impact across DoD; (3) facilitating AI planning, policy, governance, ethics, safety, cybersecurity, and multilateral coordination; and (4) attracting and cultivating world-class personnel.

The AI Strategy makes clear that DoD recognizes that “[t]he present moment is pivotal: we must act to protect our security and advance our competitiveness, seizing the initiative to lead the world in the development and adoption of transformative defense AI solutions that are safe, ethical, and secure. JAIC will spearhead this effort, engaging with the best minds in government, the private sector, academia, and international community. The speed and scale of the change required are daunting, but we must embrace change if we are to reap the benefits of continued security and prosperity for the future.”  Accordingly, Lt. Gen. Shanahan and Dana Deasy, speaking to a group of reporters, highlighted that DoD has recently invested $90 million in AI-related research and technology development, and that DoD will request additional resources for the JAIC in its fiscal year 2020 budget request in order to support its execution of the AI Strategy.

The DoD strategy comes on the heels of President Trump’s Executive Order (“EO”), “Maintaining American Leadership in Artificial Intelligence,” which launches a coordinated federal government strategy for artificial intelligence.  The EO directs federal departments and agencies to invest the resources necessary to drive technological breakthroughs in AI (and outpace China’s developments in this area), lead the development of global technical standards, address workforce issues as industries adopt AI, foster trust in AI technologies, and promote U.S. research and innovation with allies and partners.