This quarterly update summarizes key federal legislative and regulatory developments in the second quarter of 2022 related to artificial intelligence (“AI”), the Internet of Things (“IoT”), connected and automated vehicles (“CAVs”), and data privacy, and highlights a few particularly notable developments in U.S. state legislatures.  This quarter, Congress and the Administration focused on addressing algorithmic bias and other AI-related risks, and a bipartisan group of lawmakers introduced a comprehensive federal privacy bill.

Artificial Intelligence

Federal lawmakers introduced legislation in the second quarter of 2022 aimed at addressing risks in the development and use of AI systems, in particular risks related to algorithmic bias and discrimination.  Senator Michael Bennet (D-CO) introduced the Digital Platform Commission Act of 2022 (S. 4201), which would empower a new federal agency, the Federal Digital Platform Commission, to develop regulations for online platforms that facilitate interactions between consumers, as well as between consumers and entities offering goods and services.  Regulations contemplated by the bill include requirements that algorithms used by online platforms “are fair, transparent, and without harmful, abusive, anticompetitive, or deceptive bias.”  Although the bill does not appear to have the support needed to pass in this Congress, it is emblematic of concerns in Congress that could lead to future legislation.

Additionally, the bipartisan American Data Privacy and Protection Act (H.R. 8152), introduced by a group of lawmakers led by Representative Frank Pallone (D-NJ-6), would require “large data holders” (defined as covered entities and service providers with over $250 million in gross annual revenue that collect, process, or transfer the covered data of over five million individuals or the sensitive covered data of over 200,000 individuals) to conduct “algorithm impact assessments” on algorithms that “may cause potential harm to an individual.”  These assessments would be required to provide, among other information, details about the design of the algorithm and the steps the entity is taking to mitigate harms to individuals.  Separately, developers of algorithms would be required to conduct “algorithm design evaluations” that evaluate the design, structure, and inputs of the algorithm.  The American Data Privacy and Protection Act is discussed in further detail in the Data Privacy section below.

Federal regulatory agencies also released reports focused on algorithmic bias and other risks arising from AI systems.  The National Institute of Standards and Technology (“NIST”) released for public comment an initial draft of its AI Risk Management Framework, which provides guidance for managing risks in the design, development, use, and evaluation of AI systems.  NIST separately released a document titled “Towards a Standard for Identifying and Managing Bias in Artificial Intelligence,” which aims to provide guidance for mitigating harmful bias in AI systems.  In addition to these two NIST publications, the Department of Justice and the Equal Employment Opportunity Commission each released guidance explaining how the misuse of algorithms and AI in the hiring process can lead to disability discrimination that violates the Americans with Disabilities Act.  Together, these documents provide businesses with guidance on the development and deployment of AI.

Internet of Things

Federal lawmakers also advanced legislation this quarter to bolster Internet of Things competencies at the Small Business Administration (“SBA”).  On March 23, 2022, Senators Jeanne Shaheen (D-NH) and John Kennedy (R-LA) introduced the Small Business Broadband and Emerging Information Technology Enhancement Act of 2022 (S. 3906).  This bipartisan bill would direct the SBA to designate a broadband and emerging information technology coordinator, where “emerging information technology” includes Internet of Things technology.  The bill would also provide training on emerging information technology for SBA staff and direct the coordinator to submit a report to Congress regarding emerging information technology activities.  On May 25, 2022, S. 3906 was favorably reported out of the Senate Committee on Small Business and Entrepreneurship and now awaits consideration by the full Senate.

Additionally, this quarter, federal regulators continued to engage with IoT-related policy, particularly at the Food and Drug Administration (“FDA”) and NIST.  The FDA sought comment on its April draft guidance document entitled “Cybersecurity in Medical Devices: Quality System Considerations and Content of Premarket Submissions.”  The draft guidance provides recommendations to industry regarding “cybersecurity device design, labeling, and the documentation that FDA recommends be included in premarket submissions for devices with cybersecurity risk,” including Internet- and network-connected devices.  This development signals continued interest by federal regulators in the development and deployment of IoT devices.

In May, NIST published a report on hardware-enabled security, which discusses a layered, coherent approach to platform security for cloud and edge computing use cases in light of the rise of industrialized hacking, and provides “hardware-enabled security techniques and technologies that can improve platform security and data protection for cloud data centers and edge computing.”  NIST also released a summary report on its review of the pilot programs established pursuant to E.O. 14028 for the labeling of consumer IoT devices and consumer software development practices.  According to the report, the pilots revealed a clear interest in consumer cybersecurity labeling programs for IoT products and software, though challenges remain, particularly around the need for consistency in labels, consumer education and awareness, flexibility, and coordination across stakeholders, as well as concerns about the liability of key stakeholders.  Finally, on June 17, NIST issued an initial public draft of a Profile of the IoT Core Baseline for Consumer IoT Products (NIST IR 8425).  The draft adopts and applies the cybersecurity labeling criteria that emerged from NIST’s February 2022 white paper and incorporates those criteria into the family of NIST’s IoT cybersecurity guidance.  These NIST reports provide businesses with important guidance on how to safely and securely develop and deploy IoT devices.  Although NIST guidance is not legally binding, it signals best practices that lawmakers may later incorporate into legislation.

Connected and Automated Vehicles (“CAVs”)

In April, Secretary of Transportation Pete Buttigieg faced pressure to develop a comprehensive federal framework for CAVs.  At a May 3, 2022 hearing before the Senate Commerce Committee on the Department of Transportation’s (“DOT”) Fiscal Year 2023 Budget Priorities, Secretary Buttigieg appeared to respond to that pressure by testifying about the Department’s CAV policy plans going forward.  Secretary Buttigieg stated that DOT is working to ensure that its regulations keep pace with advancements in CAV technology, but acknowledged that the Department could be doing more to support CAV development and deployment.  Notably, DOT’s FY 2023 Budget Highlights included funding requests for a number of CAV-related programs, which could aid in the advancement and deployment of CAVs.  Buttigieg stressed, however, that national CAV legislation from Congress will be important to making meaningful strides in this area.

On June 15, 2022, DOT’s National Highway Traffic Safety Administration (“NHTSA”) released the results of nearly a year of crash reporting under its June 2021 standing general order, which requires crash reports to be submitted within 24 hours for crashes involving vehicles equipped with Advanced Driver Assistance Systems (“ADAS”) or Automated Driving Systems (“ADS”).  NHTSA divided the data into two reports: one for SAE Level 2 vehicles equipped with ADAS, and one for vehicles equipped with ADS operating at SAE Levels 3-4.  Between June 29, 2021 and May 15, 2022, NHTSA received reports of 392 crashes involving Level 2 vehicles equipped with ADAS and 130 crashes involving vehicles equipped with ADS.  Changes to the reporting system may be on the horizon, as NHTSA noted that the quality of the data was affected by how the vehicles record and transmit information (e.g., the same crash may generate multiple reports), and much of the data was incomplete.

Finally, in 2018, the Federal Transit Administration (“FTA”) published its five-year Strategic Transit Automation Research (“STAR”) Plan.  With the STAR Plan set to expire, FTA issued a request for information (“RFI”) on June 2, 2022.  The RFI seeks input from public and industry stakeholders on the next phase of research, collaboration and engagement, technology development, and demonstration of ADS or ADAS necessary to improve the safe, efficient, equitable, and climate-friendly provision of public transportation and to sustain the associated workforce.  Comments received through the RFI will inform FTA’s development of STAR Plan 2.0 and are requested by August 1, 2022.

Data Privacy

In a significant development this quarter, members of Congress introduced the bipartisan American Data Privacy and Protection Act (H.R. 8152), which would establish a comprehensive federal privacy framework for the United States.  The bill would apply to “covered entities,” defined as entities that “alone or jointly with others” determine the purposes and means of collecting, processing, and transferring covered data and that are (1) subject to the FTC Act, (2) common carriers under federal law, or (3) nonprofits.  Covered entities would be prohibited from collecting, processing, or transferring covered data unless doing so is limited “to what is reasonably necessary and proportionate” to (1) provide or maintain a specific product or service requested by the individual to whom the data pertains, (2) deliver a communication that is reasonably anticipated by the individual recipient in the context of that individual’s interactions with the covered entity, or (3) effect one of the enumerated “permissible purposes” in the bill’s text.

The bill would further prohibit the collection and processing of certain particularly sensitive data “except where such collection or processing is strictly necessary to provide or maintain a specific product or service requested by an individual to whom the covered data pertains” or to effectuate one of the permitted purposes.  It also would provide consumers rights of access, deletion, correction, and portability, and a right to object to the “transfer” (defined broadly to encompass activities beyond a sale) of covered data.  Finally, the American Data Privacy and Protection Act would impose requirements on relationships between covered entities and service providers and third parties, including requirements for contractual terms, and would require covered entities to implement certain accountability measures, such as the appointment of data privacy and security officers.

Notably, the issues of a private right of action and preemption have long stalled congressional efforts to pass a federal privacy bill.  The American Data Privacy and Protection Act attempts to resolve those issues with a broad preemption provision that includes several carve-outs for certain state statutes, including state consumer protection statutes, state data breach laws, and the Illinois Biometric Information Privacy Act.  The FTC, State Attorneys General, and private plaintiffs would all be able to enforce the bill’s requirements.  For more on this legislation, see here.

After years of negotiations over federal privacy legislation, this proposal is the most significant bipartisan effort behind a comprehensive privacy bill to date.  Despite the notable agreement between Republicans and Democrats, however, several key stakeholders remain opposed to the bill’s approach to preemption and a private right of action, and it would be difficult to pass the bill this Congress without their support.  Moreover, the time for legislative action in this Congress is dwindling.  Although it is increasingly unlikely that the American Data Privacy and Protection Act will pass in this Congress, the framework will likely serve as a template for continued efforts in the next Congress.  For more on this bill and the dynamics at play in Congress, see here.

There has also been continued legislative activity on privacy at the state level.  In May, Connecticut enacted comprehensive data privacy legislation.  The law generally follows the framework set forth by the state privacy laws in Virginia, Colorado, and Utah, with some differences in the details.

We will continue to update you on meaningful developments in these quarterly updates and across our blogs.

Jennifer Johnson

Jennifer Johnson is a partner specializing in communications, media and technology matters who serves as Co-Chair of Covington’s Technology Industry Group and its global and multi-disciplinary Artificial Intelligence (AI) and Internet of Things (IoT) Groups. She represents and advises technology companies, content distributors, television companies, trade associations, and other entities on a wide range of media and technology matters. Jennifer has almost three decades of experience advising clients in the communications, media and technology sectors, and has held leadership roles in these practices for almost twenty years. On technology issues, she collaborates with Covington’s global, multi-disciplinary team to assist companies navigating the complex statutory and regulatory constructs surrounding this evolving area, including product counseling and technology transactions related to connected and autonomous vehicles, internet connected devices, artificial intelligence, smart ecosystems, and other IoT products and services. Jennifer serves on the Board of Editors of The Journal of Robotics, Artificial Intelligence & Law.

Jennifer assists clients in developing and pursuing strategic business and policy objectives before the Federal Communications Commission (FCC) and Congress and through transactions and other business arrangements. She regularly advises clients on FCC regulatory matters and advocates frequently before the FCC. Jennifer has extensive experience negotiating content acquisition and distribution agreements for media and technology companies, including program distribution agreements, network affiliation and other program rights agreements, and agreements providing for the aggregation and distribution of content on over-the-top app-based platforms. She also assists investment clients in structuring, evaluating, and pursuing potential investments in media and technology companies.

Nicholas Xenakis

Nick Xenakis draws on his Capitol Hill experience to provide regulatory and legislative advice to clients in a range of industries, including technology. He has particular expertise in matters involving the Judiciary Committees, such as intellectual property, antitrust, national security, immigration, and criminal justice.

Nick joined the firm’s Public Policy practice after serving most recently as Chief Counsel for Senator Dianne Feinstein (D-CA) and Staff Director of the Senate Judiciary Committee’s Human Rights and the Law Subcommittee, where he was responsible for managing the subcommittee and Senator Feinstein’s Judiciary staff. He also advised the Senator on all nominations, legislation, and oversight matters before the committee.

Previously, Nick was the General Counsel for the Senate Judiciary Committee, where he managed committee staff and directed legislative and policy efforts on all issues in the Committee’s jurisdiction. He also participated in key judicial and Cabinet confirmations, including of an Attorney General and two Supreme Court Justices. Nick was also responsible for managing a broad range of committee equities in larger legislation, including appropriations, COVID-relief packages, and the National Defense Authorization Act.

Before his time on Capitol Hill, Nick served as an attorney with the Federal Public Defender’s Office for the Eastern District of Virginia. There he represented indigent clients charged with misdemeanor, felony, and capital offenses in federal court throughout all stages of litigation, including trial and appeal. He also coordinated district-wide habeas litigation following the Supreme Court’s decision in Johnson v. United States (invalidating the residual clause of the Armed Career Criminal Act).

Jayne Ponder

Jayne Ponder counsels national and multinational companies across industries on data privacy, cybersecurity, and emerging technologies, including Artificial Intelligence and Internet of Things.

In particular, Jayne advises clients on compliance with federal, state, and global privacy frameworks, and counsels clients on navigating the rapidly evolving legal landscape. Her practice includes partnering with clients on the design of new products and services, drafting and negotiating privacy terms with vendors and third parties, developing privacy notices and consent forms, and helping clients design governance programs for the development and deployment of Artificial Intelligence and Internet of Things technologies.

Jayne routinely represents clients in privacy and consumer protection enforcement actions brought by the Federal Trade Commission and state attorneys general, including related to data privacy and advertising topics. She also helps clients articulate their perspectives through the rulemaking processes led by state regulators and privacy agencies.

As part of her practice, Jayne advises companies on cybersecurity incident preparedness and response, including by drafting, revising, and testing incident response plans, conducting cybersecurity gap assessments, engaging vendors, and analyzing obligations under breach notification laws following an incident.

Olivia Dworkin

Olivia Dworkin minimizes regulatory and litigation risks for clients in the medical device, pharmaceutical, biotechnology, eCommerce, and digital health industries through strategic advice on complex FDA issues, helping to bring innovative products to market while ensuring regulatory compliance.

With a focus on cutting-edge medical technologies and digital health products and services, Olivia regularly helps new and established companies navigate a variety of state and federal regulatory, legislative, and compliance matters throughout the total product lifecycle. She has experience counseling clients on the development, FDA regulatory classification, and commercialization of digital health tools, including clinical decision support software, mobile medical applications, general wellness products, medical device data systems, administrative support software, and products that incorporate artificial intelligence, machine learning, and other emerging technologies.

Olivia also assists clients in advocating for legislative and regulatory policies that will support innovation and the safe deployment of digital health tools, including by drafting comments on proposed legislation, frameworks, whitepapers, and guidance documents. Olivia keeps close to the evolving regulatory landscape and is a frequent contributor to Covington’s Digital Health blog. Her work also has been featured in the Journal of Robotics, Artificial Intelligence & Law, Law360, and the Michigan Journal of Law and Mobility.

Prior to joining Covington, Olivia was a fellow at the University of Michigan Veterans Legal Clinic, where she gained valuable experience as the lead attorney successfully representing clients at case evaluations, mediations, and motion hearings. At Michigan Law, Olivia served as Online Editor of the Michigan Journal of Gender and Law, president of the Trial Advocacy Society, and president of the Michigan Law Mock Trial Team. She excelled in national mock trial competitions, earning two Medals for Excellence in Advocacy from the American College of Trial Lawyers and being selected as one of the top sixteen advocates in the country for an elite, invitation-only mock trial tournament.