This quarterly update summarizes key legislative and regulatory developments in the second quarter of 2023 related to key technologies and related topics, including Artificial Intelligence (“AI”), the Internet of Things (“IoT”), connected and automated vehicles (“CAVs”), data privacy and cybersecurity, and online teen safety.

Artificial Intelligence

AI continued to be an area of significant interest to both lawmakers and regulators throughout the second quarter of 2023.  Members of Congress continued to grapple with ways to address the risks posed by AI, holding hearings, making public statements, and introducing legislation to regulate the technology.  Notably, Senator Chuck Schumer (D-NY) unveiled his “SAFE Innovation framework” for AI legislation.  The framework reflects five principles for AI – security, accountability, foundations, explainability, and innovation – and is summarized here.  A number of AI legislative proposals were also introduced this quarter.  Some, like the National AI Commission Act (H.R. 4223) and the Digital Platform Commission Act (S. 1671), propose the creation of an agency or commission to review and regulate AI tools and systems.  Others focus on mandating disclosures for AI systems.  For example, the AI Disclosure Act of 2023 (H.R. 3831) would require generative AI systems to include a specific disclaimer on any outputs generated, and the REAL Political Advertisements Act (S. 1596) would require political advertisements to include a statement within the contents of the advertisement if generative AI was used to generate any image or video footage.  Additionally, Congress convened hearings to explore AI regulation this quarter, including a Senate Judiciary Committee hearing in May titled “Oversight of A.I.: Rules for Artificial Intelligence.”

There also were several federal Executive Branch and regulatory developments focused on AI in the second quarter of 2023, including, for example:

  • White House:  The White House issued a number of updates on AI this quarter, including the Office of Science and Technology Policy’s strategic plan focused on federal AI research and development, discussed in greater detail here.  The White House also requested comments on the use of automated tools in the workplace, including a request for feedback on tools to surveil, monitor, evaluate, and manage workers, described here.
  • CFPB:  The Consumer Financial Protection Bureau (“CFPB”) issued a spotlight on the adoption and use of chatbots by financial institutions.
  • FTC:  The Federal Trade Commission (“FTC”) continued to issue guidance on AI, such as guidance expressing the FTC’s view that dark patterns extend to AI, that generative AI poses competition concerns, and that tools claiming to spot AI-generated content must make accurate disclosures of their abilities and limitations.
  • HHS Office of National Coordinator for Health IT:  This quarter, the Department of Health and Human Services (“HHS”) released a proposed rule related to certified health IT that enables or interfaces with “predictive decision support interventions” (“DSIs”) that incorporate AI and machine learning technologies.  The proposed rule would require the disclosure of certain information about predictive DSIs to enable users to evaluate DSI quality and whether and how to rely on the DSI recommendations, including a description of the development and validation of the DSI.  Developers of certified health IT would also be required to implement risk management practices for predictive DSIs and make summary information about these practices publicly available.

Internet of Things

IoT developments in the second quarter of 2023 focused on children.  For example, Representative Kathy Castor (D-FL-14) introduced the Kids PRIVACY Act (H.R. 2801), which would amend the Children’s Online Privacy Protection Act (“COPPA”) to update and expand its coverage to encompass internet connected devices, including by requiring that entities subject to COPPA maintain appropriate safeguards for such devices.

Connected and Automated Vehicles

The National Highway Traffic Safety Administration (“NHTSA”) had a busy quarter.  On April 6, 2023, NHTSA published a notice and request for comments on its intention to request approval from the Office of Management and Budget for an extension of its information collection efforts under its Automated Vehicle Transparency Engagement for Safe Testing (“AV TEST”) Initiative, which involves the voluntary collection of information from entities testing vehicles equipped with automated driving systems (“ADS”) and from states and local authorities involved in the regulation of ADS testing.  The request is to extend information collection efforts under the AV TEST Initiative for three additional years (beginning from the date of approval).  The comment period closed on June 5, 2023.  Additionally, on June 13, 2023, NHTSA issued a notice of proposed rulemaking proposing to adopt a new Federal Motor Vehicle Safety Standard to require automatic emergency braking (“AEB”), including pedestrian AEB, systems on light vehicles.  This proposal would require that AEB requirements be phased in within four years of publication of a final rule.  Comments are due on or before August 14, 2023.

Also on June 13, NHTSA sent a letter to 22 automakers instructing them not to comply with a Right to Repair law recently enacted in Massachusetts, citing “significant safety concerns” such as vehicle crashes, injuries, or deaths.  The law requires automakers to allow access to their vehicles’ data so owners can get their vehicles fixed by independent repairers (rather than dealership service centers) if they choose.  The letter from NHTSA claims the Massachusetts law is in conflict with, and is therefore preempted by, the Motor Vehicle Safety Act.  NHTSA alleges that “[o]pen access to vehicle manufacturers’ telematics offerings with the ability to remotely send commands allows for manipulation of systems on a vehicle, including safety-critical functions such as steering, acceleration or braking, as well as equipment required by Federal Motor Vehicle Safety Standards such as air bags and electronic stability control.”

There also were developments at the Federal Communications Commission (“FCC”) relating to Cellular Vehicle-to-Everything (“C-V2X”) technology.  C-V2X technology is expected to increase safety and reduce traffic congestion by enabling vehicles to sense and communicate with each other and with other devices.  In April, the FCC approved a joint waiver request that will allow certain entities to start deploying C-V2X technology. 

Finally, building on a 2022 Autonomous Vehicles Workshop, NIST announced that it will hold a follow-on virtual workshop entitled “Standards and Performance Metrics for On-Road Automated Vehicles” from September 5-8, 2023.  The workshop aims to update the CAV community on NIST’s recent work in the area, offer a forum for feedback, and chart a path forward to ensure that NIST’s efforts in developing standards and performance metrics provide the greatest value to the community.  Registration is free and now open.

Privacy & Cybersecurity

There continued to be minimal traction in Congress to advance a privacy framework in the second quarter of 2023.  Despite various promises to reintroduce the American Data Privacy and Protection Act (“ADPPA”), the ADPPA has yet to be reintroduced this Congress.  Nevertheless, in late April, Representatives Anna Eshoo (D-CA-16) and Zoe Lofgren (D-CA-18) reintroduced the Online Privacy Act (H.R. 2701)—a comprehensive privacy bill that would create consumer rights, privacy notice requirements, information security requirements, and a digital privacy agency.  Federal lawmakers continue to express interest in a children’s privacy framework, as Senator Ed Markey (D-MA) introduced the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”) (S. 1418), though the bill has not advanced.

In contrast, there continued to be significant legislative focus on comprehensive privacy legislation at the state level.  State legislatures have passed a number of comprehensive privacy frameworks this quarter, including in Oregon, Texas, Indiana, and Delaware. Washington and Nevada passed consumer health data privacy bills.  Additionally, the Connecticut legislature passed amendments to the Connecticut Data Privacy Act, which add provisions related to health data and minors’ data.  

Although Congress did not advance significant cybersecurity legislation this quarter, there was significant regulatory activity.  The Securities and Exchange Commission (“SEC”) published an update to its rulemaking agenda, indicating that its previously proposed cyber rules addressing disclosure requirements for incidents at publicly traded companies and registered investment advisers and funds may not be approved until October 2023.  Additional detail on this update is available here.  Additionally, the Cybersecurity and Infrastructure Security Agency released guidance on Security-by-Design and Security-by-Default principles for technology manufacturers.  The guidance was developed in coordination with the Federal Bureau of Investigation and the National Security Agency, as well as cybersecurity authorities in Australia, Canada, the United Kingdom, Germany, the Netherlands, and New Zealand.

Online Teen Safety

This quarter saw the introduction of a number of child and teen online safety bills at both the state and federal levels.  In particular, lawmakers focused on age verification, parental consent, and heightened obligations for social media platforms with users under the age of 18.  For example, in April 2023, Governor Sarah Huckabee Sanders (R-AR) signed the Social Media Safety Act into law, making Arkansas the second state to enact broad restrictions on minors’ social media use, following Utah’s passage of the Social Media Regulation Act in March 2023.  The law, which takes effect on September 1, prohibits covered social media companies from permitting individuals under 18 to create an account without the express consent of a parent or legal guardian and requires covered companies to verify the age of new users.  In Congress, a bipartisan coalition led by Senator Brian Schatz (D-HI) introduced the Protecting Kids on Social Media Act (S. 1291), which would require social media platforms to verify the age of their users, mandate parental consent for users under 18, and prohibit the use of algorithmic recommendation systems on individuals under 18.

We will continue to update you on meaningful developments in these quarterly updates and across our blogs.

Jorge Ortiz is an associate in the firm’s Washington, DC office and a member of the Data Privacy and Cybersecurity and the Technology and Communications Regulation Practice Groups.

Jorge advises clients on a broad range of privacy and cybersecurity issues, including topics related to privacy policies and compliance obligations under U.S. state privacy regulations like the California Consumer Privacy Act.

Shayan Karbassi is an associate in the firm’s Washington, DC office. He represents and advises clients on a range of cybersecurity and national security issues. As a part of his cybersecurity practice, Shayan assists clients with cyber and data security incident response and preparedness, government and internal investigations, and regulatory compliance. He also regularly advises clients with respect to risks stemming from U.S. criminal and civil anti-terrorism laws and other national security issues, to include investigating allegations of terrorism-financing and litigating Anti-Terrorism Act claims.

Shayan maintains an active pro bono litigation practice with a focus on human rights, freedom of information, and free media issues.

Prior to joining the firm, Shayan worked in the U.S. national security community.

Olivia Dworkin minimizes regulatory and litigation risks for clients in the medical device, pharmaceutical, biotechnology, eCommerce, and digital health industries through strategic advice on complex FDA issues, helping to bring innovative products to market while ensuring regulatory compliance.

With a focus on cutting-edge medical technologies and digital health products and services, Olivia regularly helps new and established companies navigate a variety of state and federal regulatory, legislative, and compliance matters throughout the total product lifecycle. She has experience counseling clients on the development, FDA regulatory classification, and commercialization of digital health tools, including clinical decision support software, mobile medical applications, general wellness products, medical device data systems, administrative support software, and products that incorporate artificial intelligence, machine learning, and other emerging technologies.

Olivia also assists clients in advocating for legislative and regulatory policies that will support innovation and the safe deployment of digital health tools, including by drafting comments on proposed legislation, frameworks, whitepapers, and guidance documents. Olivia keeps close to the evolving regulatory landscape and is a frequent contributor to Covington’s Digital Health blog. Her work also has been featured in the Journal of Robotics, Artificial Intelligence & Law, Law360, and the Michigan Journal of Law and Mobility.

Prior to joining Covington, Olivia was a fellow at the University of Michigan Veterans Legal Clinic, where she gained valuable experience as the lead attorney successfully representing clients at case evaluations, mediations, and motion hearings. At Michigan Law, Olivia served as Online Editor of the Michigan Journal of Gender and Law, president of the Trial Advocacy Society, and president of the Michigan Law Mock Trial Team. She excelled in national mock trial competitions, earning two Medals for Excellence in Advocacy from the American College of Trial Lawyers and being selected as one of the top sixteen advocates in the country for an elite, invitation-only mock trial tournament.

Hensey Fenton specializes in providing advice and guidance to clients on legislative and regulatory strategies. Hensey counsels clients on a myriad of issues in the policy and regulatory space, including issues involving cybersecurity, financial services, artificial intelligence, digital assets, international trade and development, and tax.

Another facet of Hensey’s practice involves cutting-edge legal issues in the cybersecurity space. Having published scholarly work in the areas of cybersecurity and cyberwarfare, Hensey keeps his finger on the pulse of this fast-developing legal field. His Duke Journal of Comparative & International Law article, “Proportionality and its Applicability in the Realm of Cyber Attacks,” was highlighted by the Rutgers Computer and Technology Law Journal as one of the most important and timely articles on cyber, technology and the law. Hensey counsels clients on preparing for and responding to cyber-based attacks. He regularly engages with government and military leaders to develop national and global strategies for complex cyber issues and policy challenges.

Hensey’s practice also includes advising international clients on various policy, legal, and regulatory challenges, especially those facing developing nations in the Middle East.  Armed with distinct expertise in Middle Eastern foreign policy and the Arabic language, Hensey brings a multi-faceted approach to his practice, recognizing the specific policy and regulatory concerns facing clients in the region.

Hensey is also at the forefront of important issues involving Diversity, Equity and Inclusion (DEI). He assists companies in developing inclusive and sustainable DEI strategies that align with and incorporate core company values and business goals.

Prior to joining Covington, Hensey served as a Judicial Law Clerk for the Honorable Judge Johnnie B. Rawlinson, United States Court of Appeals for the Ninth Circuit. He also served as a Diplomatic Fellow in the Kurdistan Regional Government’s Representation (i.e. Embassy) in Washington, DC.

Jemie Fofanah is an associate in the firm’s Washington, DC office. She is a member of the Privacy and Cybersecurity Practice Group and the Technology and Communication Regulatory Practice Group. She also maintains an active pro bono practice with a focus on criminal defense and family law.

Jennifer Johnson is a partner specializing in communications, media and technology matters who serves as Co-Chair of Covington’s Technology Industry Group and its global and multi-disciplinary Artificial Intelligence (AI) and Internet of Things (IoT) Groups. She represents and advises technology companies, content distributors, television companies, trade associations, and other entities on a wide range of media and technology matters. Jennifer has almost three decades of experience advising clients in the communications, media and technology sectors, and has held leadership roles in these practices for almost twenty years. On technology issues, she collaborates with Covington’s global, multi-disciplinary team to assist companies navigating the complex statutory and regulatory constructs surrounding this evolving area, including product counseling and technology transactions related to connected and autonomous vehicles, internet connected devices, artificial intelligence, smart ecosystems, and other IoT products and services. Jennifer serves on the Board of Editors of The Journal of Robotics, Artificial Intelligence & Law.

Jennifer assists clients in developing and pursuing strategic business and policy objectives before the Federal Communications Commission (FCC) and Congress and through transactions and other business arrangements. She regularly advises clients on FCC regulatory matters and advocates frequently before the FCC. Jennifer has extensive experience negotiating content acquisition and distribution agreements for media and technology companies, including program distribution agreements, network affiliation and other program rights agreements, and agreements providing for the aggregation and distribution of content on over-the-top app-based platforms. She also assists investment clients in structuring, evaluating, and pursuing potential investments in media and technology companies.

Nick Xenakis draws on his Capitol Hill experience to provide regulatory and legislative advice to clients in a range of industries, including technology. He has particular expertise in matters involving the Judiciary Committees, such as intellectual property, antitrust, national security, immigration, and criminal justice.

Nick joined the firm’s Public Policy practice after serving most recently as Chief Counsel for Senator Dianne Feinstein (D-CA) and Staff Director of the Senate Judiciary Committee’s Human Rights and the Law Subcommittee, where he was responsible for managing the subcommittee and Senator Feinstein’s Judiciary staff. He also advised the Senator on all nominations, legislation, and oversight matters before the committee.

Previously, Nick was the General Counsel for the Senate Judiciary Committee, where he managed committee staff and directed legislative and policy efforts on all issues in the Committee’s jurisdiction. He also participated in key judicial and Cabinet confirmations, including of an Attorney General and two Supreme Court Justices. Nick was also responsible for managing a broad range of committee equities in larger legislation, including appropriations, COVID-relief packages, and the National Defense Authorization Act.

Before his time on Capitol Hill, Nick served as an attorney with the Federal Public Defender’s Office for the Eastern District of Virginia. There he represented indigent clients charged with misdemeanor, felony, and capital offenses in federal court throughout all stages of litigation, including trial and appeal. He also coordinated district-wide habeas litigation following the Supreme Court’s decision in Johnson v. United States (invalidating the residual clause of the Armed Career Criminal Act).

Jayne Ponder counsels national and multinational companies across industries on data privacy, cybersecurity, and emerging technologies, including Artificial Intelligence and Internet of Things.

In particular, Jayne advises clients on compliance with federal, state, and global privacy frameworks, and counsels clients on navigating the rapidly evolving legal landscape. Her practice includes partnering with clients on the design of new products and services, drafting and negotiating privacy terms with vendors and third parties, developing privacy notices and consent forms, and helping clients design governance programs for the development and deployment of Artificial Intelligence and Internet of Things technologies.

Jayne routinely represents clients in privacy and consumer protection enforcement actions brought by the Federal Trade Commission and state attorneys general, including related to data privacy and advertising topics. She also helps clients articulate their perspectives through the rulemaking processes led by state regulators and privacy agencies.

As part of her practice, Jayne advises companies on cybersecurity incident preparedness and response, including by drafting, revising, and testing incident response plans, conducting cybersecurity gap assessments, engaging vendors, and analyzing obligations under breach notification laws following an incident.