This quarterly update summarizes key legislative and regulatory developments in the second quarter of 2023 related to Artificial Intelligence (“AI”), the Internet of Things (“IoT”), connected and automated vehicles (“CAVs”), data privacy and cybersecurity, and online teen safety.

Artificial Intelligence

AI continued to be an area of significant interest to both lawmakers and regulators throughout the second quarter of 2023.  Members of Congress continue to grapple with ways to address risks posed by AI and have held hearings, made public statements, and introduced legislation to regulate AI.  Notably, Senator Chuck Schumer (D-NY) revealed his “SAFE Innovation framework” for AI legislation.  The framework reflects five principles for AI – security, accountability, foundations, explainability, and innovation – and is summarized here.  A number of AI legislative proposals were also introduced this quarter.  Some proposals, like the National AI Commission Act (H.R. 4223) and the Digital Platform Commission Act (S. 1671), propose the creation of an agency or commission to review and regulate AI tools and systems.  Other proposals focus on mandating disclosure of the use of AI systems.  For example, the AI Disclosure Act of 2023 (H.R. 3831) would require generative AI systems to include a specific disclaimer on any outputs generated, and the REAL Political Advertisements Act (S. 1596) would require political advertisements to include a statement within the contents of the advertisement if generative AI was used to generate any image or video footage.  Additionally, Congress convened hearings to explore AI regulation this quarter, including a Senate Judiciary Committee hearing in May titled “Oversight of A.I.: Rules for Artificial Intelligence.”

There also were several federal Executive Branch and regulatory developments focused on AI in the second quarter of 2023, including, for example:

  • White House:  The White House issued a number of updates on AI this quarter, including the Office of Science and Technology Policy’s strategic plan focused on federal AI research and development, discussed in greater detail here.  The White House also requested comments on the use of automated tools in the workplace, including a request for feedback on tools to surveil, monitor, evaluate, and manage workers, described here.
  • CFPB:  The Consumer Financial Protection Bureau (“CFPB”) issued a spotlight on the adoption and use of chatbots by financial institutions.
  • FTC:  The Federal Trade Commission (“FTC”) continued to issue guidance on AI, such as guidance expressing the FTC’s view that dark patterns extend to AI, that generative AI poses competition concerns, and that tools claiming to spot AI-generated content must make accurate disclosures of their abilities and limitations.
  • HHS Office of National Coordinator for Health IT:  This quarter, the Department of Health and Human Services (“HHS”) released a proposed rule related to certified health IT that enables or interfaces with “predictive decision support interventions” (“DSIs”) that incorporate AI and machine learning technologies.  The proposed rule would require the disclosure of certain information about predictive DSIs to enable users to evaluate DSI quality and whether and how to rely on the DSI recommendations, including a description of the development and validation of the DSI.  Developers of certified health IT would also be required to implement risk management practices for predictive DSIs and make summary information about these practices publicly available.

Internet of Things

IoT developments in the second quarter of 2023 focused on children.  For example, Representative Kathy Castor (D-FL-14) introduced the Kids PRIVACY Act (H.R. 2801), which would amend the Children’s Online Privacy Protection Act (“COPPA”) to update and expand its coverage to encompass internet connected devices, including by requiring that entities subject to COPPA maintain appropriate safeguards for internet connected devices.

Connected and Automated Vehicles

The National Highway Traffic Safety Administration (“NHTSA”) had a busy quarter.  On April 6, 2023, NHTSA published a notice and request for comments on its intention to request approval from the Office of Management and Budget for an extension of its information collection efforts under its Automated Vehicle Transparency and Engagement for Safe Testing (“AV TEST”) Initiative, which involves the voluntary collection of information from entities testing vehicles equipped with automated driving systems (“ADS”) and from state and local authorities involved in the regulation of ADS testing.  The request would extend information collection efforts under the AV TEST Initiative for three additional years (beginning from the date of approval).  The comment period closed on June 5, 2023.  Additionally, on June 13, 2023, NHTSA issued a notice of proposed rulemaking proposing to adopt a new Federal Motor Vehicle Safety Standard to require automatic emergency braking (“AEB”) systems, including pedestrian AEB, on light vehicles.  The proposal would require that the AEB requirements be phased in within four years of publication of a final rule.  Comments are due on or before August 14, 2023.

Also on June 13, NHTSA sent a letter to 22 automakers instructing them not to comply with a Right to Repair law recently enacted in Massachusetts, citing “significant safety concerns” such as vehicle crashes, injuries, or deaths.  The law requires automakers to allow access to their vehicles’ data so owners can get their vehicles fixed by independent repairers (rather than dealership service centers) if they choose.  The letter from NHTSA claims the Massachusetts law is in conflict with, and is therefore preempted by, the Motor Vehicle Safety Act.  NHTSA alleges that “[o]pen access to vehicle manufacturers’ telematics offerings with the ability to remotely send commands allows for manipulation of systems on a vehicle, including safety-critical functions such as steering, acceleration or braking, as well as equipment required by Federal Motor Vehicle Safety Standards such as air bags and electronic stability control.”

There also were developments at the Federal Communications Commission (“FCC”) relating to Cellular Vehicle-to-Everything (“C-V2X”) technology.  C-V2X technology is expected to increase safety and reduce traffic congestion by enabling vehicles to sense and communicate with each other and with other devices.  In April, the FCC approved a joint waiver request that will allow certain entities to start deploying C-V2X technology. 

Finally, building on its 2022 Autonomous Vehicles Workshop, NIST announced that it will hold a follow-on virtual workshop entitled “Standards and Performance Metrics for On-Road Automated Vehicles” from September 5-8, 2023.  The purpose of the workshop is to update the CAV community on NIST’s recent work in the area, provide a forum for feedback, and set a path forward to ensure that NIST’s efforts to develop standards and performance metrics provide the greatest value to the community.  Registration is free and now open.

Privacy & Cybersecurity

There continued to be minimal traction in Congress to advance a privacy framework in the second quarter of 2023.  Despite various promises to reintroduce the American Data Privacy and Protection Act (“ADPPA”), the ADPPA has yet to be reintroduced this Congress.  Nevertheless, in late April, Representatives Anna Eshoo (D-CA-16) and Zoe Lofgren (D-CA-18) reintroduced the Online Privacy Act (H.R. 2701), a comprehensive privacy bill that would create consumer rights, privacy notice requirements, and information security requirements, and establish a digital privacy agency.  Federal lawmakers continue to express interest in a children’s privacy framework, as Senator Ed Markey (D-MA) introduced the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”) (S. 1418), though the bill has not advanced.

In contrast, there continued to be significant legislative focus on comprehensive privacy legislation at the state level.  State legislatures have passed a number of comprehensive privacy frameworks this quarter, including in Oregon, Texas, Indiana, and Delaware. Washington and Nevada passed consumer health data privacy bills.  Additionally, the Connecticut legislature passed amendments to the Connecticut Data Privacy Act, which add provisions related to health data and minors’ data.  

Although Congress did not advance significant cybersecurity legislation this quarter, there was substantial regulatory activity.  The Securities and Exchange Commission (“SEC”) published an update to its rulemaking agenda, indicating that its previously proposed cyber rules addressing disclosure requirements for incidents at publicly traded companies and registered investment advisers and funds may not be approved until October 2023.  Additional detail on this update is available here.  Additionally, the Cybersecurity and Infrastructure Security Agency released guidance on Security-by-Design and Security-by-Default principles for technology manufacturers.  The guidance was developed in coordination with the Federal Bureau of Investigation and the National Security Agency, as well as cybersecurity authorities in Australia, Canada, the United Kingdom, Germany, the Netherlands, and New Zealand.

Online Teen Safety

This quarter saw the introduction of a number of child and teen online safety bills at both the state and federal level.  In particular, lawmakers focused on age verification, parental consent, and heightened obligations for social media platforms with users under the age of 18.  For example, in April 2023, Governor Sarah Huckabee Sanders (R-AR) signed into law the Social Media Safety Act.  The bill’s passage made Arkansas the second state to enact broad restrictions on minors’ use of social media, following Utah’s passage of the Social Media Regulation Act in March 2023.  The law, which takes effect on September 1, prohibits covered social media companies from permitting individuals under 18 to create an account without the express consent of a parent or legal guardian and requires covered social media companies to verify the age of new users.  In Congress, a bipartisan coalition led by Senator Brian Schatz (D-HI) introduced the Protecting Kids on Social Media Act (S. 1291), which would require social media platforms to verify the age of their users, mandate parental consent for users under 18, and prohibit the use of algorithmic recommendation systems on individuals under 18.

We will continue to update you on meaningful developments in these quarterly updates and across our blogs.

Jorge Ortiz

Jorge Ortiz is an associate in the firm’s Washington, DC office and a member of the Data Privacy and Cybersecurity and the Technology and Communications Regulation Practice Groups.

Jorge advises clients on a broad range of privacy and cybersecurity issues, including topics related to privacy policies and compliance obligations under U.S. state privacy regulations like the California Consumer Privacy Act.

Shayan Karbassi

Shayan Karbassi helps clients across industries navigate complex national security and cybersecurity matters, including government and internal investigations, incident and crisis response, regulatory compliance, and litigation.

As part of his cyber practice, Shayan assists clients with cybersecurity incident response and notification obligations, government and internal investigations of False Claims Act (FCA) issues and insider threats, and compliance with new and evolving federal and state cybersecurity regulations. Shayan also advises U.S. government contractors on security compliance under U.S. national security laws and regulations including, among others, the National Industrial Security Program (NISPOM), Federal Risk and Authorization Management Program (FedRAMP), and other U.S. government cybersecurity regulations.

More broadly, Shayan helps clients navigate potential civil and criminal legal risks stemming from operations in certain high-risk jurisdictions. This includes advising clients on U.S. criminal and civil antiterrorism laws, conducting internal investigations of terrorism-financing and related issues, and litigating Anti-Terrorism Act (ATA) claims.

Shayan maintains an active pro bono litigation practice with a focus on human rights, freedom of information, and free media issues.

Before joining Covington, Shayan served as a member of the U.S. intelligence community, where he routinely provided strategic analysis to the President and other senior U.S. policymakers.

Olivia Dworkin

Olivia Dworkin minimizes regulatory and litigation risks for clients in the medical device, pharmaceutical, biotechnology, eCommerce, and digital health industries through strategic advice on complex FDA issues, helping to bring innovative products to market while ensuring regulatory compliance.

With a focus on cutting-edge medical technologies and digital health products and services, Olivia regularly helps new and established companies navigate a variety of state and federal regulatory, legislative, and compliance matters throughout the total product lifecycle. She has experience counseling clients on the development, FDA regulatory classification, and commercialization of digital health tools, including clinical decision support software, mobile medical applications, general wellness products, medical device data systems, administrative support software, and products that incorporate artificial intelligence, machine learning, and other emerging technologies.

Olivia also assists clients in advocating for legislative and regulatory policies that will support innovation and the safe deployment of digital health tools, including by drafting comments on proposed legislation, frameworks, whitepapers, and guidance documents. Olivia keeps close to the evolving regulatory landscape and is a frequent contributor to Covington’s Digital Health blog. Her work also has been featured in the Journal of Robotics, Artificial Intelligence & Law, Law360, and the Michigan Journal of Law and Mobility.

Prior to joining Covington, Olivia was a fellow at the University of Michigan Veterans Legal Clinic, where she gained valuable experience as the lead attorney successfully representing clients at case evaluations, mediations, and motion hearings. At Michigan Law, Olivia served as Online Editor of the Michigan Journal of Gender and Law, president of the Trial Advocacy Society, and president of the Michigan Law Mock Trial Team. She excelled in national mock trial competitions, earning two Medals for Excellence in Advocacy from the American College of Trial Lawyers and being selected as one of the top sixteen advocates in the country for an elite, invitation-only mock trial tournament.

Hensey A. Fenton III

Hensey Fenton specializes in providing advice and guidance to clients on legislative and regulatory strategies. Hensey counsels clients on a myriad of issues in the policy and regulatory space, including issues involving cybersecurity, financial services, artificial intelligence, digital assets, international trade and development, and tax.

Another facet of Hensey’s practice involves cutting-edge legal issues in the cybersecurity space. Having published scholarly work in the areas of cybersecurity and cyberwarfare, Hensey keeps his finger on the pulse of this fast-developing legal field. His Duke Journal of Comparative & International Law article, “Proportionality and its Applicability in the Realm of Cyber Attacks,” was highlighted by the Rutgers Computer and Technology Law Journal as one of the most important and timely articles on cyber, technology and the law. Hensey counsels clients on preparing for and responding to cyber-based attacks. He regularly engages with government and military leaders to develop national and global strategies for complex cyber issues and policy challenges.

Hensey’s practice also includes advising international clients on various policy, legal and regulatory challenges, especially those challenges facing developing nations in the Middle East. Armed with a distinct expertise in Middle Eastern foreign policy and the Arabic language, Hensey brings a multi-faceted approach to his practice, recognizing the specific policy and regulatory concerns facing clients in the region.

Hensey is also at the forefront of important issues involving Diversity, Equity and Inclusion (DEI). He assists companies in developing inclusive and sustainable DEI strategies that align with and incorporate core company values and business goals.

Prior to joining Covington, Hensey served as a Judicial Law Clerk for the Honorable Judge Johnnie B. Rawlinson, United States Court of Appeals for the Ninth Circuit. He also served as a Diplomatic Fellow in the Kurdistan Regional Government’s Representation (i.e. Embassy) in Washington, DC.

Jemie Fofanah

Jemie Fofanah is an associate in the firm’s Washington, DC office. She is a member of the Privacy and Cybersecurity Practice Group and the Technology and Communication Regulatory Practice Group. She also maintains an active pro bono practice with a focus on criminal defense and family law.

Jennifer Johnson

Jennifer Johnson is a partner specializing in communications, media and technology matters who serves as Co-Chair of Covington’s Technology Industry Group and its global and multi-disciplinary Artificial Intelligence (AI) and Internet of Things (IoT) Groups. She represents and advises technology companies, content distributors, television companies, trade associations, and other entities on a wide range of media and technology matters. Jennifer has three decades of experience advising clients in the communications, media and technology sectors, and has held leadership roles in these practices for more than twenty years. On technology issues, she collaborates with Covington’s global, multi-disciplinary team to assist companies navigating the complex statutory and regulatory constructs surrounding this evolving area, including product counseling and technology transactions related to connected and autonomous vehicles, internet connected devices, artificial intelligence, smart ecosystems, and other IoT products and services. Jennifer serves on the Board of Editors of The Journal of Robotics, Artificial Intelligence & Law.

Jennifer assists clients in developing and pursuing strategic business and policy objectives before the Federal Communications Commission (FCC) and Congress and through transactions and other business arrangements. She regularly advises clients on FCC regulatory matters and advocates frequently before the FCC. Jennifer has extensive experience negotiating content acquisition and distribution agreements for media and technology companies, including program distribution agreements, network affiliation and other program rights agreements, and agreements providing for the aggregation and distribution of content on over-the-top app-based platforms. She also assists investment clients in structuring, evaluating, and pursuing potential investments in media and technology companies.

Nicholas Xenakis

Nick Xenakis draws on his Capitol Hill and legal experience to provide public policy and crisis management counsel to clients in a range of industries.

Nick assists clients in developing and implementing policy solutions to litigation and regulatory matters, including on issues involving antitrust, artificial intelligence, bankruptcy, criminal justice, financial services, immigration, intellectual property, life sciences, national security, and technology. He also represents companies and individuals in investigations before U.S. Senate and House Committees.

Nick previously served as General Counsel for the U.S. Senate Judiciary Committee, where he managed committee staff and directed legislative efforts. He also participated in key judicial and Cabinet confirmations, including of Attorneys General and Supreme Court Justices. Before his time on Capitol Hill, Nick served as an attorney with the Federal Public Defender’s Office for the Eastern District of Virginia.

Jayne Ponder

Jayne Ponder provides strategic advice to national and multinational companies across industries on existing and emerging data privacy, cybersecurity, and artificial intelligence laws and regulations.

Jayne’s practice focuses on helping clients launch and improve products and services that involve laws governing data privacy, artificial intelligence, sensitive data and biometrics, marketing and online advertising, connected devices, and social media. For example, Jayne regularly advises clients on the California Consumer Privacy Act, Colorado AI Act, and the developing patchwork of U.S. state data privacy and artificial intelligence laws. She advises clients on drafting consumer notices, designing consent flows and consumer choices, drafting and negotiating commercial terms, building consumer rights processes, and undertaking data protection impact assessments. In addition, she routinely partners with clients on the development of risk-based privacy and artificial intelligence governance programs that reflect the dynamic regulatory environment and incorporate practical mitigation measures.

Jayne routinely represents clients in enforcement actions brought by the Federal Trade Commission and state attorneys general, particularly in areas related to data privacy, artificial intelligence, advertising, and cybersecurity. Additionally, she helps clients to advance advocacy in rulemaking processes led by federal and state regulators on data privacy, cybersecurity, and artificial intelligence topics.

As part of her practice, Jayne also advises companies on cybersecurity incident preparedness and response, including by drafting, revising, and testing incident response plans, conducting cybersecurity gap assessments, engaging vendors, and analyzing obligations under breach notification laws following an incident.

Jayne maintains an active pro bono practice, including assisting small and nonprofit entities with data privacy topics and elder estate planning.