Introduction

In this update, we detail the key legislative developments in the second quarter of 2021 related to artificial intelligence (“AI”), the Internet of Things (“IoT”), connected and automated vehicles (“CAVs”), and federal privacy legislation. As we recently covered on May 12, President Biden signed an Executive Order to strengthen the federal government’s ability to respond to and prevent cybersecurity threats, including by removing obstacles to sharing threat information between private sector entities and federal agencies and by modernizing federal systems. On the Hill, lawmakers have introduced a number of proposals to regulate AI, IoT, CAVs, and privacy.

Artificial Intelligence

In Q2, members of Congress introduced a variety of legislative proposals to regulate AI—ranging from light touches to more prescriptive approaches.

  • A number of bills would provide funding for AI-related research and training. Most notably, the United States Innovation and Competition Act of 2021 (S. 1260), introduced by Senator Chuck Schumer (D-NY) and passed by the Senate, would invest more than $200 billion in U.S. scientific and technological innovation over the next five years. In particular, the bill would create a Directorate for Technology and Innovation within the National Science Foundation to research AI and machine learning, among other areas. The Artificial Intelligence for the Military Act of 2021 (S. 1776), introduced by Senator Rob Portman (R-OH), would require professional military education curricula to incorporate coursework on emerging technologies, such as AI.
  • Several bills introduced this quarter focus on the privacy implications of AI. For instance, the Mind Your Own Business Act of 2021 (S. 1444), introduced by Senator Ron Wyden (D-OR), would authorize the Federal Trade Commission (“FTC”) to create regulations requiring covered entities to, among other requirements, conduct impact assessments of “high-risk automated decision systems” (which include certain AI tools) and “high-risk information systems” that “pose a significant risk to the privacy or security” of consumers’ personal information. Likewise, the Algorithmic Justice and Online Platform Transparency Act of 2021 (S. 1896), introduced by Senator Ed Markey (D-MA), would require online platforms to describe to users the types of algorithmic processes they employ and the information they collect; publish annual public reports detailing their content moderation practices; and maintain detailed records describing their algorithmic processes for review by the FTC.

Agencies have mirrored Congressional activities, focusing on additional investments to research AI developments and rules on the use of AI.

  • The FTC released guidance suggesting that biased AI may violate: Section 5 of the FTC Act (“The FTC Act prohibits unfair or deceptive practices. That would include the sale or use of – for example – racially biased algorithms.”); the Fair Credit Reporting Act (“The FCRA comes into play in certain circumstances where an algorithm is used to deny people employment, housing, credit, insurance, or other benefits.”); and the Equal Credit Opportunity Act (“The ECOA makes it illegal for a company to use a biased algorithm that results in credit discrimination on the basis of race, color, religion, national origin, sex, marital status, age, or because a person receives public assistance.”).
  • The Office of Management and Budget submitted its discretionary funding request, which seeks $916 million, an increase of $128 million over the 2021 enacted level, to expand scientific and technological research at the National Institute of Standards and Technology (“NIST”) on AI and quantum information science. The discretionary request also would establish a new Directorate for Technology, Innovation, and Partnerships within the National Science Foundation to expedite technology development in AI and quantum information science.

Internet of Things

Federal lawmakers introduced a number of proposals focused on directing federal agencies to study technological and security challenges related to IoT, including with respect to national security, disclosures to consumers, and cybersecurity certification programs.

  • Several bills introduced this quarter focus on addressing national security challenges related to IoT, such as the Internet of Things Readiness Act of 2021 (H.R. 981), introduced by Representative Suzan DelBene (D-WA-1), the Strengthening Trade, Regional Alliances, Technology, and Economic and Geopolitical Initiatives Concerning China (“STRATEGIC”) Act (S. 687), introduced by Senator James Risch (R-ID), and the Securing American Leadership in Science and Technology (“SALSTA”) Act of 2021 (H.R. 2153), introduced by Representative Frank Lucas (R-OK-3). These bills would develop standards for secure IoT development and create working groups focused on building U.S. leadership in IoT.
  • Other developments this quarter focused on consumer labels for IoT devices. Representative John Curtis (R-UT-3) introduced the Informing Consumers about Smart Devices Act (H.R. 3898), which would require the FTC to work alongside industry leaders to establish guidelines for properly disclosing when products contain audio or visual recording capabilities that would not be clearly obvious to a reasonable person. Notably, President Biden signed the Executive Order on Improving the Nation’s Cybersecurity, which directs NIST to create pilot programs to establish criteria for product labels that educate consumers about the cybersecurity capabilities of IoT devices and requires NIST to work with the FTC on such consumer labels.
  • Another proposal focuses on cybersecurity certification programs for IoT devices. Senator Ed Markey (D-MA) introduced the Cyber Shield Act of 2021 (S. 965), which would create a voluntary cybersecurity certification program for IoT devices, including laptops, cameras, and cell phones.

Connected and Autonomous Vehicles

Federal lawmakers focused legislative proposals on the safe deployment of CAVs, including proposals focused on federal frameworks for CAVs and requirements for CAV development.

  • Representative Bob Latta (R-OH-5) introduced the Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution Act (“SELF DRIVE Act”) (H.R. 3711), which would create a federal framework to assist agencies and industry in deploying CAVs around the country and would establish a Highly Automated Vehicle Advisory Council within the National Highway Traffic Safety Administration (“NHTSA”).
  • Other proposals focus on technical requirements for CAVs. The Surface Transportation Investment Act of 2021 (S. 2016), introduced by Senator Maria Cantwell (D-WA), would require new automobiles to be equipped with (i) an automatic braking system that alerts the driver if the distance to a vehicle or object ahead is closing too quickly and a collision is imminent and (ii) an automated lane departure system that warns the driver to maintain the lane of travel and corrects course if the driver fails to do so.
  • Lawmakers also introduced grant programs this quarter to encourage the development of CAVs. The Surface Transportation Advanced through Reform, Technology, and Efficient Review (“STARTER”) Act 2.0 (H.R. 3341), introduced by Representative Sam Graves (R-MO-6), would establish new competitive grant programs related to connected vehicle deployment and the safe integration of automated driving systems.

There were a number of updates related to CAV safety in the regulatory space as well:

  • In early June, the Department of Transportation released its Spring regulatory agenda, which emphasized creating a safe and predictable environment for CAVs by requiring rigorous testing standards for CAVs and setting up a national incident database to document crashes involving CAVs. Related to this database, NHTSA issued an order on June 29 requiring manufacturers and operators of vehicles equipped with certain automated driving features to report certain crashes to the agency.
  • In response to an Advance Notice of Proposed Rulemaking (“ANPRM”) on a Framework for Automated Driving System Safety, NHTSA received public comments on the key components that can meet the need for motor vehicle safety while enabling vehicle decisions for the four primary functions of automated driving systems: sensing, perception, planning, and control.

Data Privacy

Legislators introduced a number of privacy bills in Q2, including comprehensive data privacy proposals with new consent requirements and new approaches to enforcement, and other bills focused on specific topical areas related to privacy, such as the Fourth Amendment and children’s privacy.

  • A number of bills introduced this quarter focus on consent requirements in a comprehensive data privacy framework. One of these proposals, the Information Transparency and Personal Data Control Act (H.R. 1816), introduced by Representative Suzan DelBene (D-WA-1), would require affirmative, express consent to sell, share, or disclose sensitive personal information and would require covered entities to provide consumers with the option to opt out of the use of their non-sensitive personal information at any time. Relatedly, Senator Jerry Moran (R-KS) introduced the Consumer Data Privacy and Security Act (S. 1494), which would require covered entities to provide consumers with a means to withdraw consent at any time.
  • Other federal privacy legislative proposals this quarter would give the FTC and state attorneys general authority to enforce the law, though one proposal would create a new, independent Data Protection Agency to enforce federal privacy law in the U.S. The Data Protection Act of 2021 (S. 2134), introduced by Senator Kirsten Gillibrand (D-NY), would prohibit unfair, deceptive, or discriminatory data practices, and the new agency would review the privacy implications of any merger that involves the data of at least 50,000 users.
  • Other data privacy legislative proposals introduced this quarter focused on specific topical areas, such as the Fourth Amendment and children’s privacy. The Fourth Amendment Is Not For Sale Act (S. 1265), introduced by Senator Ron Wyden (D-OR) and a bipartisan group of 19 co-sponsors, would limit the federal government’s ability to purchase or obtain data, including metadata, from data brokers. The Children and Teens’ Online Privacy Protection Act (S. 1628), introduced by Senator Ed Markey (D-MA), would amend the Children’s Online Privacy Protection Act to prohibit companies from collecting information on children aged 13 to 15 without their consent. The bill also would ban targeted advertising directed at children, require the creation of an “erase button” to allow users to delete a child’s personal information, and establish a Youth Marketing and Privacy Division at the FTC.

We will continue to update you on meaningful developments in these quarterly updates and across our blogs.

 

Jennifer Johnson

Jennifer Johnson is a partner specializing in communications, media and technology matters who serves as Co-Chair of Covington’s Technology Industry Group and its global and multi-disciplinary Artificial Intelligence (AI) and Internet of Things (IoT) Groups. She represents and advises technology companies, content distributors, television companies, trade associations, and other entities on a wide range of media and technology matters. Jennifer has almost three decades of experience advising clients in the communications, media and technology sectors, and has held leadership roles in these practices for almost twenty years. On technology issues, she collaborates with Covington’s global, multi-disciplinary team to assist companies navigating the complex statutory and regulatory constructs surrounding this evolving area, including product counseling and technology transactions related to connected and autonomous vehicles, internet connected devices, artificial intelligence, smart ecosystems, and other IoT products and services. Jennifer serves on the Board of Editors of The Journal of Robotics, Artificial Intelligence & Law.

Jennifer assists clients in developing and pursuing strategic business and policy objectives before the Federal Communications Commission (FCC) and Congress and through transactions and other business arrangements. She regularly advises clients on FCC regulatory matters and advocates frequently before the FCC. Jennifer has extensive experience negotiating content acquisition and distribution agreements for media and technology companies, including program distribution agreements, network affiliation and other program rights agreements, and agreements providing for the aggregation and distribution of content on over-the-top app-based platforms. She also assists investment clients in structuring, evaluating, and pursuing potential investments in media and technology companies.

Jayne Ponder

Jayne Ponder counsels national and multinational companies across industries on data privacy, cybersecurity, and emerging technologies, including Artificial Intelligence and Internet of Things.

In particular, Jayne advises clients on compliance with federal, state, and global privacy frameworks, and counsels clients on navigating the rapidly evolving legal landscape. Her practice includes partnering with clients on the design of new products and services, drafting and negotiating privacy terms with vendors and third parties, developing privacy notices and consent forms, and helping clients design governance programs for the development and deployment of Artificial Intelligence and Internet of Things technologies.

Jayne routinely represents clients in privacy and consumer protection enforcement actions brought by the Federal Trade Commission and state attorneys general, including related to data privacy and advertising topics. She also helps clients articulate their perspectives through the rulemaking processes led by state regulators and privacy agencies.

As part of her practice, Jayne advises companies on cybersecurity incident preparedness and response, including by drafting, revising, and testing incident response plans, conducting cybersecurity gap assessments, engaging vendors, and analyzing obligations under breach notification laws following an incident.

Andrew Longhi

Andrew Longhi advises national and multinational companies across industries on a wide range of regulatory, compliance, and enforcement matters involving data privacy, telecommunications, and emerging technologies.

Andrew’s practice focuses on advising clients on how to navigate the rapidly evolving legal landscape of state, federal, and international data protection laws. He proactively counsels clients on the substantive requirements introduced by new laws and shifting enforcement priorities. In particular, Andrew routinely supports clients in their efforts to launch new products and services that implicate the laws governing the use of data, connected devices, biometrics, and telephone and email marketing.

Andrew assesses privacy and cybersecurity risk as a part of diligence in complex corporate transactions where personal data is a key asset or data processing issues are otherwise material. He also provides guidance on generative AI issues, including privacy, Section 230, age-gating, product liability, and litigation risk, and has drafted standards and guidelines for large-language machine-learning models to follow. Andrew focuses on providing risk-based guidance that can keep pace with evolving legal frameworks.

Lindsay Brewer

Lindsay advises clients on environmental, human rights, product safety, and public policy matters.

She counsels clients seeking to set sustainability goals; track their progress on environmental, social, and governance topics; and communicate their achievements to external stakeholders in a manner that mitigates legal risk. She also advises clients seeking to engage with regulators and policymakers on environmental policy. Lindsay has extensive experience advising clients on making environmental disclosures and public marketing claims related to their products and services, including under the FTC’s Green Guides and state consumer protection laws.

Lindsay’s legal and regulatory advice spans a range of topics, including climate, air, water, human rights, environmental justice, and product safety and stewardship. She has experience with a wide range of environmental and safety regimes, including the Federal Trade Commission Act, the Clean Air Act, the Consumer Product Safety Act, the Federal Motor Vehicle Safety Standards, and the Occupational Safety and Health Act. Lindsay works with companies of various sizes and across multiple sectors, including technology, energy, financial services, and consumer products.

Nira Pandya

Nira Pandya is a member of the firm’s Technology and IP Transactions Practice Group in Boston.

With a broad practice that spans a variety of industries, Nira routinely advises clients with their most complex commercial transactions and strategic collaborations involving technology, intellectual property, and data, with a focus on issues around IP ownership and licensing, artificial intelligence, software development, and information technology services.

As a member of the firm’s Digital Health Initiative, Nira counsels pharmaceutical, medical device, healthcare, and technology clients on commercial and intellectual property considerations that arise in partnerships and collaborations at the intersection of life sciences and technology.

Nira leverages in-house experience gained during her secondment to a leading technology company, where she partnered with business clients and translated legal advice into practical solutions. Prior to joining the firm’s Technology and IP Transactions practice group, Nira advised private and public companies on mergers and acquisitions, joint ventures, strategic investments, and other corporate transactions.