On September 28, 2022, the European Commission published its long-promised proposal for an AI Liability Directive.  The draft Directive is intended to complement the EU AI Act, which the EU’s institutions are still negotiating.  In parallel, the European Commission also published its proposal to update the EU’s 1985 Product Liability Directive.  If adopted, the proposals will change the liability rules for software and AI systems in the EU.

The draft AI Liability Directive establishes rules applicable to non-contractual, fault-based civil claims involving AI systems.  Specifically, the proposal establishes rules that would govern the preservation and disclosure of evidence in cases involving high-risk AI, as well as rules on the burden of proof and corresponding rebuttable presumptions.  If adopted as proposed, the draft AI Liability Directive will apply to damages that occur two years or more after the Directive enters into force; five years after its entry into force, the Commission will consider the need for rules on no-fault liability for AI claims.

If the draft Directive on Liability of Defective Products is adopted, EU Member States will have one year from its entry into force to implement it in their national laws.  The draft Directive would apply to products placed on the market from one year after its entry into force.

Draft EU AI Liability Directive

The draft Directive gives courts the power to order providers or users of high-risk AI systems to disclose (and/or preserve) information about their systems to persons who seek this information to initiate (or decide whether to initiate) redress proceedings against the provider or user.  A court may issue such an order upon the request of (a) a “potential claimant,” who has already requested this information directly from the provider or user but not received it, or (b) a claimant who has initiated proceedings.  The requestor must support the request by presenting “sufficient” facts and evidence showing why the requestor suspects that the high-risk AI system caused the alleged damage(s).

Courts will only order a provider or user to disclose as much information as is necessary and proportionate to support a (potential) claim for damages.  The court will take into account the legitimate interests of all parties, including any trade secrets.  If a disclosure order covers information that constitutes a trade secret which a court has deemed confidential pursuant to the EU Trade Secrets Directive, the court may take the measures necessary to preserve the confidentiality of that information during the proceedings.  If the provider or user does not comply with the court’s order to disclose information, the court may apply a rebuttable presumption that the provider or user failed to comply with the provision(s) of the (draft) AI Act that the requestor alleges were violated.

In addition, the draft Directive sets out a number of circumstances in which a court may presume a (causal) link between (a) the fault of the provider or user of any AI system (whether “high-risk” or not), and (b) the output produced by the AI system or its failure to produce such an output.  For high-risk AI systems, this presumption applies if the claimant has demonstrated the provider or user’s non-compliance with certain obligations under the (draft) AI Act, subject to certain exceptions and restrictions.  For example, the presumption will not apply if the court finds that the claimant can reasonably access sufficient evidence and expertise to prove the causal link.

Draft EU Directive on Liability of Defective Products

The draft EU Directive on Liability of Defective Products is designed to hold manufacturers liable for certain damages caused by a defect in their “products” (which encompass both finished products and components of products).  Under the draft Directive, injured persons would have three years from the moment that they became aware (or should have become aware) of (1) the damage, (2) the defect in question, and (3) the identity of the manufacturer to bring redress proceedings against the manufacturer.  However, they may only bring proceedings in respect of products that were placed on the market, put into service, or substantially modified within the preceding 10 years.  In addition, injured persons must prove the damage, the defect, and the causal relationship between the two.

The draft Directive would also:

  • clarify that the term “products” encompasses software (including AI systems);
  • expand the factors that a court may take into account to determine whether a product is defective;
  • expand the definition of “damages” to cover, for example, harm to, or destruction of, any property regardless of the amount, and loss or corruption of data that is not used exclusively for professional purposes;
  • set out provisions, similar to those in the draft AI Liability Directive, that give courts the power to issue information disclosure orders;
  • identify a number of circumstances when a court can presume a product’s defect, including in case of non-disclosure of information following a court’s order, non-compliance with mandatory safety requirements, or where a claimant faces excessive difficulties to prove a defect or causal link (the manufacturer can challenge this presumption, and there are a number of circumstances listed in the draft Directive that exempt the manufacturer from liability); and
  • clarify that manufacturers are liable for the defectiveness of a product that remains within their control after it has been placed on the market, including where the defect is attributable to software updates or modifications to AI systems.

*                      *                      *

The Covington Team will continue to monitor developments on these proposals, and we are happy to assist clients if they have queries.

