On November 21, 2019, the European Commission’s Expert Group on Liability and New Technologies – New Technologies Formation (“NTF”) published its Report on Liability for Artificial Intelligence and other emerging technologies.  The Commission tasked the NTF with establishing the extent to which liability frameworks in the EU will continue to operate effectively in relation to emerging digital technologies (including artificial intelligence, the internet of things, and distributed ledger technologies).  This report presents the NTF’s findings and recommendations.

Generally, liability regimes across EU Member States are not harmonized, although certain sectors (e.g., product liability) are partially harmonized.  Nonetheless, the NTF found that existing regimes do provide basic protection for people who suffer harm caused by emerging digital technologies, and in many cases these frameworks (e.g., product liability for defective IoT devices) function effectively, or at least provide a starting point for assessing liability.

The NTF also found, however, that certain characteristics of new technologies may make it more difficult for victims to make successful claims in particular circumstances, and in others the allocation of liability may be unfair or inefficient.  For example, where a victim is required to prove that damage suffered was caused by some conduct attributable to an AI tool, that may require analysis of complex underlying code, which may be beyond the ability of many victims.

Consequently, the report provides high-level recommendations on how existing liability concepts might be applied, with adjustments, to emerging digital technologies, and does not recommend a complete overhaul of liability rules in the EU.

More specifically, key findings from the NTF include:

  • It is not necessary to give autonomous systems a separate legal personality for liability purposes.  Among other things, doing so would require attributing funds to such a system, effectively capping liability at a level that may be inadequate to meet all compensation claims.
  • In certain circumstances, it may be appropriate to impose strict liability (i.e., liability without any finding of fault) for damage caused by emerging technologies.  However, this should be limited to situations where a technology creates risks comparable to those subject to strict liability under existing frameworks.  This is likely to arise primarily where digital technologies move in public spaces (e.g., drones, autonomous cars).
  • Vicarious liability should be imposed on operators of technologies in line with existing vicarious liability regimes.  For example, a company might be vicariously liable where an autonomous tool causes harm that, if caused by a human employee or auxiliary, would normally give rise to vicarious liability.  The NTF also states that once autonomous technologies outperform humans, the standard expected of those technologies should rise accordingly, and falling below that higher standard would give rise to vicarious liability for the operator.
  • The burden of proof for both causation and damage should, as a general rule, rest on the victim (as is the case under current liability frameworks).  In certain circumstances, however, it may be appropriate to reverse or lower this burden, particularly where it would otherwise be disproportionately difficult to discharge.  In particular, the burden of proof should be reversed where damage was caused by a breach of some other set of rules (e.g., rules on information security).
  • Where multiple people or companies cooperate to provide different elements of a technology (e.g., if they coordinate marketing or have a degree of technical interdependence), they should be jointly and severally liable for harms caused by that system.
  • New duties of care may need to be developed, in particular on operators of technologies to choose appropriate systems to achieve their goals and to monitor the use of those systems, as well as on producers of those technologies to design products that enable operators to comply with their duties and to monitor products after they go into circulation.
  • Insurance may, in the future, need to be mandatory for certain technologies where the potential harm is particularly frequent or severe, and where the operators in question are unlikely to be able to provide redress to victims (e.g., startups with limited capital).

At this stage, the report does not set out how its recommendations should be implemented in EU or national laws.  However, the Expert Group also includes a separate Product Liability Directive Formation (“PLDF”), whose task is to provide expertise on the applicability of Directive 85/374/EEC (the Product Liability Directive) to these new technologies.  Going forward, therefore, further reports from the NTF, or publications from the PLDF, are likely to provide more concrete recommendations as to how, if at all, EU-level laws should be changed to ensure that liability mechanisms are fit for purpose in the context of AI and other developing technologies.

Lisa Peets

Lisa Peets leads the Technology Regulatory and Policy practice in the London office and is a member of the firm’s Management Committee. Lisa divides her time between London and Brussels, and her practice embraces regulatory counsel and legislative advocacy. In this context, she has worked closely with leading multinationals in a number of sectors, including many of the world’s best-known technology companies.

Lisa counsels clients on a range of EU law issues, including data protection and related regimes, copyright, e-commerce and consumer protection, and the rapidly expanding universe of EU rules applicable to existing and emerging technologies. Lisa also routinely advises clients in and outside of the technology sector on trade related matters, including EU trade controls rules.

According to the latest edition of Chambers UK (2022), “Lisa is able to make an incredibly quick legal assessment whereby she perfectly distils the essential matters from the less relevant elements.” “Lisa has subject matter expertise but is also able to think like a generalist and prioritise. She brings a strategic lens to matters.”

Marty Hansen

Martin Hansen has represented some of the world’s leading information technology, telecommunications, and pharmaceutical companies on a broad range of cutting edge international trade, intellectual property, and competition issues. Martin has extensive experience in advising clients on matters arising under the World Trade Organization agreements, treaties administered by the World Intellectual Property Organization, bilateral and regional free trade agreements, and other trade agreements.

Drawing on ten years of experience in Covington’s London and DC offices, his practice focuses on helping innovative companies solve challenges on intellectual property and trade matters before U.S. courts, the U.S. government, and foreign governments and tribunals. Martin also represents software companies and a leading IT trade association on electronic commerce, Internet security, and online liability issues.

Paul Maynard

Paul Maynard is an associate in the technology regulatory group in the London office. He focuses on advising clients on all aspects of UK and European privacy and cybersecurity law relating to complex and innovative technologies such as adtech, cloud computing and online platforms. He also advises clients on how to respond to law enforcement demands, particularly where such demands are made across borders.

Paul advises emerging and established companies in various sectors, including online retail, software and education technology. His practice covers advice on new legislative proposals, for example on e-privacy and cross-border law enforcement access to data; advice on existing but rapidly-changing rules, such as the GDPR and cross-border data transfer rules; and on regulatory investigations in cases of alleged non-compliance, including in relation to online advertising and cybersecurity.