On November 21, 2019, the European Commission’s Expert Group on Liability and New Technologies – New Technologies Formation (“NTF”) published its Report on Liability for Artificial Intelligence and other emerging technologies.  The Commission tasked the NTF with establishing the extent to which liability frameworks in the EU will continue to operate effectively in relation to emerging digital technologies (including artificial intelligence, the internet of things, and distributed ledger technologies).  This report presents the NTF’s findings and recommendations.

Liability regimes across EU Member States are generally not harmonized, although some sectors (e.g., product liability) are partially harmonized.  Nonetheless, the NTF found that existing regimes do provide basic protection for people who suffer harm caused by emerging digital technologies, and that in many cases these frameworks (e.g., product liability for defective IoT devices) function effectively, or at least provide a starting point for assessing liability.

The NTF also found, however, that certain characteristics of new technologies may make it more difficult for victims to bring successful claims in particular circumstances, and that in others the allocation of liability may be unfair or inefficient.  For example, where a victim is required to prove that the damage suffered was caused by conduct attributable to an AI tool, doing so may require analysis of complex underlying code, which is beyond the ability of many victims.

Consequently, the report provides high-level recommendations on how existing liability concepts might be applied, with adjustments, to emerging digital technologies, and does not recommend a complete overhaul of liability rules in the EU.

More specifically, key findings from the NTF include:

  • It is not necessary to give autonomous systems a separate legal personality for liability purposes.  Among other things, legal personality would require allocating funds to such a system, effectively capping liability at a level that may be inadequate to meet all compensation claims.
  • In certain circumstances, it may be appropriate to impose strict liability for damage caused by emerging technologies (i.e., liability without any finding of fault).  However, this should be limited to situations where the technology creates risks comparable to those subject to strict liability under existing frameworks.  This is likely to arise primarily where digital technologies move in public spaces (e.g., drones, autonomous cars).
  • Vicarious liability should be imposed on operators of technologies in line with existing vicarious liability regimes.  For example, a company might be vicariously liable where an autonomous tool causes harm that, if caused by a human employee or auxiliary, would normally give rise to vicarious liability.  The NTF also states that once autonomous technologies outperform humans, the standard expected of those technologies should rise accordingly, and falling below that higher standard would trigger vicarious liability for the operator.
  • The burden of proof for both causation and damage should, as a general rule, be on the victim (as is the case under current liability frameworks).  In certain circumstances, however, it may be appropriate to reverse or lower this burden, particularly where such a burden would be disproportionately high.  In particular, the burden of proof should be reversed if damage was caused by a breach of some other set of rules (e.g., rules on information security).
  • Where multiple people or companies cooperate to provide different elements of a technology (e.g., if they coordinate marketing or have a degree of technical interdependence), they should be jointly and severally liable for harms caused by that technology.
  • New duties of care may need to be developed, in particular on operators of technologies to choose appropriate systems to achieve their goals and to monitor the use of those systems, as well as on producers of those technologies to design products that enable operators to comply with their duties and to monitor products after they go into circulation.
  • Insurance may, in the future, need to be mandatory for certain technologies where the potential harm is more frequent or more severe, and where the operators in question are unlikely to be able to provide redress to victims (e.g., startups with limited capital).

At this stage, the report does not specify how these recommendations should be implemented into EU or national laws.  However, the Expert Group also includes a separate Product Liability Directive Formation (“PLDF”), whose task is to provide expertise on the applicability of Directive 85/374/EEC (the Product Liability Directive) to these new technologies.  Going forward, therefore, further reports from the NTF, or publications from the PLDF, are likely to provide more concrete recommendations as to how, if at all, laws at the EU level should be changed to ensure that liability mechanisms are fit for purpose in the context of AI and other developing technologies.

Lisa Peets

Lisa Peets is co-chair of the firm’s Technology and Communications Regulation Practice Group and a member of the firm’s global Management Committee. Lisa divides her time between London and Brussels, and her practice embraces regulatory compliance and investigations alongside legislative advocacy. In this context, she has worked closely with many of the world’s best-known technology companies.

Lisa counsels clients on a range of EU and UK legal frameworks affecting technology providers, including data protection, content moderation, platform regulation, copyright, e-commerce and consumer protection, and the rapidly expanding universe of additional rules applicable to technology, data and online services. Lisa also routinely advises clients in and outside of the technology sector on trade related matters, including EU trade controls rules.

According to Chambers UK (2024 edition), “Lisa provides an excellent service and familiarity with client needs.”

Marty Hansen

Martin Hansen has over two decades of experience representing some of the world’s leading innovative companies in the internet, IT, e-commerce, and life sciences sectors on a broad range of regulatory, intellectual property, and competition issues. Martin has extensive experience in advising clients on matters arising under EU and U.S. law, UK law, the World Trade Organization agreements, and other trade agreements.

Paul Maynard

Paul Maynard is special counsel in the technology regulatory group in the London office. He focuses on advising clients on all aspects of UK and European privacy and cybersecurity law relating to complex and innovative technologies such as adtech, cloud computing and online platforms. He also advises clients on how to respond to law enforcement demands, particularly where such demands are made across borders.

Paul advises emerging and established companies in various sectors, including online retail, software and education technology.  His practice covers advice on new legislative proposals, for example on e-privacy and cross-border law enforcement access to data; advice on existing but rapidly changing rules, such as the GDPR and cross-border data transfer rules; and regulatory investigations in cases of alleged non-compliance, including in relation to online advertising and cybersecurity.