On April 22, 2024, the European Federation of Pharmaceutical Industries and Associations (“EFPIA”) issued a statement on the application of the AI Act in the medicinal product lifecycle. The EFPIA statement highlights that AI applications are likely to play an increasing role in the development and manufacture of medicines.  As drug development is already governed by a longstanding and detailed EU regulatory framework, EFPIA stresses that care should be taken to ensure that any rules on the use of AI are fit-for-purpose, adequately tailored, risk-based, and do not duplicate existing rules.  The statement sets forth five “considerations”:

1. R&D AI qualifies under the AI Act’s research exemption

The AI Act does not apply to AI systems and models developed and put into service solely for scientific research purposes.  Accordingly, the exemption should encompass AI-based drug development tools used in research and development, as that is their sole use. 

2. Other R&D AI generally not “high risk”

AI systems and models used in the research and development of medicines that do not fall under the exemption should not be considered high-risk AI, as they generally do not satisfy the criteria for “high risk” AI systems set forth in Article 6 of the AI Act.

3. No need for additional regulation of R&D AI

The development of medicines in Europe is already subject to an intricate set of very detailed rules and regulations.  This regulatory system should suffice to also address the use of AI in the development of medicines, without the need for additional regulation.

4. The European Medicines Agency’s (EMA) expected guidance is welcome

EFPIA welcomes the EMA’s efforts to assess the impact of AI in R&D, such as its consultation on a draft reflection paper and multi-annual work plan, and its emphasis on a “risk-based” approach.  This existing regulatory framework should be able to address any concerns related to AI in the development of medicines.

5. R&D AI governance should be calibrated to its context

Finally, the EFPIA statement points out that AI regulation should remain flexible in order to keep pace with technological development, but should also be able to adapt to the different contexts in which it is applied, including the relevant stage of a product’s development, its impact on the risk-benefit analysis of a medicine and the applicable level of human oversight.  Collaboration among all stakeholders concerned should help to ensure that the potential of AI can be unlocked while respecting fundamental rights, safety and ethical principles.

Dan Cooper

Daniel Cooper is co-chair of Covington’s Data Privacy and Cyber Security Practice, and advises clients on information technology regulatory and policy issues, particularly data protection, consumer protection, AI, and data security matters. He has over 20 years of experience in the field, representing clients in regulatory proceedings before privacy authorities in Europe and counseling them on their global compliance and government affairs strategies. Dan regularly lectures on the topic, and was instrumental in drafting the privacy standards applied in professional sport.

According to Chambers UK, his “level of expertise is second to none, but it’s also equally paired with a keen understanding of our business and direction.” It was noted that “he is very good at calibrating and helping to gauge risk.”

Dan is qualified to practice law in the United States, the United Kingdom, Ireland and Belgium. He has also been appointed to the advisory and expert boards of privacy NGOs and agencies, such as the IAPP’s European Advisory Board, Privacy International and the European security agency, ENISA.

Kristof Van Quathem

Kristof Van Quathem advises clients on information technology matters and policy, with a focus on data protection, cybercrime and various EU data-related initiatives, such as the Data Act, the AI Act and EHDS.

Kristof has been specializing in this area for over twenty years and has developed particular experience in the life sciences and information technology sectors. He counsels clients on government affairs strategies concerning EU lawmaking and their compliance with applicable regulatory frameworks, and has represented clients in non-contentious and contentious matters before data protection authorities, national courts and the Court of Justice of the EU.

Kristof is admitted to practice in Belgium.

Sarah Cowlishaw

Advising clients on a broad range of life sciences matters, Sarah Cowlishaw supports innovative pharmaceutical, biotech, medical device, diagnostic and technology companies on regulatory, compliance, transactional, and legislative matters.

Sarah is a partner in London and Dublin practicing in the areas of EU, UK and Irish life sciences law. She has particular expertise in medical devices and diagnostics, and in advising on legal issues presented by digital health technologies, helping companies navigate regulatory frameworks while managing the challenges that arise when technological change outpaces legislative developments.

Sarah is a co-chair of Covington’s multidisciplinary Digital Health Initiative, which brings together the firm’s considerable resources across the broad array of legal, regulatory, commercial, and policy issues relating to the development and exploitation of digital health products and services.

Sarah regularly advises on:

  • obligations under the EU Medical Devices Regulation and In Vitro Diagnostics Medical Devices Regulation, including associated transition issues, and UK-specific considerations caused by Brexit;
  • medical device CE and UKCA marking, quality systems, device vigilance and rules governing clinical investigations and performance evaluations of medical devices and in vitro diagnostics;
  • borderline classification determinations for software medical devices;
  • legal issues presented by digital health technologies including artificial intelligence;
  • general regulatory matters for the pharma and device industry, including borderline determinations, adverse event and other reporting obligations, manufacturing controls, and labeling and promotion;
  • the full range of agreements that span the product life-cycle in the life sciences sector, including collaborations and other strategic agreements, clinical trial agreements, and manufacturing and supply agreements; and
  • regulatory and commercial due diligence for life sciences transactions.

Sarah has been recognized as one of the UK’s Rising Stars by Law.com (2021), which lists 25 up-and-coming female lawyers in the UK. She was named among the Hot 100 by The Lawyer (2020) and was included in the 50 Movers & Shakers in BioBusiness 2019 for advancing legal thinking for digital health.

Sarah is also Graduate Recruitment Partner for Covington’s London office.