
EU AI Act Legal Deep-Dive Part 1: What Can Medtech Expect, And When?

Executive Summary

The text of the European Commission’s proposed AI Act is expected to enter the EU high-level negotiation stage within weeks, yet many questions remain as to what the final regulation will entail. Lawyers from Cooley LLP tell Medtech Insight why considerable changes to the text are possible, and what the new law is likely to mean for medtech.

Last month, members of the European Parliament voted in favor of substantial changes to the draft AI Act – a key step towards this landmark legislation being adopted in the EU.

The compromise amendments set out significant updates to the original proposal – although many of these changes are not directly relevant to medical device and IVD manufacturers.

They do, however, signal the high level of debate this proposed legislation has provoked and help explain why so many uncertainties remain around how the final text will look and when it might enter into force. (Also see "EU AI Act Amendments Get Green Light From Parliament Committees In Latest Vote" - Medtech Insight, 11 May, 2023.)

The fact that so much of the AI Act does not directly apply to medical devices is itself a challenge for medtech manufacturers; with a scope so vast, the implications of the AI Act for the health sector may not be immediately obvious. In this article, Medtech Insight, in collaboration with expert lawyers at Cooley, outlines the top four AI Act issues that medtech firms must be aware of:

  • What legislative stage the AI Act is currently at and when it might apply;

  • Which medical devices/IVDs are included in the scope of the AI Act and how product risk classifications may differ from MDR/IVDR rules;

  • Overlap with other regulations; and

  • The implications of the AI Act for notified bodies.

These insights come with the important caveat, the Cooley lawyers have said, that there is “considerable scope” for the draft text to change before it is adopted. This article is the first of a two-part series. In the second piece, Cooley’s legal experts explore what the AI Act means for conformity assessments, how the AI Act may affect patient access to innovation, areas for improvement within the proposed text, and how the EU’s approach to governing AI compares with the UK’s light-touch framework.

Key Takeaways

  • The AI Act could be finalized and adopted in mid-to-late 2024, but this is uncertain as final negotiations could take more than a year to conclude.

  • Most medical devices and IVDs that fall within the AI Act scope will be classed as high-risk products, even if they are not high-risk under the MDR/IVDR.

  • Guidance and standards should clarify how notified bodies interpret the AI Act, helping to mitigate the risk of unpredictable conformity assessment outcomes for manufacturers.

When Will The AI Act Apply?

The European Parliament is expected to adopt its position on the AI Act text this month, after which trilogue discussions will commence. Trilogues are the last negotiation stage between the Council of the EU, the parliament and the commission, in which the three institutions firm up the details of draft legislation and eventually agree on a final version.

Cooley lawyers Elizabeth Anne Wright and Edward Turtle told Medtech Insight that, “on balance,” they anticipate trilogue negotiations will conclude by spring 2024, meaning the AI Act’s details could be finalized within a year, although there is no guarantee of this.

Indeed, they said, there is little certainty on when the AI Act will be adopted and enter into force across the EU, as this will depend not only on how long it takes for the commission, council and parliament to reach an agreement on the final text, but also on the length of the transition period agreed for compliance with the new rules.

There have been reports, according to Wright and Turtle, that a lead AI Act rapporteur from the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (LIBE) predicted trilogues could take 18 months to come to an end.

“That is long by normal standards, but the AI Act has proved controversial so far and it took two years for parliament to agree its position internally,” the lawyers commented. “On the other hand, there is considerable desire on the part of the commission and council to reach an agreement faster than this, ideally before the European Parliament elections in the middle of next year.”

If negotiations extend beyond October 2024, when the current commission’s mandate ends, there could be further implications for the AI Act’s adoption, they added.

The transition period in the draft AI Act text, meanwhile, is currently set at two years – something that Wright and Turtle said the parliament also appears to support. However, the Council of the EU’s general approach includes an amendment to extend the transition period to three years.

“Overall, it is therefore possible that we will see the AI Act adopted sometime between mid-to-late 2024. Depending on the related transitional provisions that are agreed, the Act would start to apply between mid-to-late 2026,” they said.

Most AI Medtech Will Be Considered High-Risk

One of the most critical aspects of the AI Act text for medical device companies is its definition of “high-risk” AI systems.

As explained by Cooley lawyers Wright and Turtle, if the commission’s proposal is adopted in its current form, the AI Act “will view all medical devices/diagnostics that fall within its scope as high-risk AI systems.” The exception would be low-risk devices and diagnostics that do not require a notified body conformity assessment under the MDR/IVDR before being placed on the EU market.

As it stands, the AI Act will therefore likely capture only medical devices in class IIa or higher and diagnostics in classes B, C and D, because AI medical devices are typically software and are generally classified as class IIa or higher under Rule 11 of the MDR, Wright and Turtle said.

This means that, “in practice, high-risk classification may become common if the AI Act is adopted in the form proposed by the commission,” they explained.
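To make the proposed classification rule more concrete, the short sketch below encodes the logic described above – an AI-based device or IVD in scope of the Act would be treated as high-risk whenever the MDR/IVDR requires notified body involvement – as a simple decision function. This is a minimal illustration of the draft rule as summarized by Wright and Turtle, not a legal tool; the function name, class groupings and handling of edge cases are simplified assumptions.

```python
# Minimal, illustrative sketch of the high-risk rule in the commission's draft,
# as described above: an AI-based device/IVD in scope of the AI Act is treated
# as high-risk whenever the MDR/IVDR requires a notified body conformity
# assessment. Class groupings are simplified assumptions, not legal advice
# (e.g. class I sterile/measuring device edge cases are ignored here).

MDR_CLASSES_NEEDING_NOTIFIED_BODY = {"IIa", "IIb", "III"}
IVDR_CLASSES_NEEDING_NOTIFIED_BODY = {"B", "C", "D"}


def is_high_risk_under_draft_ai_act(regime: str, risk_class: str, in_ai_act_scope: bool) -> bool:
    """Return True if a device would likely be classed as high-risk under the draft AI Act."""
    if not in_ai_act_scope:
        return False
    if regime == "MDR":
        return risk_class in MDR_CLASSES_NEEDING_NOTIFIED_BODY
    if regime == "IVDR":
        return risk_class in IVDR_CLASSES_NEEDING_NOTIFIED_BODY
    raise ValueError(f"Unknown regime: {regime}")


# Example: AI software classed as IIa under MDR Rule 11 -> high-risk under the draft Act
print(is_high_risk_under_draft_ai_act("MDR", "IIa", in_ai_act_scope=True))  # True
```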

Overlap With MDR/IVDR

Wright and Turtle confirmed that medical devices and IVDs that fall within the scope of both the AI Act and the MDR/IVDR will need to comply with both regimes, but said it is “currently unclear” just how challenging it will be for manufacturers to demonstrate compliance with the AI Act in practice. Considerable overlap is expected between the MDR/IVDR and the AI Act in some areas, they noted, such as in implementing quality and risk-management systems.

“This concern has been raised by a number of stakeholders,” Wright and Turtle noted, and the Council of the EU sought to make amendments to address these concerns in its common position, adopted in December 2022.

“However, most stakeholders consider that the Council of the EU proposals did not fully address these concerns. We will need to wait to see how this is addressed in the agreed text after trilogue negotiations to be sure of the approach,” they noted.

Trade body MedTech Europe was one such stakeholder that said the council’s common position did not sufficiently address such concerns. (Also see "Medtech Industry Says Latest EU AI Act Revisions Insufficient As Questions Remain" - Medtech Insight, 9 Dec, 2022.)

More Regulatory Overlap

Adding to the insights provided by Wright and Turtle, Cooley’s Patrick Van Eecke and Daniel Millard reflected on how the AI Act may interact with other horizontal EU legislation, such as the General Data Protection Regulation (GDPR).

“There is certainly significant overlap between the AI Act and other horizontal regulations at EU level. However, the AI Act is designed to complement those regulations and create harmonized rules applicable to the design, development, and use of AI systems,” Van Eecke and Millard observed.

Elizabeth Anne Wright is a partner at Cooley who specializes in EU medical device and pharma regulation.
Edward Turtle is a Cooley products lawyer, regulatory advisor and litigator with experience in the technology, healthcare and consumer product sectors.
Patrick Van Eecke is a partner at Cooley who co-chairs the law firm’s global cyber/data/privacy practice.
Daniel Millard is a Cooley associate lawyer who specializes in transactional, regulatory and compliance challenges relevant to data privacy and security.

For example, they said, the AI Act “explicitly states” that rules relating to transparency should be used to aid compliance with GDPR obligations around data protection impact assessments, in the case of high-risk AI systems.

“Further, the Act makes clear that participation in the AI regulatory sandbox can provide a legal basis for processing personal data under the GDPR if certain conditions in the Act are met – including, for example, where AI systems are developed for public health purposes,” Van Eecke and Millard said.

They added that the “complementarity of the AI Act and the EU’s broader digital strategy is central to ensuring legal certainty and the development of an ecosystem of trust in AI in Europe.”

What About Notified Bodies?

Given that medtech companies have reported difficulties predicting notified bodies’ clinical evidence expectations under the MDR/IVDR, particularly for high-risk products, Medtech Insight asked whether the same problem might arise during conformity assessments under the AI Act.

Wright and Turtle said that although the commission has taken a “highly regulated” approach with the AI Act, “much of the detail will still be amplified by standards and guidance.”

The proposal "includes a role for the commission to be tasked with creating guidance on the application of the AI Act and for standards to be developed for all high-risk AI systems to allow for implementation of the requirements imposed by the Act.”

This approach is modeled on the existing EU regulatory regime governing product safety, the lawyers continued, which relies on harmonized standards to demonstrate conformity with requirements set out in framework legislation.

“This approach is not perfect, and there is always scope for different interpretations of regulatory requirements and underlying standards. However, it is a proven approach that is commonly considered to strike a good balance between efficiency of legislation, practical clarity around granular requirements and flexibility to respond to change through updates to standards,” Wright and Turtle contended.

“So, in principle, this may be an appropriate approach for AI technologies; albeit that, inevitably, given the pace of change in AI, the ability for the framework and standards to respond to developments will be tested.”
