A brief analysis of the interaction between the proposed AI Act and the MDR

The AI Act continues to contain a concerning flaw as regards the testing of any high-risk AI System that also qualifies as a medical device under the Medical Devices Regulation1 (MDR) or the In Vitro Diagnostic Medical Devices Regulation2 (IVDR).

This note focuses on the requirements applicable to general medical devices under the MDR rather than the IVDR. However, the same concepts generally also apply to the approval and testing of in-vitro diagnostic medical devices.

AI Medical Devices

Where a product that qualifies as an AI System also qualifies as a medical device (or where an AI System is intended to be used as a safety component of a medical device), that product qualifies as a high-risk AI System (as well as a medical device). For ease of reference, we shall call such a device an AI Medical Device or AIMD.

Conformity Assessment

AIMD products must undergo a Conformity Assessment against the requirements of both the AI Act and the MDR (or the IVDR)3. The Conformity Assessment is the final step before the CE Mark can be affixed to a device, and it must be completed before the AIMD can be placed on the market or put into service in the EU4.

In the overwhelming majority of cases, the Manufacturer of a medical device must conduct a clinical investigation in respect of the device in order to gather clinical data regarding the safety and performance of the device in real world conditions5.

Conducting a Clinical Investigation of a Medical Device

It is worth noting that clinical investigations of medical devices are tightly regulated. One can only conduct a conventional clinical investigation6 of an investigational device:

(a) after receiving:

(i) an authorisation from a competent authority for medical devices; and

(ii) a positive opinion from a research ethics committee;

(b) in accordance with:

(i) good clinical practice and other applicable laws, such as the GDPR; and

(ii) a formal protocol and a clinical investigation plan; and

(c) with the informed consent of the participants; and

(d) if the sponsor has measures (often insurance) in place to compensate any participant for any damage resulting from participation in the clinical investigation.

Finally, any adverse events must be promptly reported to competent authorities.

A Clinical Investigation of an investigational device does not constitute making available under the MDR

The MDR explicitly allows the Manufacturer to conduct a clinical investigation of a device before the device undergoes its Conformity Assessment. It does so by exempting the use of an investigational device (being a device undergoing a clinical investigation) from the definitions of making available, placing on the market and putting into service.

Unfortunately, the AI Act does not contain equivalent exemptions.

The Challenge

Consequently, the person7 developing a novel AIMD is faced with the following challenge when undertaking clinical testing of that device.

  • First, such clinical testing is effectively mandatory under both the MDR and the IVDR.
  • Second, such clinical testing must be undertaken prior to undergoing a Conformity Assessment, which in turn is conducted to enable the AIMD product to be CE Marked and made available in the EU.
  • Third, such clinical testing is explicitly permitted under the MDR; provided one complies with the various requirements mentioned above.
  • Fourth, such clinical testing would appear to fall foul of the AI Act.

The easiest solution would have been to include in the AI Act a similar exclusion for investigational devices in the definitions of making available, placing on the market or putting into service.

The Provisions of the AI Act as regards Real World Testing

There are some provisions in the AI Act that might enable such testing. However, as discussed below, these are either flawed or problematic.

The AI Act mandates the testing of high-risk AI systems in order to identify appropriate and targeted risk management measures (weighing such measures against the potential benefits and intended goals of the system)8. Testing must be designed to ensure that each high-risk AI system performs consistently for its intended purpose and complies with the statutory requirements9. Such testing must be performed prior to the placing on the market or the putting into service10.

The challenge is that the AI Act does not exclude clinical testing of an AI System from the terms "placing on the market" or "putting into service". Accordingly, the use of a high-risk AI System in a clinical investigation of an investigational device would itself amount to placing that system on the market or putting it into service, and so would not constitute pre-market "testing" of such AI System for these purposes.

The AI Act11 clarifies that "testing in real world conditions shall not be considered as placing the AI System on the market or putting it into service... provided that all conditions under Article 53 or 54a are fulfilled."

The AI Act also exempts from its scope any testing of an AI System conducted before that system is placed on the market or put into service12. That Article then adds that "testing in real world conditions shall not be covered by this exemption".

Taken together, these provisions appear to provide that, when an AI System is tested in real world conditions, such testing will constitute placing on the market or putting into service unless it complies with the regulatory sandbox requirements in Article 53 or the alternative provisions in Article 54a.

Testing in an AI Regulatory Sandbox – Art. 53

The AI Act includes provisions13 that compel each Competent Authority to establish (and resource) at least one AI regulatory sandbox at national level. Each sandbox is intended to be a "controlled environment" maintained by the competent authority for AI that:

"facilitates the development, training, testing and validation of innovative AI systems for a limited time before their placement on the market or putting into service pursuant to a specific sandbox plan agreed between the prospective providers and the competent authority. Such regulatory sandboxes may include testing in real world conditions supervised in the sandbox."

The competent authority must provide an "exit report" on conclusion of the regulatory sandbox study, which may be used to demonstrate compliance with the AI Act during the Conformity Assessment14.

One could argue that testing in a regulatory sandbox is similar to an authorised clinical investigation of an investigational device. As such, in theory, one could ask the AI Competent Authority to allow the clinical investigation for the purposes of the MDR to also serve as the regulatory sandbox study for the purposes of the AI Act. However, these are two fundamentally different concepts and we are sceptical that the two regulatory regimes will allow such a pragmatic solution.

Hence we turn to the next alternative, Article 54a.

Testing of high-risk AI Systems in real world conditions outside an AI regulatory sandbox – Art. 54a

At first blush, Article 54a appears attractive in that it provides for the testing of high-risk AI systems in real world conditions outside AI regulatory sandboxes. This is precisely what a clinical investigation is intended to be, and it would appear to be exactly the kind of safe harbour on which a person conducting a clinical investigation of an AIMD would like to rely. Unfortunately, Article 54a does not appear to extend to AIMD products (which are listed in Annex II – Part I). Rather, it appears to extend only to providers of high-risk AI Systems listed in Annex III15.

Sequencing and Duplication

It would be bizarre if it were necessary to undertake two distinct and different testing processes to demonstrate that an AIMD complies with (a) the MDR (namely a clinical investigation); and (b) the AI Act (namely participation in an AI regulatory sandbox).

Likely outcomes

In our view, unless this issue is promptly resolved, the ambiguity caused by the drafting of the AI Act will likely mean that the clinical testing (whether a clinical investigation, testing in real world conditions, or both) of novel AIMDs is conducted outside the EU.

Defined Terms (MDR, Article 2)

Clinical data (2(48)) means information concerning safety or performance that is generated from the use of a device and is sourced from the following:

  • clinical investigation(s) of the device concerned,
  • clinical investigation(s) or other studies reported in scientific literature, of a device for which equivalence to the device in question can be demonstrated,
  • reports published in peer reviewed scientific literature on other clinical experience of either the device in question or a device for which equivalence to the device in question can be demonstrated,

  • clinically relevant information coming from post-market surveillance, in particular the post-market clinical follow-up.

Clinical Investigation (2(45)) means any systematic investigation involving one or more human subjects, undertaken to assess the safety or performance of a device
Investigational device (2(46)) means a device that is assessed in a clinical investigation
Manufacturer (2(30)) means a natural or legal person who manufactures or fully refurbishes a device or has a device designed, manufactured or fully refurbished, and markets that device under its name or trademark
Making available on the market (2(27)) means any supply of a device, other than an investigational device, for distribution, consumption or use on the Union market in the course of a commercial activity, whether in return for payment or free of charge
Placing on the market (2(28)) means the first making available of a device, other than an investigational device, on the Union market
Putting into service (2(29)) means the stage at which a device, other than an investigational device, has been made available to the final user as being ready for use on the Union market for the first time for its intended purpose
Defined Terms (IVDR, Article 2)

Device for performance study (2(45)) means a device intended by the manufacturer to be used in a performance study. A device intended to be used for research purposes, without any medical objective, shall not be deemed to be a device for performance study;

Making available on the market (2(20)) means any supply of a device, other than a device for performance study, for distribution, consumption or use on the Union market in the course of a commercial activity, whether in return for payment or free of charge;
Manufacturer (2(23)) means a natural or legal person who manufactures or fully refurbishes a device or has a device designed, manufactured or fully refurbished, and markets that device under its name or trade mark;
Performance study (2(42)) means a study undertaken to establish or confirm the analytical or clinical performance of a device;
Placing on the market (2(21)) means the first making available of a device, other than a device for performance study, on the Union market;
Putting into service (2(22)) means the stage at which a device, other than a device for performance study, has been made available to the final user as being ready for use on the Union market for the first time for its intended purpose;


Footnotes

1. Regulation (EU) 2017/745 on medical devices.

2. Regulation (EU) 2017/746 on in vitro diagnostic medical devices.

3. Article 43(3) of the AI Act.

4. Article 16(e) of the AI Act and Article 5(1) of the MDR (or Article 5(1) of the IVDR in respect of an IVD).

5. This arises from the narrow definition of "clinical data" at Article 2(48) of the MDR and the approach adopted by most Notified Bodies. While the text of the IVDR is different, it contains a rebuttable presumption in Section 1.2.3 of Annex XIII of the IVDR that the manufacturer must conduct a performance evaluation of the proposed device: "Clinical performance studies shall be performed unless due justification is provided for relying on other sources of clinical performance data."

6. See Chapter VI of the MDR.

7. The person with primary regulatory responsibility for the product is known as the manufacturer under the MDR (Article 2(30)) and the provider under the AI Act (Article 3(2)).

8. Article 9 of the AI Act.

9. Article 9(5) of the AI Act: "High-risk AI systems shall be tested for the purposes of identifying the most appropriate and targeted risk management measures. Testing shall ensure that high-risk AI systems perform consistently for their intended purpose and they are in compliance with the requirements set out in this Chapter."

10. Article 9(7) of the AI Act.

11. Article 3(bi) of the AI Act. See also Article 9(6) of the AI Act: testing procedures may include testing in real world conditions in accordance with Article 54a.

12. Article 2(5b) of the AI Act.

13. Article 53 of the AI Act.

14. Article 53(1f) of the AI Act.

15. AI Systems in any of the following areas: Biometrics; Critical infrastructure; Education; Employment; Access to and enjoyment of essential private and public services; Law enforcement; Migration; Administration of Justice.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.