The document MDCG 2025-6 is a joint paper by the European Artificial Intelligence Board (AIB) and the Medical Device Coordination Group (MDCG). It is an FAQ aimed primarily at manufacturers of medical devices incorporating artificial intelligence (MDAI) and addresses the interplay between the Medical Devices Regulation (MDR), the In Vitro Diagnostic Medical Devices Regulation (IVDR) and the Artificial Intelligence Act (AI Act).
The aim is to provide guidance to manufacturers, notified bodies and competent authorities on the simultaneous application of these regulations, in particular with regard to high-risk AI systems in medical devices.
What it’s all about
Manufacturers of AI-based medical devices will have to comply with two complementary sets of rules:
- MDR/IVDR, which regulate medical safety and clinical effectiveness, and
- AI Act, which imposes additional requirements to ensure fundamental rights, transparency, data quality and risk management in AI systems.
When is an MDAI affected?
As soon as an AI system is a medical device, or part of one, and must undergo conformity assessment by a notified body under the MDR/IVDR, it is also classified as a high-risk AI system under the AI Act.
MDAI developed and used in-house by healthcare institutions are exempt from this classification, but must still meet certain requirements of the AI Act.
What does this mean for manufacturers?
- Quality and risk management
  The AI Act prescribes a specific quality management system (QMS) for high-risk AI systems; this can be integrated into the existing MDR/IVDR QMS. Risk management must also cover AI-specific risks such as bias, data distortion or unpredictable system behaviour.
- Data requirements
  AI training, validation and test data must be of high quality, traceable and representative of the target population. Bias prevention and monitoring become mandatory.
- Transparency & oversight
  Users must be able to recognise that they are interacting with an AI system; explainability of how it works becomes a key criterion. Human oversight is mandatory: the system must not be able to override human control.
- Technical documentation
  Consistent technical documentation covering both the MDR/IVDR and the AI Act is expected. In addition to clinical evidence, AI-specific evidence (e.g. performance tests, risk assessments) must also be documented.
- Post-market monitoring & changes
  Manufacturers must develop integrated monitoring plans, also covering AI interactions with other systems.
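To make the bias-monitoring obligation above more concrete, here is a minimal sketch of one possible check: comparing a model's sensitivity across patient subgroups on a test set and flagging groups that fall too far behind. The subgroup names, counts and the 5-percentage-point threshold are illustrative assumptions, not requirements from the regulations or from MDCG 2025-6.

```python
# Hypothetical bias check: flag patient subgroups whose sensitivity
# (true-positive rate) trails the best subgroup by more than a set margin.
# All names and thresholds below are illustrative only.

def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate for one subgroup; 0.0 if no positives."""
    return tp / (tp + fn) if (tp + fn) else 0.0

def flag_bias(results: dict[str, tuple[int, int]], max_gap: float = 0.05) -> list[str]:
    """Return subgroups whose sensitivity lags the best subgroup by more than max_gap."""
    rates = {group: sensitivity(tp, fn) for group, (tp, fn) in results.items()}
    best = max(rates.values())
    return [group for group, rate in rates.items() if best - rate > max_gap]

# Illustrative test-set counts per subgroup: (true positives, false negatives)
test_counts = {
    "age_18_40": (45, 5),     # sensitivity 0.90
    "age_41_65": (88, 12),    # sensitivity 0.88, within margin
    "age_over_65": (30, 20),  # sensitivity 0.60, flagged
}

print(flag_bias(test_counts))  # prints ['age_over_65']
```

Such a check would be only one building block of the AI-specific risk management the paper describes; in practice the subgroups, metrics and acceptance thresholds would have to be justified clinically and documented.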
What does the interplay between the MDR/IVDR and the AI Act mean for manufacturers?
- Two sets of rules, one goal: safety, ethics and trust in AI-based medical devices.
- Compliance is becoming multidimensional. Companies must adapt to an integrated, regulated lifecycle approach, from development and data responsibility to ongoing market surveillance.
- Acting now means building up processes, documentation and teams in good time to ensure compliance with the AI Act. Only those who address regulatory issues early on will be able to bring MDAI products to market safely, economically and in compliance with the law.
We are happy to support you in this endeavour. Get in touch with us.