EU AI Act (2024/1689) and EU MDR (2017/745): Breaking the Expensive Myth: Why AI-Powered Medical Devices Under EU MDR Don’t Need EU AI Act Certification – A Detailed Analysis of Regulatory Requirements


1. Introduction: A Looming Compliance Nightmare or a Simple Fix?

With Artificial Intelligence (AI) reshaping healthcare, regulators are racing to keep up. The EU AI Act (2024/1689) introduces a high-risk classification for AI systems, imposing stringent compliance measures on AI applications, particularly those used in healthcare. Meanwhile, the EU MDR (2017/745) and IVDR (2017/746) already establish a robust regulatory framework for AI-driven Software as a Medical Device (SaMD), including risk management, cybersecurity, transparency, and human oversight.


This raises a critical question for medical device manufacturers: Are we required to go through a second regulatory certification process under the AI Act, despite already being certified under the MDR? If the answer is yes, this could cripple AI innovation in Europe, discouraging startups and established manufacturers from developing AI-based healthcare solutions. The fear of unnecessary bureaucratic hurdles is very real, but as we will see, a deeper look into the AI Act itself suggests that double certification is not necessary at all.


2. What the EU AI Act Really Says About AI in Medical Devices

The EU AI Act categorizes AI systems based on risk, and AI-driven medical software subject to third-party conformity assessment under the MDR is classified as high-risk under Article 6(1) in conjunction with Annex I. High-risk AI systems must meet strict requirements, including:

  • Robust Risk Management Frameworks

  • High Standards for Data Governance and Bias Mitigation

  • Mandatory Transparency and Documentation

  • Human Oversight for Safety and Accountability

  • Cybersecurity and Resilience to Adversarial Attacks


At first glance, these AI-specific requirements seem distinct, but when we compare them to EU MDR certification requirements, a striking overlap emerges. The MDR already imposes nearly identical obligations, typically demonstrated through ISO 13485 (quality management), ISO 14971 (risk management), IEC 62304 (software life cycle processes), and ISO/IEC 27001 (information security management).

If a medical AI device is MDR-certified as Class IIa or higher, it already addresses every major requirement of the AI Act’s high-risk classification. The AI Act itself acknowledges this in Article 43(3): for high-risk AI systems covered by existing sectoral legislation such as the MDR, the provider follows the conformity assessment procedure of that legislation, and the AI Act requirements are assessed as part of it. This is a game-changer: it means AI-based SaMD does not require a separate AI Act certification procedure as long as it is MDR-certified.


[Screenshot: EU AI Act (2024/1689), Article 43]



3. DG SANTE and Notified Bodies: Even Regulators Agree That Double Certification Is Not Required

Manufacturers aren’t the only ones frustrated by the potential for regulatory duplication. EU regulators and notified bodies have acknowledged the issue and are working to prevent unnecessary certification loops. In response to the author’s inquiries #3628176 and #4578176, DG SANTE (the European Commission’s Directorate-General for Health and Food Safety) has confirmed that a regulatory alignment process is underway.


In its responses to those two inquiries, DG SANTE stated that MDR-certified medical AI devices will not require an additional AI Act certification.


The Medical Device Coordination Group (MDCG) is even developing official guidance to clarify this interplay, expected to be published in early 2025. The European Commission has also noted that MDR Notified Bodies are likely to be designated as AI Act Notified Bodies, further streamlining compliance and reducing redundant oversight.


Notified Bodies such as TÜV Rheinland (TRLP), one of the largest medical device certifiers in Europe, have echoed the same viewpoint.


In a business review made available to the author, TÜV Rheinland explicitly states that AI-based medical devices assessed under the MDR do not require additional AI Act certification, because the MDR conformity assessment already covers the necessary risk controls.


If both the European Commission and Notified Bodies agree that double certification is unnecessary, why should manufacturers be forced into an expensive, redundant process?



4. How MDR Already Satisfies AI Act Compliance Requirements: A Side-by-Side Comparison

To further emphasize why double certification is unnecessary, let’s break down AI Act requirements vs. MDR compliance obligations.


| Requirement | EU MDR (2017/745) | EU AI Act (2024/1689) | Overlap? |
| --- | --- | --- | --- |
| Risk Management | Required (ISO 14971) | Required | Yes |
| Cybersecurity | Required (ISO/IEC 27001, IEC 62304) | Required | Yes |
| Transparency & Documentation | Required (Clinical Evaluation, PMS, ISO 13485 QMS) | Required | Yes |
| Human Oversight | Required (Usability Engineering) | Required | Yes |
| Robustness & Accuracy | Required (Clinical Validation) | Required | Yes |

This side-by-side comparison shows that MDR-certified AI-based SaMD already meets the substance of the AI Act’s high-risk requirements. Enforcing a separate AI Act certification would be redundant, costly, and harmful to the med-tech industry.
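For manufacturers who want to make this overlap auditable in their own technical documentation, a simple cross-reference matrix can help. The following Python sketch is purely illustrative: the requirement labels, evidence strings, and the gaps helper are the author's hypothetical examples under the assumptions of the table above, not an official mapping or a prescribed format.

```python
# Illustrative cross-reference: AI Act high-risk requirement topics mapped to
# MDR evidence already present in the technical documentation.
# All names and evidence strings below are hypothetical examples.

from dataclasses import dataclass


@dataclass
class RequirementMapping:
    ai_act_requirement: str    # AI Act high-risk requirement topic
    mdr_evidence: list[str]    # existing MDR/standards artefacts covering it


COMPLIANCE_MATRIX = [
    RequirementMapping("Risk management",
                       ["ISO 14971 risk management file"]),
    RequirementMapping("Cybersecurity & resilience",
                       ["ISO/IEC 27001 ISMS records", "IEC 62304 software file"]),
    RequirementMapping("Transparency & documentation",
                       ["Clinical evaluation report", "PMS plan", "ISO 13485 QMS"]),
    RequirementMapping("Human oversight",
                       ["IEC 62366-1 usability engineering file"]),
    RequirementMapping("Robustness & accuracy",
                       ["Clinical validation report"]),
]


def gaps(matrix: list[RequirementMapping]) -> list[str]:
    """Return AI Act topics with no MDR evidence recorded yet."""
    return [m.ai_act_requirement for m in matrix if not m.mdr_evidence]


if __name__ == "__main__":
    for m in COMPLIANCE_MATRIX:
        print(f"{m.ai_act_requirement}: covered by {', '.join(m.mdr_evidence)}")
    print("Open gaps:", gaps(COMPLIANCE_MATRIX) or "none")
```

Populated from a real technical file, such a matrix makes it straightforward to show a notified body that every AI Act topic is already evidenced by existing MDR documentation, and to flag any genuine gaps before an audit.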



5. The Cost of Unnecessary Double Certification: Delays, Innovation Blockades, and Increased Costs

If regulators fail to streamline AI Act certification for medical devices, the consequences could be devastating:

  • AI-Powered Healthcare Innovation Would Stall – Companies may shift focus away from Europe to faster, less bureaucratic markets.

  • Delays in Bringing Life-Saving AI Solutions to Patients – Every extra layer of compliance means slower regulatory approval, potentially delaying AI-driven diagnostics and treatments.

  • Significant Compliance Costs – AI-based SaMD manufacturers would be forced to hire separate regulatory teams, undergo additional audits, and submit redundant technical documentation, increasing costs without adding safety benefits.

6. Conclusion: Duplicate Certification Is Not Required

The EU AI Act was designed to regulate AI across all industries, but AI-powered medical devices are already regulated as Software as a Medical Device under EU MDR 2017/745, IVDR 2017/746, and MDCG 2019-11, a framework that already provides strict, sector-specific safeguards. Forcing manufacturers into a second certification process would be unnecessary and counterproductive, and, as the evidence above shows, it is not required.


With Article 43(3) of the AI Act, support from DG SANTE, and confirmation from Notified Bodies like TRLP, the case is clear: MDR-certified AI devices should not require separate AI Act certification.


By officially clarifying this position, the EU can support AI innovation while maintaining rigorous safety and compliance standards, ensuring that Europe remains a leader in AI-driven healthcare.



References

  • European Commission. (2017). Medical Device Regulation (EU MDR 2017/745).

  • European Commission. (2017). In Vitro Diagnostics Regulation (EU IVDR 2017/746).

  • European Commission. (2024). Artificial Intelligence Regulation (EU AI Act 2024/1689).

  • Medical Device Coordination Group (MDCG). (2019). MDCG 2019-11: Guidance on Qualification and Classification of Software in Regulation (EU) 2017/745 – MDR and Regulation (EU) 2017/746 – IVDR.

  • Europe Direct Contact Centre. Responses from DG SANTE to inquiries #3628176 and #4578176.


  • Food and Drug Administration (FDA). (2017). Digital Health Innovation Action Plan. FDA.gov.

  • Food and Drug Administration (FDA). (2019). Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD). FDA.gov.

  • International Medical Device Regulators Forum (IMDRF). (2013). Software as a Medical Device (SaMD): Key Definitions. [Online]. Available at: http://www.imdrf.org

  • Medizinprodukte-Durchführungsgesetz (MPDG). German Medical Devices Implementation Act, implementing EU MDR in Germany.

  • ISO 13485:2016. International standard for quality management systems in medical devices.

  • ISO 14971:2019. International standard for risk management in medical devices.

  • ISO/IEC 27001:2022. Information security management systems.

  • IEC 62304:2006. Medical device software — Software life cycle processes.


  • TÜV Rheinland. Business Review Report (confidential; made available to the author).



Disclaimer

I am releasing this paper without the exact details of the steps required either to register an AI system as SaMD or to integrate it as non-SaMD into existing hospital systems, as those details are subject to active NDAs. Please anticipate errors and limitations, as the regulatory landscape is evolving quickly.

I welcome feedback and evaluations of the content.

THIS PAPER IS NOT LEGAL ADVICE in any form.

Specific references cannot be provided.


Author Contributions

Conceptualization, methodology, evaluation, and analysis were carried out by the author. A local, specifically trained Large Language Model (trained on the regulations of the US, the EU, Germany, and Australia for SaMD, AI, and hospitals) was used to generate the abstract and original draft. The author read, reviewed, and approved the final manuscript.


Declaration of Interests

The author is an expert witness to all German courts and to international courts for medical devices and in-vitro diagnostics, and has consulted for companies and hospitals on software and AI integration, either into EHR software or as SaMD.






© 2024 Rudolf Wagner