AI used as a safety component in a medical device regulated under the Medical Devices Regulation (MDR, Reg 2017/745) or the In Vitro Diagnostics Regulation (IVDR, Reg 2017/746) is high-risk under Article 6(1) of the EU AI Act. This is a different — and more demanding — pathway than Annex III. Conformity assessment follows the MDR/IVDR notified body route, with AI Act Chapter III Section 2 requirements added on top. COM(2025) 836 proposes extending the compliance deadline to 2 August 2028.
Art 6(1) classification: full Chapter III obligations from 2 August 2027
Violations of high-risk AI obligations under Chapter III carry the Art 99(4) penalty: up to €15 million or 3% of global annual turnover, whichever is higher. Non-compliance with the MDR/IVDR conformity assessment procedure — including its AI Act dimensions — also constitutes a breach of those sectoral regulations with separate enforcement consequences.
Is your medical AI high-risk under Art 6(1) or Annex III?
Regumatrix maps your system description against all classification rules — including the Art 6(1) Annex I pathway — and identifies your exact conformity route, obligation chain, and fine exposure in a cited 8-section report in about 30 seconds.
Check in 30 seconds — 3 free analyses

Most EU AI Act high-risk content focuses on Art 6(2) and Annex III — the eight sector-specific domains. But Article 6(1) is a separate, parallel pathway for AI embedded in products that already undergo mandatory third-party conformity assessment under EU product safety law.
Article 6(1) — HIGH-RISK (two conditions must both be met)
(a) The AI system is intended to be used as a safety component of a product — or is itself a product — covered by Union harmonisation legislation listed in Annex I of the EU AI Act; AND
(b) that product is required to undergo a third-party conformity assessment under that Annex I legislation.
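The two-limb test above is a simple conjunction, which can be sketched as a boolean check. This is illustrative only; the function and parameter names are our own, not terms from the Regulation:

```python
def is_high_risk_art6_1(is_safety_component_or_product: bool,
                        covered_by_annex_i: bool,
                        requires_third_party_assessment: bool) -> bool:
    """Illustrative sketch of the Art 6(1) two-condition test.

    Both limbs must hold: (a) the AI system is a safety component of,
    or is itself, a product covered by Annex I harmonisation
    legislation; AND (b) that product must undergo third-party
    conformity assessment under that legislation.
    """
    condition_a = is_safety_component_or_product and covered_by_annex_i
    condition_b = requires_third_party_assessment
    return condition_a and condition_b

# A Class IIa SaMD under the MDR (notified body assessment required):
print(is_high_risk_art6_1(True, True, True))   # True
# A self-certified Class I device with no notified body involvement:
print(is_high_risk_art6_1(True, True, False))  # False
```

The second example is why device classification matters: if no notified body is involved under the MDR/IVDR, limb (b) fails and Art 6(1) does not apply, though Annex III still might.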
Relevant Annex I entries for medical AI
In scope — typical SaMD examples
Outside Art 6(1) scope — may still trigger Annex III
Note: AI systems that fail the Art 6(1) test — because they are not safety components, or because the device requires no third-party assessment — may still be high-risk under Annex III point 5: point 5(a) covers AI used to evaluate eligibility for essential services, including healthcare, and point 5(d) covers emergency call evaluation and patient triage systems. Always check both pathways.
Art 43(3) establishes the key rule: for AI systems covered by Annex I Section A legislation (MDR, IVDR, and others), the provider must follow the conformity assessment procedure required under that sectoral legislation — not the AI Act’s own internal control or Annex VII procedure. The MDR/IVDR notified body assessment therefore incorporates the AI Act requirements.
What Art 43(3) requires
Art 8(2) — documentation integration (reduces duplication)
Where a product contains an AI system subject to both Annex I harmonisation legislation (MDR/IVDR) and AI Act requirements, providers may integrate AI Act testing, reporting, and documentation into their existing MDR/IVDR technical file and procedures. You do not need a parallel, separate AI Act documentation package — incorporate it into the technical file you already maintain.
Omnibus clarification: COM(2025) 836 proposes a revised Art 43(3) that also explicitly allows MDR/IVDR notified bodies to assess quality management systems under the AI Act as part of the MDR/IVDR QMS audit. Where a system is covered by both Annex I and Annex III, the Annex I (MDR/IVDR) procedure takes precedence.
Whether assessed via MDR, IVDR, or both, the provider of a high-risk medical AI must satisfy all Chapter III Section 2 obligations before market placement.
Risk Management System
Establish and maintain a risk management system throughout the AI system's lifecycle. For medical device AI, this integrates with the MDR/IVDR risk management requirements under ISO 14971. Document all foreseeable risks to health and safety, including failure modes specific to AI models (dataset shift, distributional mismatch, adversarial inputs).
Data Governance
Training, validation, and test data must be relevant, representative, and free from harmful bias. For medical AI, this means demonstrating performance across patient demographics (age, sex, ethnicity, comorbidities). Medical imaging AI must show performance is not degraded by differences in scanner models or acquisition protocols.
Technical Documentation (Annex IV)
Comprehensive AI-specific documentation including system design, architecture, intended purpose, known limitations, and validation performance. This can be integrated into the MDR/IVDR technical file under Art 8(2). Must document the AI model, training methodology, and post-market monitoring plan.
Record-Keeping & Logging
Automatic event logging throughout the period the AI system is on the market — with logs retained at least as long as the retention period required under applicable MDR/IVDR rules. Logs must allow reconstruction of clinical recommendations and identify the input data used.
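One way to meet the reconstruction requirement is an append-only, per-inference audit record. The schema below is a hypothetical illustration (field names are our own, not mandated by the AI Act or MDR/IVDR); the point is that each record ties a recommendation to its inputs, model version, and reviewing clinician:

```python
import json
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone

@dataclass
class InferenceLogEntry:
    """One audit-log record per AI inference (hypothetical schema).

    Fields are chosen so a clinical recommendation can later be
    reconstructed and traced to the input data actually used."""
    timestamp_utc: str
    model_version: str
    output_summary: str            # recommendation as shown to the clinician
    operator_id: str               # who reviewed (or overrode) the output
    input_data_refs: list = field(default_factory=list)  # e.g. DICOM study UIDs

entry = InferenceLogEntry(
    timestamp_utc=datetime.now(timezone.utc).isoformat(),
    model_version="cad-model-2.4.1",
    output_summary="Suspected nodule, right upper lobe; score 0.91",
    operator_id="radiologist-0042",
    input_data_refs=["1.2.840.113619.2.55.0001"],
)
# Serialise as one JSON line for an append-only audit trail
print(json.dumps(asdict(entry)))
```

Storing each record as a self-describing JSON line keeps the trail append-only and easy to export to a notified body or market surveillance authority on request.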
Transparency & Instructions for Use
The IFU must clearly describe AI capabilities, known performance limitations (especially across population subgroups), the nature and format of input data, and how outputs should be interpreted. Radiologists or clinicians using the AI must understand its intended operating conditions.
Human Oversight
The system must allow appropriately qualified clinical professionals to understand, oversee, and override AI outputs. For diagnostic AI, this means the output must function as a decision-support tool — the clinician retains final responsibility. Automated systems that bypass human review are a red flag under both the AI Act and the MDR.
Accuracy, Robustness & Cybersecurity
Must maintain performance across the input domain including edge cases. For medical AI, performance against the intended patient population must be validated on a sufficiently diverse test set. Cybersecurity is particularly critical — manipulation of AI inputs in medical settings could cause patient harm.
Article 113(c) — deferred deadline for Art 6(1) systems
Article 113 of the EU AI Act grants a deferred deadline specifically for Art 6(1) systems: "Article 6(1) and the corresponding obligations in this Regulation shall apply from 2 August 2027" — one year after the general August 2026 application date. The extra year acknowledges the additional complexity of the MDR/IVDR conformity route.
What changes in the period August 2025 – August 2027
Chapter III Section 4, Chapter V, Chapter VII, and Chapter XII apply from 2 August 2025. This means notified body accreditation, market surveillance notification frameworks, and governance structures are in place. New GPAI obligations are also live from August 2025. The Article 9–15 obligations — the substantive high-risk AI requirements — apply from 2 August 2027 for Art 6(1) systems.
The Omnibus Simplification proposal COM(2025) 836 adds a new point (d) to Article 113 that fundamentally changes the compliance timeline for all high-risk AI — including medical device AI.
New Art 113 point (d) — proposed mechanism
Chapter III Sections 1, 2, and 3 (the high-risk AI classification and obligation rules) shall apply 12 months after a Commission decision confirming that adequate compliance support measures (harmonised standards, common specifications, guidelines) are available for Annex I systems. In the absence of that decision within the required timeframe — or where the resulting date is later — the obligations apply from 2 August 2028 for systems classified as high-risk under Art 6(1) and Annex I.
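The proposed mechanism described above reduces to simple date arithmetic: obligations apply 12 months after the Commission's readiness decision, with 2 August 2028 as the backstop if no decision issues or the computed date would fall later. A sketch, assuming the proposal is enacted as described (function names are our own):

```python
from datetime import date
from typing import Optional

# Backstop date in the proposal, per the text above
FALLBACK = date(2028, 8, 2)

def add_months(d: date, months: int) -> date:
    """Shift a date forward by whole months, keeping the day."""
    y, m = divmod(d.month - 1 + months, 12)
    return d.replace(year=d.year + y, month=m + 1)

def application_date(decision_date: Optional[date]) -> date:
    """Application date for Art 6(1)/Annex I obligations under the
    proposed Art 113 point (d): 12 months after the Commission's
    readiness decision, capped at the 2 August 2028 backstop."""
    if decision_date is None:
        return FALLBACK
    return min(add_months(decision_date, 12), FALLBACK)

# Readiness decision in March 2027: obligations from March 2028
print(application_date(date(2027, 3, 1)))   # 2028-03-01
# No decision within the required timeframe: backstop applies
print(application_date(None))               # 2028-08-02
```

Note that on this reading the backstop operates as a latest possible date: an early Commission decision could bring the obligations forward of August 2028, but never push them past it.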
Current law deadline
2 August 2027 (16 months away)
Art 113(c) — already enacted
Proposed fallback deadline
2 August 2028 (28 months away)
COM(2025) 836 — pending agreement
Practical implication
If COM(2025) 836 is enacted, medical device AI providers would gain an additional year of preparation time before substantive AI Act obligations apply. However, MDR/IVDR obligations continue unaffected — only the AI Act deadline shifts. Providers should not pause MDR/IVDR quality system work in anticipation of the extension. COM(2025) 836 also clarifies the Art 43(3) conformity route for Annex I Section A systems and enables QMS integration between MDR/IVDR and AI Act requirements.
Dual compliance complexity: MDR/IVDR + AI Act
Common grey areas in medical AI classification:
Regumatrix analyses your system, identifies whether you face the Art 6(1) pathway or Annex III, maps your MDR/IVDR conformity route, and lists every Chapter III Section 2 obligation — cited to the exact article.
Start free — 3 analyses included