Regumatrix — AI compliance powered by Regulation (EU) 2024/1689

This tool is informational only and does not constitute legal advice.

Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026
High Risk — Annex III, Point 5 · Penalty: up to €15M or 3% of turnover

EU AI Act Compliance for Healthcare AI

AI systems used in medical diagnosis, treatment recommendation, patient triage, or clinical decision support are classified as high-risk under Annex III, point 5 of the EU AI Act. This triggers a comprehensive set of mandatory obligations.

Article 99 penalty for high-risk non-compliance

Providing or deploying a high-risk healthcare AI system that does not meet the obligations of Articles 9–17 carries a penalty of up to €15,000,000 or 3% of worldwide annual turnover, whichever is higher. The compliance deadline is 2 August 2026, or 2 August 2027 for systems embedded in CE-marked medical devices under Annex I.
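The "whichever is higher" rule can be sketched in a few lines. This is an illustrative calculation of the statutory ceiling only; actual fines are set case by case by the competent authorities, and the function name is ours, not the regulation's:

```python
def max_fine_eur(global_annual_turnover_eur: float) -> float:
    """Upper bound of the Article 99 penalty for high-risk non-compliance:
    EUR 15 million or 3% of worldwide annual turnover, whichever is higher."""
    if global_annual_turnover_eur < 0:
        raise ValueError("turnover must be non-negative")
    return max(15_000_000.0, 0.03 * global_annual_turnover_eur)
```

For a provider with €2 billion in turnover, the 3% limb (€60 million) exceeds the €15 million floor; below €500 million in turnover, the fixed €15 million figure is the ceiling.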

Not sure whether your clinical AI system falls under Annex III?

Describe what your system does and Regumatrix checks it against Annex III, Article 6, and the full regulation — in about 30 seconds. Your first 3 analyses are free.

Check my healthcare AI system — 3 free analyses included

Affected systems under Annex III(5)

  • AI-assisted diagnostic imaging and pathology tools
  • Clinical decision support systems (CDSS)
  • Patient triage and risk stratification tools
  • AI for monitoring vital signs or predicting deterioration
  • Surgical robotics with autonomous decision capability
  • Predictive tools for treatment selection or drug dosing

Mandatory obligations for healthcare AI providers

Art. 9: Maintain a documented risk management system throughout the AI lifecycle
Art. 10: Ensure training data is relevant, representative, and, to the best extent possible, free of errors and complete
Art. 11: Prepare and keep technical documentation before market placement
Art. 12: Enable automatic recording of events (logs) over the system's lifetime
Art. 13: Ensure transparency and provide instructions for use to deployers
Art. 14: Design for effective human oversight, overridable by qualified staff
Art. 15: Achieve appropriate accuracy, robustness, and cybersecurity
Art. 17: Implement a quality management system covering the full AI lifecycle
Art. 43: Undergo conformity assessment (notified body or internal control)
Art. 49: Register the AI system in the EU database before deployment
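In practice, teams track these duties in some internal gap-analysis structure. The sketch below is a hypothetical tracker of our own design (the summaries paraphrase the provider duties above, including logging under Art. 12); nothing about its shape comes from the regulation:

```python
from dataclasses import dataclass

@dataclass
class Obligation:
    article: str
    summary: str
    evidenced: bool = False  # flip to True once supporting documentation exists

# Hypothetical internal tracker; summaries paraphrase Articles 9-17
# plus the conformity assessment and registration duties.
PROVIDER_OBLIGATIONS = [
    Obligation("Art. 9", "Documented risk management system"),
    Obligation("Art. 10", "Relevant, representative training data"),
    Obligation("Art. 11", "Technical documentation before market placement"),
    Obligation("Art. 12", "Automatic event logging (record-keeping)"),
    Obligation("Art. 13", "Transparency and instructions for use"),
    Obligation("Art. 14", "Effective human oversight"),
    Obligation("Art. 15", "Accuracy, robustness, cybersecurity"),
    Obligation("Art. 17", "Quality management system"),
    Obligation("Art. 43", "Conformity assessment"),
    Obligation("Art. 49", "EU database registration"),
]

def open_gaps(obligations):
    """Return the article numbers still lacking compliance evidence."""
    return [o.article for o in obligations if not o.evidenced]
```

Running `open_gaps` before each release gives a quick view of which articles still need documented evidence.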

Are you a deployer (e.g. hospital)?

Hospitals and clinical settings deploying third-party AI tools are deployers under Article 3(4). You still carry obligations under Article 26: use the system in accordance with its instructions for use, assign human oversight to trained, competent staff, monitor its operation, and retain the automatically generated logs. Deployers that are bodies governed by public law, or private entities providing public services, must also complete a fundamental rights impact assessment under Article 27. The provider's CE marking and technical documentation do not eliminate your own obligations.

Key compliance deadlines

Feb 2025: Prohibited AI practices ban in force
Aug 2025: GPAI model rules in force
Aug 2026: High-risk obligations fully applicable, including Annex III healthcare AI
Aug 2027: High-risk rules extend to AI embedded in CE-marked medical devices (Annex I)

Could your healthcare AI be in a grey area?

Many healthcare AI products are described as “decision support” tools but in practice drive clinical decisions. The EU AI Act targets both the design intent and the real-world effect. If you recognise any of the following, get a precise analysis:

  • Your tool is described as informational but clinicians rely on its output without independent review
  • You use a third-party AI model embedded in a product sold to hospitals
  • Your system was CE-marked before the AI Act applied and hasn't been reassessed
  • You collect patient data for AI training without a published data governance policy
  • Your system outputs confidence scores but doesn't flag uncertain cases for forced human review
Check my clinical AI system →

Frequently asked questions

Is medical AI always high-risk under the EU AI Act?
Not always. Diagnostic AI, clinical decision support systems, and AI that directly influences treatment decisions are classified as high-risk under Annex III point 5 or under Article 6(1) if they are safety components of regulated medical devices. Administrative AI (scheduling, billing) or purely informational tools for clinicians may fall into limited or minimal risk categories.
How does the EU AI Act interact with the Medical Device Regulation (MDR)?
Medical AI that qualifies as a medical device (or IVD) under the MDR/IVDR and requires third-party conformity assessment is high-risk under Article 6(1) of the EU AI Act. These systems must comply with both frameworks: CE marking under the MDR/IVDR and conformity assessment under the AI Act. The AI Act conformity assessment can be integrated into the existing MDR procedure to reduce duplication.
What does human oversight mean for AI-assisted diagnosis?
Article 14 of the EU AI Act requires that high-risk AI systems be designed so a qualified person can oversee, interpret, override, or stop the system's output. For diagnostic AI, this means clinicians must retain ultimate decision authority — the AI provides a recommendation, not a binding conclusion — and the system must display its confidence level and flag unusual or uncertain cases for human review.
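One common building block for that last point is a confidence gate that routes uncertain outputs to mandatory clinician review. The sketch below is illustrative only: the Act prescribes no specific mechanism, and the 0.85 threshold is an assumption of ours, not a regulatory value:

```python
REVIEW_THRESHOLD = 0.85  # illustrative value; the Act sets no numeric threshold

def requires_human_review(confidence: float,
                          threshold: float = REVIEW_THRESHOLD) -> bool:
    """Route low-confidence diagnostic outputs to mandatory clinician review.
    One possible support for Article 14-style oversight; the regulation
    itself does not mandate this mechanism or any particular threshold."""
    if not 0.0 <= confidence <= 1.0:
        raise ValueError("confidence must be within [0, 1]")
    return confidence < threshold
```

A deployment would typically also log every gating decision, so the retained records show which outputs were escalated and why.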

Related compliance guides

  • Is my AI high-risk? (Checklist)
  • HR & Recruitment AI obligations
  • Financial services AI guide
  • Provider obligations checklist
  • Conformity assessment guide
  • All enforcement deadlines

Know your exact obligations before August 2026

Healthcare AI is among the highest-scrutiny sectors under the EU AI Act. If your system touches diagnosis, treatment recommendations, or patient triage, you need a precise analysis — not a general checklist.

Describe your AI system in plain language. Regumatrix checks it against every article of the EU AI Act and returns your risk tier, Annex classification, the exact obligations that apply, and your fine exposure under Article 99. Eight sections. About 30 seconds.

Analyse my healthcare AI free → · All compliance guides

8-section report · Article citations · ~30 seconds · No credit card