Regumatrix — AI compliance powered by Regulation (EU) 2024/1689

This tool is informational only and does not constitute legal advice.

Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026
Inter-regulation comparison · COM(2025) 837 single-entry incident reporting · Financial sector AI: both apply

EU AI Act vs DORA: Financial AI, ICT Risk Management & Incident Reporting

Financial entities deploying AI for credit scoring, insurance pricing, or fraud detection operate at the intersection of two heavyweight EU frameworks. DORA's ICT risk management, third-party risk, and incident reporting requirements apply alongside the AI Act's Article 9 risk management system and Article 73 serious incident reporting. COM(2025) 837 — still a proposal — would bring partial relief: DORA Art 19 major ICT incident reports would be submitted via a single ENISA notification platform covering DORA, NIS2, and GDPR, though AI Act Art 73 would remain a separate obligation.

DORA

Regulation (EU) 2022/2554 — in force January 2025

  • ICT risk management framework (Arts 5–16)
  • ICT incident classification and reporting (Arts 17–23)
  • TLPT — threat-led penetration testing (Arts 24–27)
  • ICT third-party risk management (Arts 28–44)
  • CTPP designation and ESA oversight framework
  • Information and intelligence sharing (Arts 45–49)
  • Applies January 2025 to all in-scope financial entities
  • Enforced by ECB, ESMA, EBA, EIOPA, national financial supervisors

EU AI Act

Regulation (EU) 2024/1689 — Annex III high-risk from August 2026

  • Annex III §5(b): creditworthiness assessment AI = high-risk
  • Annex III §5(c): life/health insurance pricing AI = high-risk
  • Conformity assessment + CE marking for Annex III AI
  • Art 9 risk management system per AI system
  • Human oversight (Art 14), post-market monitoring (Art 72)
  • Art 73 serious incident reporting to national MSA
  • Provider/deployer documentation chain (Arts 24–26)
  • National MSAs + AI Office enforcement

Side-by-side comparison

What it regulates
  • DORA: Operational resilience of ICT systems in the financial sector — continuity, performance, integrity, availability
  • EU AI Act: AI systems — based on risk to the health, safety, and fundamental rights of persons

Who it applies to
  • DORA: Financial entities — credit institutions, payment institutions, investment firms, insurance/reinsurance undertakings, crypto-ASPs, central counterparties, trading venues, credit rating agencies; not micro-enterprises
  • EU AI Act: Providers (develop/place AI on the market), deployers (use AI in a professional context), importers, distributors — regardless of sector

Risk management
  • DORA: Art 8 — comprehensive, well-documented ICT risk management framework covering availability, performance, and integrity; updated at least annually
  • EU AI Act: Art 9 — risk management system for each high-risk AI system; established, documented, continuously updated throughout the lifecycle; covers risks to health, safety, and fundamental rights

Incident reporting
  • DORA: Arts 17–19 — classify major ICT-related incidents → 4h initial notification + 72h intermediate + monthly final to the financial supervisor
  • EU AI Act: Art 73 — serious AI incident (death, serious harm, significant property/rights breach) → provider notifies the national MSA without undue delay

Third-party / supply chain
  • DORA: Arts 28–44 — comprehensive ICT third-party risk framework: register of providers, due diligence, contractual provisions (audit rights, exit strategies), CTPP designation
  • EU AI Act: Arts 24–26 — provider/deployer documentation chain: deployer entitled to AI Act technical documentation and conformity declaration; importer/distributor obligations

Enforcement authority
  • DORA: ECB, ESMA, EBA, EIOPA, and national sectoral financial supervisors; ESA Joint Committee for CTPP oversight
  • EU AI Act: National market surveillance authorities; EU AI Office for GPAI models and (under 836) VLOP-integrated AI

Penalty maximum
  • DORA: Financial entities — up to €5M or 1% of average daily global turnover (whichever lower); CTPPs — up to €5M or 1% of average daily global turnover
  • EU AI Act: €35M/7% for prohibited AI; €15M/3% for high-risk obligation violations; €7.5M/1.5% for incorrect information

Applicability date
  • DORA: January 2025 — directly applicable to all in-scope financial entities
  • EU AI Act: February 2025 (prohibitions); August 2026 (most high-risk obligations, including conformity assessment for Annex III systems)
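The staggered applicability dates can be sketched as a small timeline check. A minimal illustration only: the day-of-month values below are an added assumption (the comparison above gives only month and year), and `obligations_in_force` is an invented helper name.

```python
from datetime import date

# Applicability milestones from the comparison above.
# Day-of-month values are assumptions for illustration.
MILESTONES = [
    (date(2025, 1, 17), "DORA applies to in-scope financial entities"),
    (date(2025, 2, 2), "AI Act prohibitions apply"),
    (date(2026, 8, 2), "AI Act Annex III high-risk obligations apply"),
]

def obligations_in_force(on: date) -> list[str]:
    """Return the milestones already applicable on a given date."""
    return [label for milestone, label in MILESTONES if on >= milestone]
```

Running the check for mid-2025 returns only the first two milestones; from August 2026 all three apply.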

Key overlap zones

Financial sector high-risk AI — Annex III §5 ↔ DORA financial entities

AI Act Annex III §5 lists as high-risk: AI systems for creditworthiness assessment (§5(b)) and AI systems for life and health insurance risk assessment and premium-setting (§5(c)). The organisations deploying these systems — banks, FinTech lenders, insurance companies — are precisely the DORA-regulated financial entities. The same AI system therefore triggers full AI Act Chapter III obligations (risk management, conformity assessment, CE marking, human oversight, post-market monitoring) AND must be managed within the financial entity's DORA ICT risk framework. DORA has applied since January 2025; the AI Act's high-risk obligations apply from August 2026.

AI providers as ICT third-party service providers — DORA Arts 28–44 ↔ AI Act Arts 24–26

A financial entity deploying a third-party AI SaaS — credit scoring API, fraud detection platform, trading risk AI — treats the AI provider as an ICT third-party service provider under DORA. The financial entity must carry out due diligence on the provider's security, governance, and resilience; the contract must include specific provisions under DORA Art 30 (audit rights, exit strategies, data portability, SLAs). From the AI Act side, the provider must supply technical documentation (Art 11), a Declaration of Conformity (Art 47), and instructions for use (Art 26(6)). In practice, the DORA Art 30 ICT contract should incorporate these AI Act documentation provisions — the overlapping requirements reinforce each other and create a single due diligence exercise.

Risk management frameworks — AI Act Art 9 ↔ DORA Art 8

Both require systematic, documented risk management — but they measure different dimensions. DORA Art 8 covers ICT operational resilience: availability, performance continuity, data integrity, recovery time objectives. AI Act Art 9 covers AI-specific risks: data quality hazards, performance drift, biased outputs, adversarial manipulation, and lifecycle risks to the persons affected. A DORA entity's ICT risk framework should incorporate AI-specific risk vectors (model drift, training data bias, adversarial attacks) alongside ICT resilience metrics. This is not legally required — both frameworks operate separately — but integrated risk registers avoid gaps and reduce compliance overhead.

Incident reporting — DORA Arts 17–19 ↔ AI Act Art 73

DORA major ICT incident reporting (4h/72h/monthly) and AI Act Art 73 serious incident reporting apply independently, potentially to different parties for the same event. A credit scoring AI malfunction that causes automated loan rejections affecting thousands of customers may simultaneously be: a DORA major ICT-related incident (the bank reports to its financial supervisor) and an AI Act serious incident (the AI provider reports to the national MSA). The financial entity and the AI provider may be the same legal entity or different — the reporting obligations do not merge. If 837 is adopted, DORA, NIS2, and GDPR notifications converge on the ENISA single-entry point; AI Act Art 73 stays separate.

Incident reporting: who reports what, to whom

DORA Art 19
  • Who reports: financial entity (bank, insurer, investment firm)
  • To whom: financial sector competent authority (ECB/ESMA/EBA/EIOPA/national)
  • Timeline: 4h initial → 72h intermediate → monthly final

NIS2 Art 23
  • Who reports: financial entity (all banks/insurers qualify as NIS2 essential entities)
  • To whom: national NIS2 competent authority
  • Timeline: 24h early warning → 72h assessment → 30-day final

AI Act Art 73
  • Who reports: AI system provider
  • To whom: national market surveillance authority
  • Timeline: without undue delay

837 single-entry point (when operational)
  • Who reports: financial entity
  • To whom: ENISA platform → distributed to DORA + NIS2 + GDPR authorities simultaneously
  • Timeline: via ENISA; the underlying timelines apply

AI Act Art 73 after 837
  • Who reports: AI system provider (unchanged)
  • To whom: national MSA — NOT part of the ENISA platform
  • Timeline: unchanged — separate obligation
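The matrix above can be expressed as a small routing sketch. This is a hypothetical illustration, not legal advice: `required_reports` and its flag names are invented, and real obligations depend on entity-specific scoping.

```python
def required_reports(is_financial_entity: bool, is_ai_provider: bool,
                     personal_data_breach: bool,
                     enisa_platform_live: bool) -> list[tuple[str, str]]:
    """Map one incident to (obligation, recipient/channel) pairs,
    mirroring the who-reports-what matrix above. Illustrative only."""
    via_enisa = "ENISA single-entry point"
    reports = []
    if is_financial_entity:
        reports.append(("DORA Art 19",
                        via_enisa if enisa_platform_live
                        else "financial sector competent authority"))
        reports.append(("NIS2 Art 23",
                        via_enisa if enisa_platform_live
                        else "national NIS2 authority"))
        if personal_data_breach:
            reports.append(("GDPR Art 33",
                            via_enisa if enisa_platform_live
                            else "data protection authority"))
    if is_ai_provider:
        # AI Act Art 73 stays outside the ENISA platform under COM(2025) 837.
        reports.append(("AI Act Art 73",
                        "national market surveillance authority"))
    return reports
```

For a bank that is also the AI provider, a personal-data-affecting incident yields four reports before the platform goes live; afterwards the first three route via ENISA while the Art 73 report still goes to the MSA.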

Is your financial entity deploying high-risk AI under Annex III §5?

Regumatrix maps your AI deployments against Annex III §5, DORA ICT risk categories, and 837 incident reporting obligations — generating a combined compliance checklist for both frameworks in a single analysis.

Check my AI Act + DORA obligations — 3 free analyses

COM(2025) 837: DORA incident reporting via single-entry point

PROPOSAL — not yet enacted law

COM(2025) 837 — DORA Art 19 amendments: join the ENISA reporting platform

Currently, a bank experiencing a major AI-related ICT incident must file: (1) a DORA Art 19 report to its financial supervisor, (2) a NIS2 Art 23 report to its NIS2 authority, and (3) if personal data is involved, a GDPR Art 33 report to the data protection authority — three separate reports for the same incident. 837 collapses this to one.

DORA Art 19(1) amended — mandatory reports via ENISA platform

Financial entities required to report major ICT-related incidents under DORA Art 19(1) must submit those reports via the NIS2 Art 23a ENISA single-entry point once it is operational. The ENISA platform distributes the notification to the DORA competent authority. The underlying reporting timeline (4h/72h/monthly) is unchanged — only the submission channel changes.

DORA Art 19(2) amended — voluntary significant cyber threat notifications

Voluntary notifications of significant cyber threats under DORA Art 19(2) are also submitted via the ENISA single-entry point. Financial entities that observe significant cyber threats (below the major incident threshold) can submit one notification covering DORA and NIS2 Art 30 voluntary notification simultaneously.

Who benefits most: banks, insurers, investment firms

  • All credit institutions (banks) are simultaneously NIS2 essential entities — currently dual-reporting; after 837, a single report covers DORA + NIS2
  • Insurance companies processing personal health data are NIS2 essential entities AND GDPR controllers — a single report covers DORA + NIS2 + GDPR for the same breach
  • FinTechs that are also payment institutions, e-money institutions and/or crypto-ASPs face the same consolidation benefit

What the single-entry point does NOT cover

The ENISA platform covers: NIS2, GDPR, DORA, eIDAS, and CER. The EU AI Act Art 73 serious incident reporting is not included. A financial entity that is also the AI system provider for its own credit scoring AI must still file a separate AI Act Art 73 report to the national MSA for a qualifying serious incident — in addition to the DORA + NIS2 + GDPR notification via the ENISA platform.

Practical compliance map for financial entities

1. Confirm DORA entity status

Verify your entity type is in-scope for DORA — credit institutions, insurance undertakings, investment firms, payment institutions, crypto-ASPs, and others. Micro-enterprises are exempt from most substantive obligations. All in-scope entities were required to comply from January 2025.

2. Identify Annex III §5 AI systems

Audit AI deployments: are any used for creditworthiness assessment (§5(b)) or life/health insurance risk assessment and pricing (§5(c))? These are high-risk under the AI Act and require conformity assessment (third-party in most cases), CE marking, technical documentation, risk management system, human oversight, and post-market monitoring — from August 2026.
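A first-pass inventory filter for this step might look like the following sketch. The use-case labels are invented shorthand for the two §5 entries named above; a real audit would work from system-level documentation, not strings.

```python
# Invented shorthand labels mapped to the Annex III §5 entries above.
ANNEX_III_5 = {
    "creditworthiness_assessment": "Annex III §5(b) — high-risk",
    "life_health_insurance_pricing": "Annex III §5(c) — high-risk",
}

def classify(use_case: str) -> str:
    """Flag a deployment as high-risk if it matches a §5 entry."""
    return ANNEX_III_5.get(
        use_case, "not in Annex III §5 — assess other Annex III areas")

# Example inventory: only the credit scoring system is caught by §5.
inventory = ["creditworthiness_assessment", "fraud_detection"]
flags = {use_case: classify(use_case) for use_case in inventory}
```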

3. Incorporate AI risk vectors into DORA ICT risk management

DORA Art 8 requires a comprehensive ICT risk management framework. Extend it to cover AI-specific risks: model drift, training data quality degradation, adversarial inputs, biased outputs, third-party AI model dependency. While not legally required to satisfy AI Act Art 9, integrated risk registries reduce duplication and demonstrate governance maturity to both DORA and AI Act supervisors.

4. Align AI provider contracts with DORA Art 30

DORA Art 30 ICT contracts must include audit rights, exit strategies, data portability, SLAs. Request AI Act documentation as part of procurement: technical documentation (Art 11), Declaration of Conformity (Art 47), instructions for use (Art 26(6)). For critical AI providers, run DORA supply chain risk assessment AND AI Act conformity documentation review in a single procurement checkpoint.
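The single procurement checkpoint described above can be sketched as a gap check across both item lists — the item labels are shorthand summaries of the clauses and documents named in this step, not statutory text.

```python
# Combined procurement checkpoint for one AI vendor: DORA Art 30
# contract clauses plus AI Act provider documentation (shorthand labels).
DORA_ART30_CLAUSES = {"audit rights", "exit strategy",
                      "data portability", "SLAs"}
AI_ACT_DOCS = {"technical documentation", "declaration of conformity",
               "instructions for use"}

def procurement_gaps(contract_clauses: set[str],
                     supplied_docs: set[str]) -> dict[str, set[str]]:
    """Return the missing items per framework for one vendor review."""
    return {
        "DORA Art 30": DORA_ART30_CLAUSES - contract_clauses,
        "AI Act": AI_ACT_DOCS - supplied_docs,
    }
```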

5. Prepare dual incident response procedures

Establish separate incident response tracks: DORA Art 19 report → financial supervisor (4h/72h/monthly) and AI Act Art 73 report → national MSA (if you are the provider). When 837 ENISA platform becomes operational, consolidate DORA + NIS2 + GDPR notifications there. Monitor for potential extension of the ENISA platform to AI Act Art 73.

Frequently Asked Questions

Do the EU AI Act and DORA both apply to financial entities deploying AI?

Yes. DORA (Regulation (EU) 2022/2554) applies to financial entities including credit institutions, payment institutions, investment firms, insurance and reinsurance undertakings, crypto-asset service providers, central counterparties, and others. The EU AI Act applies to AI systems placed on or used in the EU market. Where a financial entity deploys an AI system for creditworthiness assessment or life/health insurance risk pricing — both listed in Annex III §5 of the AI Act as high-risk — that entity must simultaneously comply with: DORA's ICT risk management framework, ICT third-party risk obligations, and incident reporting requirements; AND the AI Act's Chapter III high-risk AI obligations (risk management, conformity assessment, human oversight, technical documentation, CE marking).

Does DORA's ICT risk management satisfy the AI Act's risk management requirement?

No. DORA Art 8 and AI Act Art 9 are parallel obligations measuring different risks. DORA Art 8 requires a comprehensive, well-documented ICT risk management framework covering availability, performance, and integrity of ICT systems — focused on the operational resilience of financial services. AI Act Art 9 requires a risk management system for each individual high-risk AI system, continuously updated throughout the lifecycle, covering risks to the health, safety, and fundamental rights of the natural persons affected by the AI. A DORA ICT risk management framework does not satisfy AI Act Art 9, though practical implementation can be designed to inform both: risk registry entries, change management logs, and incident records can cover both dimensions if structured correctly.

Are AI system providers ICT third-party service providers under DORA?

In practice, yes. DORA Art 28 requires financial entities to identify ICT third-party service providers and manage the associated risks. An AI system provider delivering a SaaS credit scoring tool, fraud detection platform, or risk assessment AI to a bank or insurer qualifies as an ICT third-party service provider. The financial entity must assess the provider's security, and contractual provisions must include audit rights, exit strategies, and data portability (DORA Art 30). The AI Act's documentation chain — technical documentation (Art 11), Declaration of Conformity (Art 47), and instructions for use (Art 26(6)) — provides supporting documentation that can be used in DORA supply chain due diligence.

What is a Critical ICT Third-Party Provider and could an AI provider be one?

A Critical ICT Third-Party Provider (CTPP) is an ICT service provider designated by the Joint Committee of the European Supervisory Authorities (ESAs) based on its systemic importance to the financial sector. Designated CTPPs are subject to a direct oversight framework by the ESAs — independent of AI Act enforcement. An AI system provider that is critical infrastructure to the European financial system — for example, a provider whose AI runs credit risk assessments across dozens of major banks — could be designated a CTPP. In that case, it faces both: the ESA oversight framework under DORA AND the AI Act obligations applicable to it as an AI system provider.

What does COM(2025) 837 change about DORA incident reporting?

COM(2025) 837 proposes to amend DORA Art 19(1) and 19(2) so that financial entities submit major ICT incident reports and voluntary significant cyber threat notifications via the NIS2 Art 23a single-entry ENISA platform. This is significant because most banks, insurers, and investment firms are also NIS2 essential entities — meaning a single major ICT incident currently requires separate reports to the DORA competent authority and the NIS2 authority. After 837, one submission via ENISA covers both. GDPR personal data breach notification can also be bundled if the incident involves personal data. Note: AI Act Art 73 serious incident reporting is NOT included in the single-entry point — it remains a separate obligation to the national market surveillance authority.

Related Compliance Guides

High-Risk AI Systems — Chapter III

Annex III §5 — creditworthiness and insurance AI triggers full high-risk conformity assessment.

EU AI Act vs GDPR

Financial AI processing personal data faces GDPR on top of DORA and AI Act obligations.

EU AI Act vs NIS2

Banks and insurers are NIS2 essential entities — 837 single-entry point covers all three: DORA + NIS2 + GDPR.

AI Act Market Surveillance & Enforcement

AI Act Art 73 serious incident reporting — remains separate from ENISA platform.

Digital Omnibus 837 — Full Overview

Complete overview of COM(2025) 837 including DORA Art 19 amendments and ENISA single-entry point.

Data Breach Notification Changes (837)

GDPR + NIS2 + DORA personal data breach notification via one submission — explained.

Manage your financial AI obligations across DORA and the AI Act

Regumatrix maps your financial AI systems against Annex III §5, identifies ICT third-party risk obligations, and generates a combined DORA + AI Act compliance checklist — including 837 incident reporting implications.

Start free analysis