
Regumatrix — AI compliance powered by Regulation (EU) 2024/1689

This tool is informational only and does not constitute legal advice.

Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026
High Risk — Annex III, points 5(b) & 5(c) · Penalty: up to €15M or 3% of turnover

EU AI Act: Financial Services AI Compliance

AI systems used in creditworthiness assessment, insurance risk pricing, and financial decision-making affecting natural persons are high-risk under Annex III, points 5(b) and 5(c). Mandatory obligations apply from 2 August 2026.

Article 99 penalty for high-risk financial AI non-compliance

Financial AI that does not meet Articles 9–17 obligations by the deadline carries a penalty of up to €15,000,000 or 3% of global annual turnover, whichever is higher. This is separate from — and in addition to — existing DORA and GDPR penalties. The deadline is 2 August 2026.
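The "whichever is higher" cap is simple arithmetic. A minimal sketch using the Article 99 figures above (the function name is illustrative, not from the regulation):

```python
def article_99_exposure(global_annual_turnover_eur: float) -> float:
    """Maximum Article 99 fine for high-risk non-compliance:
    the higher of EUR 15,000,000 or 3% of global annual turnover."""
    return max(15_000_000.0, 0.03 * global_annual_turnover_eur)

# A firm with EUR 2bn turnover: 3% is EUR 60m, above the fixed floor
print(article_99_exposure(2_000_000_000))
# A firm with EUR 100m turnover: 3% is EUR 3m, so the EUR 15m floor applies
print(article_99_exposure(100_000_000))
```

For smaller firms the fixed €15M figure dominates; the 3%-of-turnover branch only bites above €500M in global annual turnover.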

Not sure if your credit scoring or fraud detection model is high-risk?

Describe what your system does and Regumatrix checks it against Annex III, Article 6, and the full regulation — in about 30 seconds. Your first 3 analyses are free.

Check my financial AI system — 3 free analyses included

High-risk financial AI systems

  • Credit scoring and creditworthiness assessment AI
  • Insurance risk classification and pricing models
  • Fraud detection affecting account access decisions
  • AI for loan approval or rejection
  • AI used in the administration of social benefit entitlements
  • AI for collateral value assessment with binding outputs

Obligations for financial AI providers and deployers

  • Art. 9: Risk management system addressing model risk, fairness, and over-reliance
  • Art. 10: Data governance — training data must be relevant, representative, and free of encoded historical bias
  • Art. 11: Technical documentation, maintained and available to supervisory authorities
  • Art. 12: Automatic logging sufficient for a supervisory authority audit trail
  • Art. 13: Transparency, including instructions for use so deployers understand the system's capabilities and limitations
  • Art. 14: Human oversight — the system must not autonomously approve or deny credit without effective human review
  • Art. 86: Right to explanation — affected persons may request a clear explanation of AI-assisted decisions
  • Art. 43: Conformity assessment before placing the AI system on the market
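To make the Article 12 logging obligation concrete, here is a minimal sketch of an append-only decision-log record. The schema, field names, and values are all hypothetical illustrations; the Act requires adequate record-keeping but does not prescribe a format:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class CreditDecisionLog:
    """One logged decision event (illustrative schema, not prescribed by the Act)."""
    timestamp: str         # when the system produced the output (Art. 12)
    model_version: str     # identifies the system version in use
    input_reference: str   # pointer to the input record, not the data itself
    output: str            # the system's recommendation or decision
    human_reviewer: str    # who exercised oversight (Art. 14)

def log_decision(record: CreditDecisionLog) -> str:
    """Serialise one decision event as a JSON line for an append-only log."""
    return json.dumps(asdict(record))

entry = CreditDecisionLog(
    timestamp=datetime.now(timezone.utc).isoformat(),
    model_version="credit-score-v4.2",          # hypothetical
    input_reference="application/2026-001234",  # hypothetical
    output="refer_to_underwriter",
    human_reviewer="underwriter_17",
)
print(log_decision(entry))
```

Logging a reference to the input rather than the input itself keeps the audit trail reconstructable for supervisors without duplicating personal data, which also eases GDPR data-minimisation pressure.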

DORA & GDPR overlap

Financial entities are already subject to DORA (Digital Operational Resilience Act) and GDPR Article 22 on automated decision-making. The EU AI Act adds a third compliance layer: the AI system itself must pass high-risk conformity requirements regardless of existing DORA or GDPR controls. These frameworks are complementary, not substitutable.

Financial AI grey zones to watch

The boundary between high-risk and limited-risk in financial AI turns on whether a system directly affects access to financial products for individuals. Many firms underclassify their tools. Check if any of these scenarios apply:

  • Fraud detection that can freeze accounts or deny transactions without human review
  • Insurance pricing models where the AI output directly sets the premium offered
  • AI that pre-screens applicants before a human underwriter sees the file
  • Credit bureau scoring models licensed to third-party lenders via API
  • AML risk-scoring systems that trigger customer exit or account restriction
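The classification test running through the scenarios above can be reduced to one question. A deliberately crude triage sketch (not legal advice; a real classification needs the full Annex III and Article 6 analysis):

```python
def provisional_risk_tier(affects_individual_access: bool) -> str:
    """Crude first-pass triage: does the system directly affect an
    individual's access to financial products? If yes, it points
    towards high-risk under Annex III; if no, towards limited or
    minimal risk. Illustrative only."""
    if affects_individual_access:
        return "likely high-risk (Annex III)"
    return "likely limited or minimal risk"

# Fraud model that can freeze accounts without human review
print(provisional_risk_tier(True))
# Back-office transaction pattern analytics with no individual decisions
print(provisional_risk_tier(False))
```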
Check my financial AI system →

Frequently asked questions

Is credit scoring AI high-risk under the EU AI Act?
Yes. AI systems used for creditworthiness assessment, credit scoring, loan decisioning, and insurance risk evaluation fall under Annex III point 5(b) of the EU AI Act and are classified as high-risk. Full high-risk obligations including risk management (Article 9), human oversight (Article 14), and conformity assessment apply from 2 August 2026.
Does the EU AI Act apply to fraud detection AI?
Fraud detection AI that influences individual credit or payment decisions may qualify as high-risk under Annex III. AI used purely for transaction pattern analysis without individual-level decision-making may fall into limited or minimal risk. The classification depends on whether the system directly affects access to financial services for individuals.
How does the EU AI Act interact with existing financial regulation (EBA, EIOPA)?
The EU AI Act operates alongside sector-specific financial regulation. EBA guidelines on model risk and EIOPA guidance on AI in insurance both overlap with AI Act requirements around explainability, data governance, and human oversight. Firms already complying with EBA/EIOPA AI frameworks will find significant overlap with Articles 9–15 of the EU AI Act.

Related compliance guides

  • Is my AI high-risk? (Checklist)
  • Healthcare AI obligations
  • HR & Recruitment AI guide
  • Fines & penalties (Article 99)
  • GPAI / foundation model rules
  • All enforcement deadlines

Know your exact obligations before August 2026

Financial services AI sits at the intersection of three major regulatory frameworks: the EU AI Act, DORA, and GDPR Article 22. Compliance with one does not satisfy the others. Get a clear picture of your AI Act exposure specifically.

Describe your AI system in plain language. Regumatrix checks it against every article of the EU AI Act and returns your risk tier, Annex classification, the exact obligations that apply, and your fine exposure under Article 99. Eight sections. About 30 seconds.

Analyse my financial AI free →
All compliance guides

8-section report · Article citations · ~30 seconds · No credit card