
Regumatrix — AI compliance powered by Regulation (EU) 2024/1689

This tool is informational only and does not constitute legal advice.

Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026
You build, modify, or distribute AI systems · Up to €15M or 3% of global turnover (Art. 99)

AI Provider Obligations: Your Complete EU AI Act Checklist

If you develop an AI system and place it on the EU market — or have it developed under your name or trademark — you are a “provider” under the EU AI Act. Article 16 sets out 12 mandatory obligations. Here is every one of them, organised in the order you need to meet them.

Placing a non-compliant high-risk AI system on the EU market triggers Article 99 penalties from day one.

Breaches of Chapter III high-risk provider obligations carry fines of up to €15,000,000 or 3% of global annual turnover (whichever is higher). Prohibited AI practices under Article 5 reach €35,000,000 or 7%. For SMEs (and, if COM(2025) 836 is enacted, SMCs), Article 99(6) caps penalties at the lower of the two figures — build this into your risk analysis.
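To make the cap mechanics concrete, here is a minimal Python sketch of the two penalty tiers quoted above. The function name and the lower-of flag are illustrative shorthand, not terms from the Act, and this is not legal advice:

```python
def max_fine_eur(global_turnover_eur, tier, lower_of_cap=False):
    """Worst-case Art. 99 exposure for the two tiers discussed above.

    tier: "prohibited" (Art. 5: EUR 35M or 7%) or
          "high_risk" (Chapter III: EUR 15M or 3%).
    Large enterprises: the HIGHER of fixed amount and percentage applies.
    Where the Art. 99(6) cap applies, the LOWER of the two applies instead.
    Illustrative only -- not legal advice.
    """
    fixed, percent = {"prohibited": (35_000_000, 7),
                      "high_risk": (15_000_000, 3)}[tier]
    pct_amount = global_turnover_eur * percent / 100
    return min(fixed, pct_amount) if lower_of_cap else max(fixed, pct_amount)

# An SME with EUR 5M global turnover breaching Chapter III:
# 3% = EUR 150,000, below the EUR 15M fixed maximum, so it applies.
print(max_fine_eur(5_000_000, "high_risk", lower_of_cap=True))   # 150000.0
```

The same function shows why the caps matter most at the extremes: for a €2B-turnover enterprise, 3% (€60M) overtakes the €15M fixed amount.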

GPAI providers — Chapter V obligations

In force since August 2025

High-risk providers — Chapter III obligations

Apply from 2 August 2026 (4 months away)

Not sure whether your system is high-risk?

Regumatrix checks your system against every Annex III category, returns your risk tier and Annex classification, and lists every Article 16 obligation that fires — including your conformity assessment route, CE marking requirements, and fine exposure under Article 99. Takes about 30 seconds.

Check my system now — 3 free analyses included
PHASE 1 — DESIGN & BUILD

Before you build: requirements your system must meet

Most of these are Section 2 requirements (Arts 8–15); the Art. 17 quality management system sits alongside them. They determine how your AI system is designed and trained. They cannot be retrofitted after launch — build them in from the start.

Art. 9

Risk management system

Set up a continuous, iterative risk management process running through the entire product lifecycle — from design through decommissioning. Identify and analyse all known and foreseeable risks to health, safety, and fundamental rights. Evaluate risks under normal use and under conditions of reasonably foreseeable misuse. Test against prior-defined metrics before placing on the market.

Art. 10

Data and data governance

Training, validation, and testing datasets must be relevant, representative, free of errors to the extent possible, and examined for bias. Art. 10(5) allows processing special category personal data for bias detection only, subject to six strict conditions including pseudonymisation, no third-party access, and mandatory deletion after use.

Art. 14

Human oversight design

Design in tools that let oversight persons understand what the system can and cannot do, detect anomalies and unexpected performance, avoid automation bias, override or disregard outputs when needed, and halt the system completely via a stop function. These are design-time requirements — they cannot be retrofitted after launch.

Art. 15

Accuracy, robustness and cybersecurity

Achieve an appropriate level of accuracy for the intended purpose throughout the system's lifetime. The system must be resilient against errors, faults, and inconsistencies. Address cybersecurity risks: model poisoning, adversarial inputs, and unauthorised third-party attempts to alter the system's outputs or behaviour.

Art. 17

Quality management system

Put a documented QMS in place before conformity assessment. It must cover: regulatory compliance strategy, design control, development quality, test and validation procedures, technical specs and standards, data management, risk management (Art. 9), post-market monitoring (Art. 72), and incident reporting (Art. 73). Implementation must be proportionate to organisation size.

PHASE 2 — BEFORE LAUNCH

Before you place on the market: conformity, CE marking, and registration

These steps must all be completed before you sell, deploy, or make your system available. Skipping any one of them puts you in non-compliance the moment you launch.

Art. 11

Technical documentation

Draw up Annex IV documentation before placing the system on the market and keep it up to date. Minimum content includes system description, design specifications, training data details, logging capabilities, conformity assessment steps, and intended purpose. SMEs and start-ups may use the Commission's simplified Annex IV form — notified bodies must accept it.

Art. 43

Conformity assessment

Annex III point 1 (biometric identification): notified body required unless harmonised standards apply. Annex III points 2–8 (all other high-risk categories): self-assess using Annex VI internal control procedure. Annex I product-safety systems: follow the relevant sectoral law (e.g. MDR for medical devices). After a substantial modification, repeat the assessment from scratch.
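The route selection above is effectively a small decision procedure. A hypothetical sketch of that logic (function name and return strings are ours, not the Regulation's, and this is not a legal determination):

```python
def assessment_route(annex_iii_point=None, annex_i_product=False,
                     harmonised_standards_applied=False):
    """Map a high-risk category to its Art. 43 route as described above.

    annex_iii_point: 1-8 for an Annex III category, None if not listed.
    annex_i_product: True for Annex I product-safety systems.
    Illustrative mapping of the text above only.
    """
    if annex_i_product:
        # Product-safety systems follow the sectoral harmonisation law.
        return "sectoral conformity assessment (e.g. MDR)"
    if annex_iii_point == 1:
        # Biometric identification: notified body, unless harmonised
        # standards exist and are applied.
        return ("internal control (Annex VI)" if harmonised_standards_applied
                else "notified body (third party)")
    if annex_iii_point in range(2, 9):
        # All other Annex III categories self-assess.
        return "internal control (Annex VI)"
    return "no Art. 43 assessment (not high-risk)"

# e.g. an employment screening tool (Annex III point 4):
print(assessment_route(annex_iii_point=4))   # internal control (Annex VI)
```

Remember the closing caveat above: a substantial modification sends you back through this decision tree from the start.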

Art. 13

Instructions for use — transparency to deployers

Provide deployers with clear instructions covering: provider identity and contact, system capabilities and intended purpose, performance characteristics and known limitations, circumstances that may affect accuracy or cause drift, monitoring and maintenance guidance, and any restrictions on use. Deployers cannot complete their GDPR DPIA (Art. 35) without adequate Art. 13 documentation from you.

Art. 47 + 48

EU declaration of conformity + CE marking

Draw up a signed, machine-readable EU declaration of conformity (Annex V content) for each high-risk AI system. Affix the CE marking visibly and legibly — or digitally for software-only systems. Keep the declaration available for 10 years. By signing, you assume full responsibility for compliance with Section 2 requirements.

Art. 49(1)

EU database registration

Register yourself and your system in the EU AI database before placing on the market or putting into service — except for Annex III point 2 systems, which are registered at national level. Law enforcement, immigration, and border control systems use a secure non-public section. Note: COM(2025) 836 proposes deleting Art. 49(2), removing registration for non-high-risk Art. 6(3) systems.

PHASE 3 — ONGOING AFTER LAUNCH

After launch: monitoring, records, and authority cooperation

Provider obligations do not end at launch. These are continuous obligations from the moment your system goes live.

Art. 18 + 19

Documentation and log retention

Keep all technical documentation, QMS documentation, notified body decisions, and the EU declaration of conformity for 10 years after the system is placed on the market (Art. 18). Separately, retain the automatically generated operational logs from your high-risk AI system for at least 6 months — longer if required by applicable law (Art. 19).

Art. 72

Post-market monitoring

Establish and document a post-market monitoring system proportionate to the system's risks. Actively and systematically collect, document, and analyse performance data throughout the product's lifetime — including from deployers. Include a post-market monitoring plan in your Annex IV technical documentation. Under COM(2025) 836, the mandatory harmonised template is removed and replaced by Commission guidance.

Art. 20

Corrective actions and duty to inform

If you have reason to believe your system is non-compliant with this Regulation, immediately take corrective action: bring it into conformity, withdraw, disable, or recall it. Notify distributors, deployers, authorised representatives, and importers. If the system poses a risk within the meaning of Art. 79(1), immediately investigate and notify the relevant market surveillance authority and notified body.

Art. 21

Cooperate with competent authorities

Upon a reasoned request from a national competent authority, provide all information and documentation needed to demonstrate compliance with Section 2 requirements — in a language the authority can easily understand. Also provide access to the automatically generated logs when requested. Authorities must treat this information as confidential (Art. 78).

GPAI providers: a different obligation set (Article 53)

If you develop a general-purpose AI model — a foundation model that others integrate into products — you have a separate obligation set under Chapter V. You do not run conformity assessment, do not affix CE marking, and do not register in the EU database. Instead, you have four obligations owed to downstream users and the AI Office. These have applied since August 2025.

Art. 53(1)(a)

Technical documentation

Draw up and keep up to date technical documentation including the training process, testing procedures, and evaluation results — minimum content set out in Annex XI. Provide to the AI Office and national competent authorities upon request. Open-weights models under a free and open-source licence are exempt from this obligation — unless the model has systemic risk (>10²⁵ FLOPs threshold under Art. 51).
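To gauge whether a model might cross the Art. 51 presumption, a common engineering heuristic estimates cumulative training compute as roughly 6 × parameters × training tokens. The heuristic and the example figures below are assumptions on our part; the Act itself only states the 10²⁵ FLOPs threshold:

```python
SYSTEMIC_RISK_FLOPS = 1e25   # Art. 51 presumption threshold cited above

def estimated_training_flops(n_params, n_tokens):
    """Rough cumulative training compute for a dense transformer,
    using the widely used ~6 * N * D rule of thumb (an estimate only;
    the Act does not prescribe a calculation method)."""
    return 6 * n_params * n_tokens

# Hypothetical example: a 70B-parameter model trained on 15T tokens.
flops = estimated_training_flops(70e9, 15e12)
print(f"{flops:.1e}")                  # 6.3e+24
print(flops >= SYSTEMIC_RISK_FLOPS)    # False: below the presumption
```

A model an order of magnitude larger in either dimension would cross the threshold and lose the open-source exemption.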

Art. 53(1)(b)

Downstream provider information

Make available to downstream AI system providers information that enables them to understand the model's capabilities and limitations and to comply with their own obligations. Minimum content set out in Annex XII. This obligation enables downstream providers to build compliant products — without it, they cannot complete their own technical documentation or conformity assessments.

Art. 53(1)(c)

Copyright compliance policy

Put in place a policy to comply with Union copyright and related rights law, including identifying and honouring rights reservation notices expressed under Article 4(3) of Directive (EU) 2019/790 using state-of-the-art technologies. This obligation applies to all GPAI providers — including open-source model providers.

Art. 53(1)(d)

Training data summary

Draw up and make publicly available a sufficiently detailed summary about the content used for training, using an AI Office template. This must be publicly available — not just on request. Open-source models with publicly released weights are exempt from Art. 53(1)(a) and (b) but not from (c) and (d). Models above the systemic risk threshold are never exempt from any obligation under Art. 53.
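The exemption matrix running through the Art. 53 entries above can be summarised as a small lookup. The shorthand labels for Art. 53(1)(a)–(d) are ours, purely for illustration:

```python
def applicable_art53_obligations(open_weights_foss, systemic_risk):
    """Which Art. 53(1) obligations apply, per the exemption rules above.

    open_weights_foss: released under a free and open-source licence
    with publicly available weights.
    systemic_risk: presumed systemic-risk under Art. 51.
    Illustrative summary of the text above only.
    """
    obligations = {"(a) technical documentation",
                   "(b) downstream information",
                   "(c) copyright policy",
                   "(d) training data summary"}
    if open_weights_foss and not systemic_risk:
        # The open-source exemption lifts (a) and (b), never (c) or (d);
        # systemic-risk models are never exempt from anything.
        obligations -= {"(a) technical documentation",
                        "(b) downstream information"}
    return obligations

# Open-weights model below the threshold: only (c) and (d) remain.
print(sorted(applicable_art53_obligations(True, False)))
```

Note that the two arguments interact: open-sourcing a systemic-risk model buys no relief at all.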

Warning signs: you may have assumed provider obligations without realising it

  • You put your company name or logo on a third-party AI tool before selling it
  • You changed the intended purpose of a licensed or purchased AI system
  • You made substantial changes to a deployed system's design or training
  • You integrated a GPAI model into a product and sell it under your brand
  • You are a product manufacturer who added an AI safety component (Art. 25)
  • You contractually assumed provider obligations from the original developer

All six situations make you the provider for purposes of Article 16 — even if you did not write the original AI system.

Check whether you are a provider — free
PROPOSAL — not yet enacted law · COM(2025) 836 — Digital Omnibus on AI

What 836 would change for providers — 5 SME/SMC simplifications

If enacted, the following changes reduce the compliance burden for SMEs and a new category of “small mid-cap enterprises” (SMCs). SMC is defined as an enterprise with up to 750 employees and up to €150M annual revenue — bridging the gap between current SME thresholds and large enterprise status.

Simplified technical documentation form for SMCs and SMEs (836 Art 1 pt8 — amends Art 11(1))

The Commission would be required to establish a simplified technical documentation form targeted at SMC and SME needs. Notified bodies would be required to accept this simplified form for conformity assessments. The full Annex IV form remains the default; SMCs and SMEs may opt into the simplified version instead.

Explicit QMS proportionality for SMCs (836 Art 1 pt9 — amends Art 17(2))

Article 17(2) would be updated to read: “proportionate to the size of the provider’s organisation, in particular, if the provider is an SMC or an SME, including a start-up.” Under current law, SMCs are not mentioned — only the general proportionality principle applies.

All SMEs can satisfy QMS requirements in a simplified manner (836 Art 1 pt21 — amends Art 63(1))

Under current law, only micro-enterprises have an explicit simplified QMS route via Article 63. If 836 is enacted, Article 63(1) is replaced to extend this to all SMEs including start-ups. The Commission would develop guidelines specifying which QMS elements can be met in a simplified manner.

Mandatory post-market monitoring template removed (836 Art 1 pt24 — amends Art 72(3))

The Commission was required to adopt an implementing act by 2 February 2026 establishing a harmonised post-market monitoring plan template. Under 836, this obligation is removed and replaced by a requirement to publish guidance instead. Providers gain flexibility in structuring their monitoring documentation.

SMC fine cap extended (836 Art 1 pt29 — amends Art 99(6))

Article 99(6) would be updated to apply the cap (the lower of the percentage or the fixed amount) to SMCs as well as SMEs. Under current law, an SMC with €5M revenue that breaches Chapter III faces up to €15,000,000, the fixed maximum. Under 836, the lower figure would apply instead: 3% of turnover, i.e. €150,000.

Status: COM(2025) 836 is in trilogue as of early 2026. If 836 passes and the Commission does not confirm harmonised standards readiness in time, the Chapter III provider obligations set out above would not apply until 2 December 2027 (Annex III systems) or 2 August 2028 (Annex I Section A systems), unless readiness is confirmed before those backstop dates.

No changes are proposed under COM(2025) 837 for this topic. COM(2025) 836 changes are covered above.

Frequently asked questions

Who counts as an AI 'provider' under the EU AI Act?

A provider is any natural or legal person, public authority, agency or other body that develops an AI system — or has one developed — and places it on the market or puts it into service under their own name or trademark. That covers AI product companies, SaaS vendors, and software firms. It also covers any deployer or third party who (a) puts their name or trademark on a third-party AI system, (b) makes a substantial modification to it, or (c) changes its intended purpose so that new high-risk use cases arise. In all three situations, Article 25 makes that party a provider for the purposes of Article 16, regardless of who originally built the system.

Do I need an accredited notified body for conformity assessment?

It depends on the Annex III category. For Annex III point 1 systems — biometric identification — you need a third-party notified body unless harmonised standards exist and you apply them. For all other Annex III categories (points 2–8: critical infrastructure, education, employment, essential services, law enforcement, migration, administration of justice), you self-assess using the internal control procedure in Annex VI — no notified body is required. For Annex I product-safety systems (medical devices, vehicles, machinery), you follow the conformity assessment required by the relevant sectoral harmonisation legislation.

If I make substantial changes after launch, do provider obligations restart?

Yes. Article 43(4) requires a new conformity assessment whenever a high-risk AI system undergoes a substantial modification — regardless of whether the modified system will be redistributed or stays with the same deployer. One important carve-out: for systems that continue to learn after deployment, changes that were pre-determined by the provider at the time of the initial conformity assessment and documented in the technical documentation are not considered substantial modifications. Only unplanned design changes that alter the system's risk profile trigger the restart.

What is the difference between high-risk provider obligations and GPAI provider obligations?

High-risk AI providers (Chapter III, Arts 9–21, 43, 47–49, 72) build or sell AI systems that interact with people directly in high-risk domains like employment, credit scoring, healthcare, or biometrics. Their obligations include conformity assessment, CE marking, EU database registration, and technical documentation under Annex IV. GPAI providers (Chapter V, Art 53) develop foundation models that others integrate into products. GPAI obligations focus on upstream documentation (Annex XI), downstream information sharing (Annex XII), copyright policy, and publishing a training data summary. GPAI providers do not run conformity assessment, do not affix CE marking, and do not register in the EU database — unless their model is used in a high-risk system and they are also the system provider.

What does COM(2025) 836 change for SME and startup AI providers?

If enacted, COM(2025) 836 introduces four provider-specific reductions for SMEs and a new category of 'small mid-cap enterprises' (SMCs, defined as up to 750 employees and €150M revenue): (1) Article 11 is amended so the Commission must create a simplified technical documentation form for SMCs and SMEs — notified bodies must accept it. (2) Article 17(2) is updated to explicitly require QMS proportionality for SMCs. (3) Article 63(1) is replaced to extend the simplified QMS route to all SMEs — not just micro-enterprises. (4) Article 72(3) is changed so the mandatory post-market monitoring template is removed and replaced by Commission guidance, giving providers more flexibility. Additionally, Article 99(6) is updated so SMCs receive the same fine cap as SMEs: the lower of the percentage or fixed amount applies. These are proposals — not yet enacted law.

Related compliance guides

  • AI Deployer Obligations (Article 26)
  • Is My AI High-Risk? Checklist
  • Conformity Assessment Guide (Article 43)
  • EU AI Act Fines & Penalties (Article 99)
  • EU AI Act Compliance Timeline 2025–2030
  • What Changes Under COM(2025) 836

Find out exactly which provider obligations apply to your AI system

Regumatrix analyses your AI system and returns: risk tier, Annex III classification, every Article 16 obligation that fires, your conformity assessment route (notified body vs self-assessment), fine exposure under Article 99, and an 8-section cited compliance report. Takes about 30 seconds.

Start free — 3 analyses included

No credit card required