
Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026
Chapter IX — Enforcement · 836 — AI Office powers expanded

EU AI Act Market Surveillance: Authorities, Powers, and How to Lodge a Complaint

Enforcement under the EU AI Act is divided between national market surveillance authorities and the AI Office. Who investigates your system depends on the type of AI, how it is used, and which sector it operates in.

Built on the EU product safety enforcement framework

The EU AI Act does not create a wholly new enforcement infrastructure. Art 74(1) applies Regulation (EU) 2019/1020 (the Union Market Surveillance Regulation) to AI systems, treating them as “products” and AI operators as “economic operators” for enforcement purposes. This means the market surveillance powers, mutual assistance procedures, and safeguard mechanisms that apply to physical products also apply to AI systems — adapted for the specific characteristics of AI.

Which authority has jurisdiction over your AI system? Regumatrix identifies your applicable MSA and enforcement path automatically →

Which authority has jurisdiction?

Article 74 designates different market surveillance authorities depending on the AI system's sector and use. Identify all categories that apply to your system.

Art 74(3) · Art 74(6)

High-risk AI in regulated products

Authority: Sectoral market surveillance authority

For AI systems embedded in products covered by Union harmonisation legislation listed in Annex I Section A (e.g., medical devices, machinery, toys), the MSA is whichever authority is responsible for market surveillance under that product regulation. For AI systems used by financial institutions, the national financial supervisory authority acts as MSA.

Art 74(8)

AI used by law enforcement, border management, justice

Authority: Data protection supervisory authority (or equivalent)

Member States must designate as MSA either the competent data protection supervisory authority under GDPR/LED (Directive 2016/680) or any other authority satisfying the independence conditions in Articles 41–44 of the LED. Market surveillance activities must not interfere with judicial authorities acting in their judicial capacity.

Art 74(9)

AI used by EU institutions, bodies, offices, or agencies

Authority: European Data Protection Supervisor (EDPS)

The EDPS acts as market surveillance authority for Union institutions, bodies, offices, and agencies that fall within the regulation's scope — except for the Court of Justice of the European Union when acting in its judicial capacity.

Art 75(1) · Art 88(1)

General-Purpose AI (GPAI) models and systems

Authority: AI Office (via European Commission)

The Commission has exclusive powers to supervise and enforce Chapter V (Arts 53–55, Art 88). Where the GPAI model and the AI system using it are developed by the same provider, the AI Office has full market surveillance authority powers over that high-risk AI system too. National MSAs may request the Commission's assistance; the AI Office must supply relevant information within 30 days.
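The routing rules above can be sketched as a simple lookup. This is a hypothetical illustration only, not an official classification tool — the `SystemContext` categories and authority labels are simplifications of Articles 74–75 and 88 made for this sketch, and a real system may fall into more than one category:

```python
from enum import Enum, auto

class SystemContext(Enum):
    """Broad enforcement contexts drawn from Arts 74-75 and 88 (labels are illustrative)."""
    ANNEX_I_PRODUCT = auto()        # high-risk AI embedded in a regulated product (Art 74(3))
    FINANCIAL_INSTITUTION = auto()  # AI used by a financial institution (Art 74(6))
    LAW_ENFORCEMENT = auto()        # law enforcement, border management, justice (Art 74(8))
    EU_INSTITUTION = auto()         # Union institutions, bodies, offices, agencies (Art 74(9))
    GPAI_SAME_PROVIDER = auto()     # GPAI model and system from the same provider (Art 75(1))

ROUTING = {
    SystemContext.ANNEX_I_PRODUCT: "Sectoral market surveillance authority",
    SystemContext.FINANCIAL_INSTITUTION: "National financial supervisory authority",
    SystemContext.LAW_ENFORCEMENT: "Data protection supervisory authority (or LED-equivalent)",
    SystemContext.EU_INSTITUTION: "European Data Protection Supervisor (EDPS)",
    SystemContext.GPAI_SAME_PROVIDER: "AI Office (European Commission)",
}

def responsible_authority(ctx: SystemContext) -> str:
    """Return the enforcement authority this guide indicates for a given context."""
    return ROUTING[ctx]

print(responsible_authority(SystemContext.GPAI_SAME_PROVIDER))
# AI Office (European Commission)
```

A system embedded in a medical device but built on a third-party GPAI model would hit two rows at once: the sectoral MSA for the application, the AI Office for the model provider.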

Key investigative powers

Art 74(12)

Full access to training and testing data

Market surveillance authorities must be granted full access to documentation and to training, validation, and testing datasets used to develop high-risk AI systems. This includes remote access via APIs or other technical means. Authorities are not required to justify access to data separately from their general market surveillance mandate.

Art 74(13)

Source code access — on reasoned request only

Access to source code is granted only on reasoned request and only where both conditions are met: (a) source code access is necessary to assess conformity under Chapter III Section 2, and (b) testing and auditing based on the documentation and data already provided have been exhausted or proved insufficient. This is a higher threshold than dataset access — it protects proprietary algorithms while still enabling compliance review in genuinely contested cases.

Art 76

Suspension of real-world testing

Market surveillance authorities supervise testing in real-world conditions (regulatory sandboxes and other real-world testing). They may suspend or terminate testing, or require modifications, if a serious incident occurs or if the conditions of Articles 60–61 are not met.

Art 79 · Art 82

Corrective action, withdrawal, and recall

Where an AI system presents a risk to health, safety, or fundamental rights, the authority may require corrective action within 15 working days. If the operator fails to act, the authority can provisionally prohibit or restrict the system, withdraw it from the market, or recall it. Even a compliant AI system may be restricted under Article 82 if it presents a real-world risk despite meeting all formal requirements.
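The 15-working-day response window comes down to simple date arithmetic. A minimal sketch, counting Monday–Friday only — it deliberately ignores public holidays, which vary by Member State, and a real compliance tracker would subtract the relevant national holiday calendar:

```python
from datetime import date, timedelta

def corrective_action_deadline(order_date: date, working_days: int = 15) -> date:
    """Advance `working_days` weekdays, starting the day after the order.

    Public holidays are ignored in this sketch; only weekends are skipped.
    """
    d = order_date
    remaining = working_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return d

# An order received Monday 2 March 2026 -> deadline Monday 23 March 2026
print(corrective_action_deadline(date(2026, 3, 2)))
# 2026-03-23
```

Fifteen working days is three calendar weeks at most — short enough that the internal response procedure should be in place before any order arrives.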

Art 77

Fundamental rights authorities: documentation requests

National public authorities supervising compliance with fundamental rights obligations (including non-discrimination law) may separately request and access documentation created under the EU AI Act when reviewing high-risk AI systems in Annex III. Where documentation is insufficient, they may request the MSA to organise technical testing of the system.

Individual remedy · Art 85

Right to lodge a complaint (Article 85)

Article 85 provides that any natural or legal person who has grounds to consider that the EU AI Act has been infringed may submit a complaint to the relevant market surveillance authority. The exact text:

“Without prejudice to other administrative or judicial remedies, any natural or legal person having grounds to consider that there has been an infringement of the provisions of this Regulation may submit complaints to the relevant market surveillance authority. In accordance with Regulation (EU) 2019/1020, such complaints shall be taken into account for the purpose of conducting market surveillance activities, and shall be handled in line with the dedicated procedures established therefor by the market surveillance authorities.”

  • Any natural person (individual) or legal person (company, NGO, trade association) may complain
  • No requirement to be personally affected — anyone with reasonable grounds to believe a violation occurred can file
  • The authority must take the complaint into account for its market surveillance activities
  • Authorities handle complaints through their own established procedures
  • Article 85 does not create a right to a specific individual remedy — it is a market surveillance trigger
  • Affected individuals also have the separate right to explanation under Article 86

Current law — Chapter V / Chapter IX Section 5 · Art 88

AI Office exclusive enforcement of GPAI obligations

The Commission, acting through the AI Office, has exclusive competence to supervise and enforce Chapter V (General-Purpose AI Models, Arts 53–55) under Article 88(1). This means:

  • Only the AI Office can investigate GPAI model providers for non-compliance with Arts 53–55
  • Downstream providers (application developers) can lodge complaints about GPAI providers with the AI Office (Art 89(2))
  • The AI Office monitors GPAI compliance, including adherence to approved codes of practice (Art 89(1))
  • For systemic risks, the Commission may request providers to implement mitigation measures, or restrict, withdraw, or recall the model (Art 93)
  • Before formal measures, the AI Office may initiate a structured dialogue with the GPAI provider (Art 93(2))
  • National MSAs can request Commission assistance where necessary and proportionate (Art 88(2))

Art 78

Confidentiality of information obtained

All information and data obtained by market surveillance authorities, the Commission, and notified bodies in the course of applying the EU AI Act is subject to strict confidentiality obligations under Article 78. Specifically protected are: intellectual property rights and trade secrets (including source code), national and public security interests, ongoing criminal or administrative proceedings, and information classified under Union or national law. Authorities must put in place adequate cybersecurity measures to protect acquired data and must delete data as soon as it is no longer needed. Special rules apply for law enforcement operational data — including the requirement that technical documentation remains within the premises of law enforcement authorities when they are also providers of high-risk AI systems.

COM(2025) 837 (the Digital Omnibus II regulation) does not amend the market surveillance framework established by Articles 74–84 or the enforcement provisions of Chapter IX. See the 837 overview →

Frequently asked questions

Which authority enforces the EU AI Act for my AI system?

It depends on the type of AI system and how it is used. For high-risk AI systems integrated into regulated products (e.g., medical devices, machinery), the market surveillance authority is the sectoral authority designated under the relevant Union harmonisation legislation. For high-risk AI systems used by financial institutions, it is the national financial supervisory authority. For high-risk AI systems used by law enforcement, border management, or justice authorities, Member States designate the data protection supervisory authority or another authority under conditions matching Directive 2016/680. For GPAI models and systems based on GPAI models developed by the same provider, the AI Office at the Commission has exclusive supervisory and enforcement powers under Article 88.

Can an individual file a complaint about an AI system under the EU AI Act?

Yes. Article 85 of the EU AI Act gives any natural or legal person the right to lodge a complaint with the relevant market surveillance authority if they have grounds to consider that the regulation has been infringed. The complaint must be taken into account for market surveillance activities. Market surveillance authorities handle complaints through their established procedures. Article 85 is separate from the right to explanation under Article 86, which gives affected persons a right to ask for an explanation of the specific decision made about them.

What can market surveillance authorities access from AI providers?

Under Article 74(12), market surveillance authorities are granted full access to documentation as well as training, validation, and testing datasets — including remote access via APIs or other technical means. Under Article 74(13), authorities may also request access to the source code of a high-risk AI system, but only on a reasoned request and only where: (a) source code access is necessary to assess conformity under Chapter III Section 2, and (b) testing and auditing based on data and documentation have been exhausted or proved insufficient. All information obtained is subject to the confidentiality obligations in Article 78.

What is the AI Office's role in EU AI Act enforcement?

The AI Office, established within the European Commission, has exclusive powers to supervise and enforce Chapter V (General-Purpose AI Models) under Article 88. This means only the Commission — acting through the AI Office — can investigate and take enforcement action against GPAI model providers. National market surveillance authorities may request the Commission to exercise those powers where necessary. The AI Office also coordinates joint investigations across Member States (Art 74(11)), provides 30-day information assistance to national authorities investigating high-risk systems built on GPAI models (Art 75(3)), and monitors compliance with approved codes of practice by GPAI providers (Art 89).

What happens if a market surveillance authority finds an AI system non-compliant?

Under Article 79, the authority evaluates the system and, if it finds non-compliance, requires the operator to take corrective action within 15 working days (or sooner under applicable Union harmonisation legislation). If the operator fails to act, the authority can provisionally prohibit or restrict the system, withdraw it from the market, or recall it. These provisional measures are notified to the Commission and other Member States. If no other Member State or the Commission objects within three months (30 days for Article 5 prohibited-practice violations), the measure is deemed justified. Operators retain procedural rights under Article 18 of Regulation (EU) 2019/1020.
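The objection windows in the safeguard procedure reduce to date arithmetic as well. A sketch under stated assumptions: the three-month period is approximated here as the same day of the month three months later, which can fail for month-end dates (e.g. a 31st), and the precise computation of periods under EU law follows Regulation (EEC, Euratom) No 1182/71:

```python
from datetime import date, timedelta

def objection_deadline(notified: date, article_5_violation: bool) -> date:
    """Window for other Member States or the Commission to object to a
    provisional measure: 30 days for Art 5 violations, otherwise three
    months (approximated as the same day-of-month three months on)."""
    if article_5_violation:
        return notified + timedelta(days=30)
    month = notified.month + 3
    year = notified.year + (month - 1) // 12
    month = (month - 1) % 12 + 1
    # Note: raises ValueError for day-of-month values the target month lacks
    return notified.replace(year=year, month=month)

print(objection_deadline(date(2026, 1, 15), article_5_violation=True))   # 2026-02-14
print(objection_deadline(date(2026, 1, 15), article_5_violation=False))  # 2026-04-15
```

The practical point of the two tiers: a measure against a prohibited practice under Article 5 becomes definitively justified six times faster than an ordinary non-compliance measure.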

Key compliance actions for market surveillance readiness

  • Identify which market surveillance authority has jurisdiction over each AI system you deploy or provide
  • Ensure training, validation, and testing datasets are accessible and auditable — authorities have direct access rights
  • Do not rely on source code confidentiality as a barrier to compliance review — MSAs can compel access on reasoned request
  • Establish internal procedures to respond to MSA corrective action orders within 15 working days
  • If your system is built on a GPAI model, the AI Office (not your national MSA) investigates the model provider — but you remain responsible for your application's compliance
  • Document post-market monitoring data under Art 72 — MSAs can request this during investigations

Related guides

EU AI Act Penalties

Art 99 — fine tiers, €35M/7% for Art 5, €15M/3% for high-risk, €7.5M/1.5% for GPAI

Right to Explanation of AI Decisions

Art 86 — individual right to ask deployers how a high-risk AI system shaped a decision

High-Risk AI Checklist

Classify your system against all 8 Annex III domains before MSA contact

AI Provider Obligations

Full Chapter III stack — conformity assessment, CE marking, EU database registration

Conformity Assessment

Internal control (Annex VI) vs notified body (Annex VII) — determine your track

Prohibited AI Practices (Art 5)

The 8 banned practices — Art 5 violations trigger 30-day (not 3-month) Union safeguard procedure

Know your enforcement exposure before an investigation starts

Regumatrix identifies which market surveillance authority governs your AI system, what documentation you must be ready to provide, and whether your system could trigger an Article 85 complaint — so you can address gaps before an investigation opens.

Get started free