Regumatrix — AI compliance powered by Regulation (EU) 2024/1689

This tool is informational only and does not constitute legal advice.

Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026
GPAI — Articles 51–55 · In force since 2 August 2025

EU AI Act: GPAI Model Provider Obligations

General-Purpose AI (GPAI) models — large language models, multimodal foundation models, and other models capable of serving a wide range of tasks — are subject to a dedicated chapter in the EU AI Act. Obligations differ based on whether your model poses systemic risk.

Article 101 penalty — GPAI obligations are already in force

GPAI model obligations under Articles 51–55 applied from 2 August 2025. Non-compliant GPAI providers face penalties of up to €15,000,000 or 3% of global annual turnover, whichever is higher (Article 101). Systemic risk models face additional scrutiny from the AI Office, including adversarial testing requirements and mandatory incident reporting.
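The "whichever is higher" rule means the €15,000,000 floor binds for smaller providers, while 3% of turnover dominates for large ones. A minimal sketch of that arithmetic (the function name and the example turnover figures are illustrative, not from the regulation):

```python
def gpai_fine_ceiling(annual_turnover_eur: float) -> float:
    """Maximum GPAI fine under Article 101: the higher of EUR 15,000,000
    or 3% of total worldwide annual turnover for the preceding year."""
    return max(15_000_000.0, 0.03 * annual_turnover_eur)

# A provider with EUR 2bn turnover: 3% (EUR 60M) exceeds the EUR 15M floor.
print(gpai_fine_ceiling(2_000_000_000))  # 60000000.0

# A provider with EUR 100M turnover: the EUR 15M floor applies instead.
print(gpai_fine_ceiling(100_000_000))    # 15000000.0
```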

Not sure whether your model qualifies as GPAI or triggers systemic risk rules?

Describe your model or the product you are building and Regumatrix checks it against Articles 51–55 and the full regulation — in about 30 seconds. Your first 3 analyses are free.

Check my GPAI model — 3 free analyses included

What is a GPAI model? (Article 3(63))

A model trained on large amounts of data using self-supervision at scale, that displays significant generality and is capable of competently performing a wide range of distinct tasks. This includes GPT-4 class models, Claude, Gemini, Llama, Mistral, and other large foundation models. Crucially, models used only for research and not yet placed on the market are excluded.

Systemic risk threshold

A GPAI model is presumed to have high-impact capabilities — and therefore systemic risk — if the cumulative compute used for its training exceeds 10²⁵ floating-point operations (FLOPs) (Article 51(2)). The Commission may also designate a model as posing systemic risk based on the capability criteria in Annex XIII. Systemic-risk models face significantly elevated obligations under Article 55.
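To gauge where a model sits relative to the threshold, teams often estimate training compute with the common 6 · N · D rule of thumb (6 FLOPs per parameter per training token for dense transformers). This heuristic is a community convention, not part of the regulation, and the parameter/token figures below are illustrative:

```python
SYSTEMIC_RISK_FLOPS = 1e25  # Article 51(2) presumption threshold

def estimated_training_flops(params: float, tokens: float) -> float:
    """Rough dense-transformer training compute via the 6 * N * D
    heuristic (a community rule of thumb, not a regulatory formula)."""
    return 6.0 * params * tokens

def presumed_systemic_risk(params: float, tokens: float) -> bool:
    """True when the estimate exceeds the 10**25 FLOPs presumption."""
    return estimated_training_flops(params, tokens) > SYSTEMIC_RISK_FLOPS

# 70B parameters on 2T tokens: about 8.4e23 FLOPs, below the threshold.
print(presumed_systemic_risk(70e9, 2e12))     # False

# 1.8T parameters on 15T tokens: about 1.6e26 FLOPs, above it.
print(presumed_systemic_risk(1.8e12, 15e12))  # True
```

An estimate near the threshold is exactly the situation where a documented compute assessment matters, since the presumption can be triggered or rebutted on those numbers.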

GPAI provider obligations (Articles 51–55)

Art. 51 — Classify your GPAI model: determine whether it has systemic risk (high-impact capabilities, presumed above the 10²⁵ FLOPs training-compute threshold)
Art. 52 — Procedure: notify the Commission within two weeks of meeting, or learning you will meet, the systemic-risk threshold
Art. 53 — Base obligations: maintain technical documentation, provide information to downstream providers, put in place an EU copyright compliance policy, and publish a summary of training content
Art. 54 — Authorised representative: providers established outside the EU must appoint one in the Union
Art. 55 — Systemic-risk obligations: model evaluation, adversarial testing, risk assessment and mitigation, serious-incident reporting, and cybersecurity measures
Art. 56 — Codes of practice: providers may demonstrate compliance by adhering to Commission-approved codes

Building on a GPAI model? (Downstream provider obligations)

Obtain technical documentation from the GPAI provider before building on their model
Understand which provider obligations flow down to your integration (Art. 25)
If your downstream AI system is high-risk, you remain responsible for Chapter III obligations
Document which model version you used and when

See Article 25 for the allocation of responsibilities along the AI value chain.

GPAI grey areas and common misclassifications

Many teams building on or releasing AI models misjudge whether they count as a GPAI model provider. The classification turns on whether your model displays “significant generality” and is made available to third parties. Check if any of these apply:

  • You fine-tune an open-source GPAI model and release it publicly — you may be a GPAI provider
  • Your API lets developers use your model for tasks you did not explicitly design for
  • You are building on a GPAI model but haven't requested technical documentation from the upstream provider
  • Your model is positioned as a research release but is accessible via public API or download
  • You have not assessed whether your training compute exceeds the 10²⁵ FLOPs systemic risk threshold
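The grey areas above can be treated as a simple self-triage: if any one of them applies, the model deserves a closer look against Articles 51–55. A heuristic sketch (the flag names are hypothetical labels for the bullets above, and a positive result is a prompt for review, not a legal determination):

```python
# Hypothetical flags mirroring the grey-area bullets above.
GREY_AREA_FLAGS = {
    "released_fine_tuned_gpai_publicly": False,
    "api_enables_unintended_tasks": True,
    "missing_upstream_documentation": False,
    "research_release_publicly_accessible": False,
    "training_compute_unassessed": True,
}

def needs_gpai_review(flags: dict) -> bool:
    """Any single grey-area flag warrants review against Articles 51-55."""
    return any(flags.values())

print(needs_gpai_review(GREY_AREA_FLAGS))  # True
```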
Check my GPAI obligations →

Frequently asked questions

When did GPAI obligations under the EU AI Act start?
GPAI model obligations under Articles 51–55 of the EU AI Act apply from 2 August 2025. This includes the requirements to maintain technical documentation, provide information to downstream providers, put in place an EU copyright compliance policy, and publish a summary of training content. All GPAI model providers — whether releasing open-source or proprietary models — must comply.
What triggers the systemic risk rules for GPAI models?
A GPAI model is presumed to have systemic risk if it was trained using more than 10²⁵ FLOPs (floating-point operations). This currently captures only the very largest frontier models; the Commission may amend the threshold by delegated act. Models below the threshold are still GPAI models with base obligations under Article 53, but the additional systemic-risk requirements (adversarial testing, incident reporting, cybersecurity measures) apply only above it.
Does the open-source exemption remove all GPAI obligations?
No. The exemption in Article 53(2) — for models released under a free and open-source licence with publicly available parameters — lifts only the technical documentation and downstream information duties (Article 53(1)(a) and (b)). Open-source providers must still put in place an EU copyright compliance policy and publish a training-content summary (Article 53(1)(c) and (d)). And open-source models with systemic risk (above 10²⁵ FLOPs) are NOT exempt at all: they face the full systemic-risk obligations despite being open-source.

Related compliance guides

  • Is my AI high-risk? (Checklist)
  • All enforcement deadlines
  • Fines & penalties (Article 99)
  • Healthcare AI obligations
  • HR & Recruitment AI guide
  • Financial services AI guide

Know your exact GPAI obligations — obligations are in force now

GPAI obligations under Articles 51–55 are already in force — if you are releasing or have already released a general-purpose AI model, you need to know whether you are compliant today, not in 2026.

Describe your model or product in plain language. Regumatrix checks it against every article of the EU AI Act and returns your risk tier, GPAI classification, the exact obligations that apply, and your fine exposure under Article 101. Eight sections. About 30 seconds.

Analyse my GPAI model free →
All compliance guides

8-section report · Article citations · ~30 seconds · No credit card