General-Purpose AI (GPAI) models — large language models, multimodal foundation models, and other models capable of serving a wide range of tasks — are subject to a dedicated chapter in the EU AI Act. Obligations differ based on whether your model poses systemic risk.
Article 101 penalty — GPAI obligations are already in force
GPAI model obligations under Articles 51–55 applied from 2 August 2025. Non-compliant GPAI providers face penalties of up to €15,000,000 or 3% of global annual turnover, whichever is higher (Article 101). Systemic risk models face additional scrutiny from the AI Office, including adversarial testing requirements and mandatory incident reporting.
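The "whichever is higher" cap can be sketched in a few lines. This is an illustrative calculation only, not legal advice; the function name and turnover figures are hypothetical:

```python
def gpai_fine_cap(global_annual_turnover_eur: float) -> float:
    """Maximum administrative fine for a GPAI provider under Article 101:
    the higher of EUR 15,000,000 or 3% of total worldwide annual turnover.
    Illustrative sketch only -- actual fines depend on the infringement."""
    return max(15_000_000, 0.03 * global_annual_turnover_eur)

# Hypothetical provider with EUR 2bn turnover: 3% (EUR 60m) exceeds the floor
print(gpai_fine_cap(2_000_000_000))   # 60000000.0
# Hypothetical smaller provider: the EUR 15m floor applies instead
print(gpai_fine_cap(100_000_000))     # 15000000
```

For smaller providers the fixed €15,000,000 floor dominates; the 3% branch only takes over above €500m in global annual turnover.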
Not sure whether your model qualifies as GPAI or triggers systemic risk rules?
Describe your model or the product you are building and Regumatrix checks it against Articles 51–55 and the full regulation — in about 30 seconds. Your first 3 analyses are free.
Check my GPAI model — 3 free analyses included

A GPAI model is one trained on large amounts of data using self-supervision at scale that displays significant generality and is capable of competently performing a wide range of distinct tasks. This includes GPT-4 class models, Claude, Gemini, Llama, Mistral, and other large foundation models. Crucially, models used only for research and development and not yet placed on the market are excluded.
A GPAI model is presumed to pose systemic risk if the cumulative compute used for its training exceeds 10²⁵ floating-point operations (FLOPs) (Article 51(2)). The AI Office may also designate models as posing systemic risk based on capability evaluations. Systemic-risk models face significantly elevated obligations under Article 55.
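The 10²⁵ FLOPs presumption can be checked against a rough training-compute estimate. The ≈6 × parameters × tokens heuristic below is a widely used rule of thumb, not a method prescribed by the Act, and the model sizes are hypothetical:

```python
THRESHOLD_FLOPS = 1e25  # systemic-risk presumption threshold in the Act

def estimated_training_flops(n_params: float, n_tokens: float) -> float:
    # Rule-of-thumb estimate: roughly 6 FLOPs per parameter per training
    # token (assumption -- the Act does not prescribe an estimation method).
    return 6 * n_params * n_tokens

def presumed_systemic_risk(n_params: float, n_tokens: float) -> bool:
    return estimated_training_flops(n_params, n_tokens) > THRESHOLD_FLOPS

# Hypothetical 70B-parameter model trained on 15T tokens:
# 6 * 7e10 * 1.5e13 = 6.3e24 FLOPs, below the 1e25 threshold
print(presumed_systemic_risk(70e9, 15e12))   # False
```

Note that falling under the threshold only removes the presumption; the AI Office can still designate a model as systemic-risk on capability grounds.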
See Article 25 for how responsibilities are allocated along the provider chain.
Many teams building on or releasing AI models fail to recognise that they qualify as a GPAI model provider. The classification turns on whether your model displays “significant generality” and is made available to third parties. Check if any of these apply:
GPAI obligations under Articles 51–55 are already in force — if you are releasing or have already released a general-purpose AI model, you need to know whether you are compliant today, not in 2026.
Describe your model or product in plain language. Regumatrix checks it against every article of the EU AI Act and returns your risk tier, GPAI classification, the exact obligations that apply, and your fine exposure under Article 101. Eight sections. About 30 seconds.
8-section report · Article citations · ~30 seconds · No credit card