Regumatrix — AI compliance powered by Regulation (EU) 2024/1689

This tool is informational only and does not constitute legal advice.

Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026
Deployer obligation · Public bodies + certain private deployers

Fundamental Rights Impact Assessment (FRIA)

Before deploying a high-risk AI system, public authorities and certain private deployers must assess how it could affect the fundamental rights of the people it will impact — and report to a market surveillance authority.

Why this matters

The FRIA is not advisory — it is a deployer obligation under Art 27. Skipping it, or completing it after deployment, is a violation of the AI Act. National competent authorities have broad enforcement powers under Art 99 and can require compliance actions, impose penalties, or halt deployments.

The FRIA must also be notified to the market surveillance authority after completion, using the questionnaire template developed by the AI Office (Art 27(3)) — it is not a purely internal document.

Not sure whether your deployment triggers a FRIA?

Regumatrix confirms your deployer category, identifies every Article 26 and Article 27 obligation that fires for your system, and gives you a full cited report in about 30 seconds.

Check in 30 seconds →

Who must conduct an FRIA

Article 27(1) sets three specific deployer categories. If you fall outside all three, the FRIA obligation under Article 27 does not apply to you.

Category 1

Bodies governed by public law — government agencies, public institutions, and similar entities.

Category 2

Private entities providing public services — such as utility operators or transport providers delivering services under a public mandate.

Category 3

Deployers of Annex III §5(b) or §5(c) systems — credit-scoring AI (5(b)) and life and health insurance risk-assessment and pricing AI (5(c)) — regardless of whether they are public or private bodies.

Exception: High-risk AI systems in Annex III point 2 (critical infrastructure) are excluded from the Article 27 FRIA obligation.
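Scoping under Article 27(1) comes down to a short decision: does the deployer fall in one of the three categories, and is the system outside the point 2 exclusion? Below is a minimal Python sketch of that logic. The `Deployment` fields and the `fria_required` function are our own illustrative names, not any official tool, and real scoping still needs legal review.

```python
from dataclasses import dataclass

@dataclass
class Deployment:
    """Facts about a planned high-risk AI deployment (illustrative fields, our naming)."""
    is_public_law_body: bool       # Category 1: body governed by public law
    provides_public_service: bool  # Category 2: private entity providing public services
    annex_iii_point: str           # Annex III point, e.g. "2", "4", "5(b)", "5(c)"

def fria_required(d: Deployment) -> bool:
    """Rough Art 27(1) scoping: does the FRIA obligation fire for this deployer?"""
    if d.annex_iii_point == "2":   # Annex III point 2 systems are excluded from Art 27
        return False
    return (
        d.is_public_law_body                      # Category 1
        or d.provides_public_service              # Category 2
        or d.annex_iii_point in ("5(b)", "5(c)")  # Category 3: credit scoring / insurance
    )

# A private bank deploying credit-scoring AI (Annex III 5(b)) falls in Category 3:
print(fria_required(Deployment(False, False, "5(b)")))  # True
```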

The 6 elements an FRIA must cover

Article 27(1)(a)–(f) specifies exactly what the assessment must address. Every element is mandatory.

a

Your processes and intended purpose

Describe the deployer's processes in which the high-risk AI system will be used, in line with its intended purpose as defined by the provider.

b

Duration and frequency of use

State the period of time within which, and the frequency with which, the system is intended to be used in those processes.

c

Categories of affected persons

Identify the categories of natural persons and groups likely to be affected by the system in your specific deployment context.

d

Specific risks of harm

Assess the specific risks of harm likely to affect the identified categories of persons — drawing on the information the provider supplied under Article 13.

e

Human oversight implementation

Describe how you will implement the human oversight measures specified in the provider's instructions for use.

f

Measures if risks materialise

Set out the measures to be taken if the assessed risks actually occur — including internal governance arrangements and accessible complaint mechanisms.
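Because every one of the six elements is mandatory, coverage tracking reduces to a set-difference check. Here is a minimal sketch; the `FRIA_ELEMENTS` mapping and `missing_elements` helper are our own shorthand for internal tracking, not the AI Office template.

```python
# Art 27(1)(a)-(f): the six mandatory elements, paraphrased
FRIA_ELEMENTS = {
    "a": "deployer processes and intended purpose",
    "b": "period and frequency of intended use",
    "c": "categories of persons and groups likely affected",
    "d": "specific risks of harm to those categories",
    "e": "implementation of human oversight measures",
    "f": "measures if risks materialise",
}

def missing_elements(covered: set[str]) -> list[str]:
    """List the Art 27(1) points a draft assessment has not yet addressed."""
    return sorted(set(FRIA_ELEMENTS) - covered)

print(missing_elements({"a", "b", "c", "e"}))  # ['d', 'f']
```

A non-empty result means the assessment is incomplete as a whole, since partial coverage does not satisfy Article 27(1).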

Key rules on timing and notification

First-use trigger

The FRIA obligation applies to the first use of the system (Art 27(2)). For similar subsequent uses of the same system, you may rely on the existing assessment — provided you update it if anything material changes.

Notify the authority

After completing the FRIA, submit the filled-out AI Office template to your market surveillance authority (Art 27(3)). Some deployers in Article 46(1) scenarios are exempt from notification — check if that exemption applies to your context.

FRIA complements a DPIA — never replaces it

If a GDPR DPIA already covers some of the same elements, your FRIA must complement that assessment, not substitute it (Art 27(4)). The two cover different ground: a DPIA is about personal data risks; an FRIA covers all fundamental rights.

AI Office template

The AI Office is developing a questionnaire template — including through an automated tool — to make the FRIA process easier (Art 27(5)). Use that template once available; it is the format market surveillance authorities expect to receive.

Can you rely on the provider's assessment?

Article 27(2) allows deployers to rely on FRIAs or impact assessments previously conducted by the provider, in similar cases. However, this has limits:

  • The obligation remains yours as deployer — you cannot simply forward the provider's document to the authority and treat the requirement as met.
  • The provider's assessment must address your specific deployment context, affected population, and the particular processes you will use the system for. A generic assessment from the provider rarely covers these adequately.
  • Any elements the provider's assessment does not cover must be completed by you.

FRIA results feed into EU database registration

Section C of Annex VIII requires public authority deployers to include a summary of their FRIA findings when registering their deployment in the EU database under Art 49(3). Completing the FRIA therefore directly affects your registration obligations: you cannot submit a compliant database entry without a completed assessment.

No changes are proposed under COM(2025) 836 or COM(2025) 837 for Article 27 of the EU AI Act.

Signs your FRIA may be incomplete or non-compliant

  • The assessment was done after the system went live rather than before the first use
  • The assessment was not submitted to the market surveillance authority using the AI Office template (Art 27(3))
  • The assessment relies entirely on the provider's generic documentation without adapting it to your specific deployment context
  • The six elements in Art 27(1)(a)–(f) were not all addressed — gaps in coverage render the whole assessment incomplete
  • The GDPR DPIA was treated as fully covering the FRIA — it does not (Art 27(4) requires them to be complementary)
  • The assessment was never updated after the deployment changed significantly
Check your deployer obligations →

Frequently asked questions

Does every deployer of a high-risk AI system have to do an FRIA?

No. Article 27(1) limits the obligation to three categories: bodies governed by public law, private entities providing public services, and deployers of Annex III §5(b) or §5(c) systems (credit scoring and insurance AI). Private companies deploying high-risk AI for purely internal or commercial purposes outside those categories are not covered. High-risk systems in Annex III point 2 (critical infrastructure) are also excluded from Article 27.

How often must the FRIA be conducted?

The FRIA obligation is triggered by the first use of the system (Article 27(2)). The deployer may rely on a previously conducted FRIA for similar uses of the same system. However, if any element covered by the assessment changes or becomes outdated during ongoing operation, the deployer must update it. It is neither a one-off nor a fixed-interval exercise: the assessment must be kept current as the deployment evolves.

Do I have to submit the FRIA to any authority?

Yes. Under Article 27(3), after completing the FRIA a deployer must notify the market surveillance authority of its results, submitting the filled-out template developed by the AI Office. In some circumstances outlined in Article 46(1), deployers may be exempt from this notification obligation.

Can a DPIA replace the FRIA?

No. Article 27(4) is explicit: if a GDPR data protection impact assessment (DPIA) already covers some of the same ground as the FRIA, the FRIA must complement it — not replace it. The two assessments serve different purposes. A DPIA focuses on personal data processing risks; the FRIA covers the broader impact on all fundamental rights, including non-data rights like non-discrimination, dignity and access to justice.

Can a deployer use the provider's assessment instead of conducting its own?

Partly. Article 27(2) says the deployer 'may rely on previously conducted fundamental rights impact assessments or existing impact assessments carried out by provider' in similar cases. However, the deployer cannot simply hand over the provider's assessment and call it done — the obligation is the deployer's own, and they must ensure it reflects their specific deployment context, affected population, and risk profile.

Related guides

AI Deployer Obligations

Full Article 26 checklist — the FRIA sits alongside these obligations

EU Database Registration

FRIA results are required in your Section C database entry

Human Oversight

Element (e) of the FRIA — how you implement the oversight measures

AI Act + GDPR

The relationship between FRIA and DPIA — where they overlap and where they don't

High-Risk AI Checklist

Confirm your system falls in an Annex III category before planning the FRIA

EU AI Act for SMEs

SME and SMC proportionality rules across all high-risk obligations

Know exactly which deployer obligations apply to you

Regumatrix checks your system against every Annex III domain, confirms whether you fall into an FRIA-triggering deployer category, lists every obligation under Articles 26 and 27, and gives you your fine exposure under Article 99 — in a cited, 8-section report in about 30 seconds. No credit card required.

Run your free analysis →