Regumatrix — AI compliance powered by Regulation (EU) 2024/1689

This tool is informational only and does not constitute legal advice.

Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026
  • Annex I §20 — unmanned aircraft (EASA) only
  • Annex I §13 — civil aviation security
  • €15M / 3% high-risk penalty
  • Article 6(1) pathway
  • 2 Aug 2027 deadline — 16 months away

EU AI Act for Aviation AI

Aviation AI in the EU AI Act is covered by two Annex I entries: §20 (Regulation (EU) 2018/1139 — EASA Basic Regulation, scoped to unmanned aircraft only) and §13 (Regulation (EC) 300/2008 — civil aviation security). AI used as a safety component in EASA type-certified drones, or in civil aviation security screening systems, is high-risk under Article 6(1). Critically, manned commercial aviation AI — autopilot, TCAS, terrain avoidance — is outside the Annex I §20 scope. COM(2025) 836 proposes coordinating EASA with the AI Act and extending the deadline to 2 August 2028.

Critical scope limitation: Annex I §20 covers DRONES ONLY

Annex I §20 of the EU AI Act expressly limits its scope to Regulation (EU) 2018/1139 “where it concerns unmanned aircraft and their engines, propellers, parts and non-installed equipment to control them remotely”. AI used in manned commercial aviation (autopilot, collision avoidance, terrain awareness, automated landing systems in commercial aircraft) is not within Art 6(1) by way of Annex I §20. Such AI may be subject to EASA’s AI roadmap, specific EASA certification specifications, or other EU AI Act obligations — but not the high-risk classification via this pathway.

High-risk classification: full Chapter III obligations apply

Art 6(1) classification triggers the Art 99(4) penalty for violations: up to €15 million or 3% of global annual turnover. Non-compliance with EASA certification requirements — now incorporating AI Act dimensions — also has separate consequences under EU aviation law.

Does your aviation AI system trigger Art 6(1)?

Regumatrix analyses your system description against all AI Act classification pathways — including Art 6(1) Annex I §§13 and 20 — and produces a cited compliance report with your exact obligations, conformity route, and fine exposure in about 30 seconds.

Check in 30 seconds — 3 free analyses

Article 6(1) applied to aviation — two Annex I entries

The AI Act’s Article 6(1) high-risk classification applies when two cumulative conditions are met. Both entries relevant to aviation AI sit in Annex I.

Article 6(1) — HIGH-RISK (both conditions required)

(a) The AI system is intended to be used as a safety component of a product — or is itself the product — covered by Union harmonisation legislation in Annex I; AND

(b) that product is required to undergo third-party conformity assessment under the relevant Annex I legislation (e.g., EASA type certification, CAA type certificate equivalent, or civil aviation security certification).
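Read mechanically, the test is a logical AND: failing either condition means the Annex I pathway does not apply. A minimal Python sketch of the cumulative check (the function name and boolean inputs are our own illustration, not official tooling):

```python
def is_high_risk_art_6_1(is_safety_component: bool,
                         requires_third_party_assessment: bool) -> bool:
    """Art 6(1) high-risk test: BOTH conditions must hold.

    (a) the AI system is a safety component of (or is itself) a product
        covered by Annex I harmonisation legislation; AND
    (b) that product must undergo third-party conformity assessment
        under the relevant Annex I legislation.
    """
    return is_safety_component and requires_third_party_assessment


# Detect-and-avoid AI in a certified-category drone: both conditions met.
print(is_high_risk_art_6_1(True, True))   # True — high-risk
# Open-category drone AI: product rules apply, but self-declaration suffices.
print(is_high_risk_art_6_1(True, False))  # False — not high-risk via Annex I
```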

Annex I §20 — Regulation (EU) 2018/1139 (EASA Basic Regulation)

Covers the regulatory framework for civil aviation safety in the EU. For AI Act purposes, the scope is limited to the design, production and placing on the market of:

  • Unmanned aircraft (drones/UAVs) referred to in Article 2(1)(a) and 2(1)(b) of Regulation 2018/1139
  • Their engines, propellers, parts
  • Equipment used to control them remotely

The primary implementing regulations are Reg (EU) 2019/945 (design, production and placing on market of UAS) and Reg (EU) 2019/947 (operational rules for UAS). Both feed into the 2018/1139 framework.

Annex I §13 — Regulation (EC) No 300/2008 (Civil Aviation Security)

Covers the EU framework for civil aviation security including airport security screening, access control, cargo security, and identity verification. AI used as a safety component in security screening equipment at airports — automated explosive-detection, biometric access control, threat-object recognition at security checkpoints — falls within this entry where the product requires third-party assessment under the relevant sectoral act.

EASA UAS categories: which drone AI systems trigger Art 6(1)?

EASA divides UAS operations into three categories. Whether Art 6(1) is triggered depends on the category, as it determines whether third-party conformity assessment — the second Art 6(1) condition — is required.

1. Open Category — Art 6(1) does NOT trigger

Low-risk operations. Product requirements apply (Reg 2019/945 classes C0–C6) but no EASA type certificate or national CAA third-party conformity assessment is required. Art 6(1) condition (b) is not met — self-declaration of conformity is sufficient. AI in open-category drones is not high-risk via the Annex I §20 pathway.

2. Specific Category — situational; requires case analysis

Medium-risk operations requiring authorisation via a standard scenario (STS), a pre-determined risk assessment (PDRA), or a full specific operations risk assessment (SORA). Some specific-category operations require a third-party national CAA assessment. Where this applies to an AI safety component, Art 6(1) may be triggered. Whether a particular specific-category drone triggers Art 6(1) requires analysis of the specific authorisation route — it is not automatic.

3. Certified Category — Art 6(1) DOES trigger

High-risk operations requiring an EASA type certificate or equivalent (operations over people, in controlled airspace, cargo transport, urban air mobility). The EASA type certification procedure is the mandatory third-party conformity assessment. AI safety components in certified-category UAS clearly satisfy both Art 6(1) conditions and are classified high-risk. Providers must comply with Arts 9–15 (Chapter III Section 2) from 2 August 2027.
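The three-category logic above can be condensed into a small lookup. This is a hypothetical helper for illustration only; the category strings and outcome wording are our own shorthand for the analysis, not an official classification API:

```python
# Illustrative mapping of EASA UAS operation categories to the Art 6(1)
# outcome discussed above; the "specific" category always needs per-case review.
UAS_CATEGORY_OUTCOME = {
    "open": "not high-risk: no third-party assessment, Art 6(1)(b) unmet",
    "specific": "case-by-case: depends on authorisation route (STS/PDRA/SORA)",
    "certified": "high-risk: EASA type certification satisfies Art 6(1)(b)",
}


def classify_drone_ai(category: str) -> str:
    """Return the indicative Art 6(1) outcome for an EASA UAS category."""
    try:
        return UAS_CATEGORY_OUTCOME[category.lower()]
    except KeyError:
        raise ValueError(f"unknown EASA category: {category!r}")


print(classify_drone_ai("certified"))
# high-risk: EASA type certification satisfies Art 6(1)(b)
```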

Civil aviation security AI — Annex I §13

Regulation (EC) No 300/2008 establishes common rules for civil aviation security across EU airports. AI systems used as safety components in aviation security screening and detection functions are high-risk where the product requires third-party assessment.

Airport AI in scope (§13 pathway)

  • Automated threat detection AI in X-ray screening systems
  • AI-based explosive detection systems (EDS) at passenger checkpoints and cargo
  • Biometric AI for airside access control (identity verification)
  • AI-based body scanner image analysis (WTMD/BIS)
  • Prohibited-item recognition AI in hold-baggage screening

Aviation AI outside §13 scope

  • Passenger experience AI (chatbots, recommendation engines)
  • Airport operations AI (slot management, scheduling)
  • Ground handling AI not in security screening function
  • Revenue management or pricing AI
  • CCTV AI with general situational awareness only (not security screening)

Conformity assessment: EASA certification + AI Act obligations

Like automotive systems, aviation systems fall under Annex I Section B — the category of EU harmonisation legislation not covered by Article 43(3)’s explicit sectoral notified-body routing. The EASA type certificate procedure (for certified-category UAS) or the relevant security certification process (for §13 systems) serves as the conformity assessment route, incorporating the AI Act Chapter III Section 2 requirements.

Section B distinction — why Art 43(3) does not apply

Article 43(3) explicitly routes AI Act compliance through sectoral notified bodies for Section A products only (medical devices, machinery, pressure equipment, etc.). Aviation, automotive, marine, and rail fall under Section B. No equivalent provision routes Section B conformity through a notified body in the current AI Act text. Instead, the AI Act Chapter III Section 2 obligations (Arts 9–15) apply directly to the AI system and must be satisfied as part of or alongside the EASA type-approval or sectoral certification procedure.

Chapter III Section 2 obligations for aviation AI providers

  • Art 9 — Risk management system: including drone malfunction scenarios, loss-of-link failure modes, and adversarial input scenarios (e.g., GPS spoofing attacks on navigation AI).
  • Art 10 — Data governance: training data for perception/navigation AI must be diverse across environments (urban/rural, weather conditions, lighting), representing the full operational domain.
  • Art 11 — Technical documentation (Annex IV): can be integrated into the EASA type certificate technical file.
  • Art 12 — Logging: automatic event recording for post-incident analysis; important for accident investigation.
  • Art 13 — Transparency: instructions for use covering operational domain limitations, weather/environment constraints, and accuracy metrics for AI-dependent navigation or detect-and-avoid.
  • Art 14 — Human oversight: remote pilot override capabilities, manual intervention protocols, and alert systems when AI confidence is low.
  • Art 15 — Accuracy, robustness, cybersecurity: resilience against adversarial interference (jamming, spoofing) and consistent performance across the declared operational design domain.

Who is the “provider” for aviation AI?

1. UAS / drone manufacturer (certified category)

The entity holding (or applying for) the EASA type certificate for a certified-category UAS and placing it on the EU market under their name or trademark is the AI Act provider for the AI safety components of that drone.

2. AI safety component supplier (e.g., detect-and-avoid AI supplier)

Where an AI subsystem — such as a detect-and-avoid (DAA) algorithm or an autonomous navigation AI — is supplied as a standalone component by a specialist supplier, and that supplier places it on the market under their own brand, the supplier is the AI Act provider for that component.

3. Airport security system manufacturer / integrator (§13)

The entity developing and marketing the AI-based security screening system — whether the OEM or a system integrator assembling third-party AI into a branded product — is the provider under Article 25(1)(a). Note: like automotive (Section B), Article 25(3)’s automatic provider-designation for product manufacturers applies only to Section A products.

Map your aviation AI obligations in 30 seconds

Regumatrix generates a full cited compliance analysis — risk classification determination, obligation chain, conformity route, fine exposure — for drone AI, airport security AI, or any aviation system. 3 analyses free, no credit card required.

Generate your compliance report
PROPOSAL — not yet enacted law

COM(2025) 836: EASA coordination amendments + 2028 deadline

The Digital Omnibus Simplification Regulation, COM(2025) 836 (Article 2 of the proposal), would amend Regulation (EU) 2018/1139 directly — ensuring EASA technical standards account for AI Act requirements. It would also extend the compliance deadline and introduce a real-world testing route for Section B products.

7 EASA amendments (836 Art 2) — AI Act awareness

836 Art 2 adds new paragraphs to Articles 27(3), 31(3), 32(3), 36(3), 39(3), 50(3) and 53(3) of Regulation 2018/1139. Each amendment states that when EASA adopts implementing or delegated acts under the relevant article, it must “take into account” the requirements of Chapter III Section 2 of the EU AI Act where those acts cover AI systems that may constitute safety components of unmanned aircraft products.

This creates a coordinated framework so that EASA certification specifications for drone AI formally incorporate the AI Act’s technical requirements (risk management, data governance, accuracy, robustness) — reducing fragmentation between aviation certification and AI Act compliance.

Amended Art 113(d) — Fallback deadline: 2 August 2028

836 Art 1(31) proposes adding point (d) to Article 113: Chapter III obligations for Annex I Article 6(1) systems apply 12 months after a Commission decision confirming adequate compliance support is available. Hard fallback: 2 August 2028 — 28 months away. This applies to all Annex I Section B products including aviation.
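Under this proposed timing rule, the effective deadline would be the earlier of the hard fallback and twelve months after a Commission readiness decision. A Python sketch of that reading (the function name and naive year arithmetic are our own illustration of the proposal text, not legal advice):

```python
from datetime import date
from typing import Optional

CURRENT_DEADLINE = date(2027, 8, 2)   # Art 113(c), the enacted deadline
PROPOSED_FALLBACK = date(2028, 8, 2)  # hard fallback in proposed Art 113(d)


def proposed_section_b_deadline(commission_decision: Optional[date]) -> date:
    """Illustrative reading of proposed Art 113(d): Chapter III obligations
    apply 12 months after a Commission readiness decision, but no later than
    the 2 August 2028 fallback. Naive year+1 arithmetic (ignores 29 Feb)."""
    if commission_decision is None:
        return PROPOSED_FALLBACK
    one_year_later = commission_decision.replace(year=commission_decision.year + 1)
    return min(one_year_later, PROPOSED_FALLBACK)


# No readiness decision: the fallback date governs.
print(proposed_section_b_deadline(None))               # 2028-08-02
# Hypothetical decision in October 2026: obligations bite in October 2027.
print(proposed_section_b_deadline(date(2026, 10, 1)))  # 2027-10-01
```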

New Art 60a — Real-world testing for Section B products

836 proposes a new Article 60a providing a dedicated real-world testing route for Annex I Section B products (including EASA-certified drone AI) via voluntary written agreements between Member States and the Commission. This supplements the existing Art 57–63 regulatory sandbox framework with a Section B-specific pathway particularly suited to aviation testing scenarios.

Annex VIII Section B deleted — Documentation simplification

836 Art 1(32) deletes Section B of Annex VIII (the mandatory harmonised post-market monitoring documentation template for Section B AI systems). This reduces documentation overhead for aviation AI providers by removing the templated compliance log format requirement specific to Section B.

COM(2025) 836 is a Commission proposal pending European Parliament and Council adoption. It is not yet in force. Plan for 2 August 2027 as the working deadline and treat 2028 as conditional on this legislation progressing.

Frequently asked questions

Does the EU AI Act apply to AI in commercial aircraft (autopilot, TCAS, terrain avoidance)?

Not under the Article 6(1) Annex I §20 pathway. Annex I §20 is expressly scoped to Regulation (EU) 2018/1139 only 'as far as the design, production and placing on the market of aircraft referred to in Article 2(1)(a) and (b) of that regulation are concerned, where it concerns unmanned aircraft and their engines, propellers, parts and non-installed equipment to control them remotely'. Manned commercial aircraft, autopilot systems, TCAS, and terrain avoidance warning systems in commercial aviation are outside this Annex I pathway. They may, however, be subject to other AI Act obligations (e.g., as general-purpose AI or under Article 50 transparency requirements) or EU-level aviation AI regulatory guidance under EASA's separate authority.

Which drone categories trigger the EU AI Act Article 6(1) classification?

EASA categorises UAS operations into three tiers: (1) Open category — low-risk operations subject to product requirements in Reg 2019/945 but no EASA type certification required; AI in open-category drones does not meet Art 6(1)(b) (no mandatory third-party assessment). (2) Specific category — medium-risk requiring a national CAA authorisation (PDRA or SORA); some specific-category operations require national-body third-party assessment — if so, AI safety components may trigger Art 6(1). (3) Certified category — high-risk operations requiring EASA type certificate equivalent to manned aviation; AI safety components in certified-category UAS clearly trigger Art 6(1) as both conditions (safety component in Annex I product + mandatory EASA third-party type certification) are met.

What are Annex I §13 and §20 — the two aviation entries in the EU AI Act?

Annex I §13 covers Regulation (EC) No 300/2008 on civil aviation security — the EU framework for airport security screening, access control, and threat detection. AI used as a safety component in systems covered by this regulation (e.g., automated threat detection at security checkpoints) falls within Art 6(1). Annex I §20 covers Regulation (EU) 2018/1139 (the EASA Basic Regulation) but only to the extent it concerns unmanned aircraft. This Regulation establishes the EASA framework for drone design, production, and placing on the market, supported by subsidiary UAS regulations Reg 2019/945 (product requirements) and Reg 2019/947 (operational rules).

Does COM(2025) 836 change EASA's approach to AI Act compliance?

Yes. COM(2025) 836 (Art 2 of the Omnibus proposal) proposes 7 coordinating amendments to Regulation (EU) 2018/1139, adding new paragraphs to Articles 27(3), 31(3), 32(3), 36(3), 39(3), 50(3) and 53(3). Each amendment states that when EASA adopts implementing or delegated acts pursuant to the relevant article, it must 'take into account' the requirements of Chapter III Section 2 of the EU AI Act when those acts cover AI systems that may constitute safety components of unmanned aircraft products. This ensures EASA technical standards and certification specifications for drone AI formally account for AI Act requirements — creating a joined-up framework. Additionally, 836 proposes a new Article 60a for real-world testing of Annex I Section B products, and amends Art 113 to provide a 2 August 2028 fallback compliance deadline. COM(2025) 836 is a proposal not yet in force.

Who is the provider of aviation AI under the EU AI Act?

For drone AI: the UAS manufacturer that places the drone (or the AI safety component within it) on the EU market under their own name or trademark is the provider. For a certified-category UAS, this is typically the entity holding — or applying for — the EASA type certificate or equivalent. For airport security AI (Annex I §13): the entity that develops and places the AI-based security screening or threat detection system on the market. A system integrator that assembles third-party AI components into a finished security product under their own brand will typically be the provider under Article 25(1)(a) of the AI Act.

What is the compliance deadline for aviation AI under the EU AI Act?

Article 113(c) of the EU AI Act (current law) sets the Art 6(1) compliance deadline at 2 August 2027. COM(2025) 836 proposes amending Art 113 to add a point (d): for Annex I Article 6(1) systems, Chapter III obligations apply 12 months after a Commission decision confirming adequate compliance support (harmonised standards, specifications, guidelines) is available. The hard fallback is 2 August 2028. Until COM(2025) 836 is enacted, the definitive deadline remains 2 August 2027. Plan accordingly and treat 2028 as a potential extension contingent on legislative progress.

Related compliance guides

Automotive & Vehicle AI

Art 6(1) Annex I §§18–19 — Reg 2018/858 and GSR 2019/2144.

Medical Device AI (SaMD)

Art 6(1) Annex I §§11–12 — Section A MDR/IVDR dual pathway.

Conformity Assessment Guide

Self-assessment vs notified body — Art 43 full breakdown.

Is My AI High-Risk?

All Art 6(1) + Annex III pathways — one-page decision guide.

Technical Documentation (Annex IV)

What must go in your AI Act technical file.

COM(2025) 836 — Full Overview

All 33 AI Act amendments and 7 EASA amendments in the Omnibus.