Regumatrix — AI compliance powered by Regulation (EU) 2024/1689

This tool is informational only and does not constitute legal advice.

Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026
Annex I §§18–19 — Reg 2018/858 + Reg 2019/2144 · €15M / 3% high-risk penalty · Article 6(1) pathway · 2 Aug 2027 deadline — 16 months away

EU AI Act for Automotive & Vehicle AI

AI used as a safety component in a motor vehicle regulated under the Motor Vehicle Type-Approval Regulation (Reg 2018/858) or the General Safety Regulation (GSR, Reg 2019/2144) is high-risk under Article 6(1) of the EU AI Act. This covers ADAS systems mandated by the GSR — AEB, lane-keep assist, ISA, driver monitoring, and more. Conformity assessment follows the EU vehicle type-approval route. COM(2025) 836 proposes extending the compliance deadline to 2 August 2028.

Art 6(1) classification: full Chapter III obligations from 2 August 2027

Violations of high-risk AI obligations under Chapter III carry the Art 99(4) penalty: up to €15 million or 3% of global annual turnover, whichever is higher. Non-compliance with the vehicle type-approval conformity assessment — including its AI Act dimensions — also constitutes a breach of Regulations 2018/858 and 2019/2144 with separate enforcement consequences.

Is your automotive AI high-risk under Art 6(1)?

Regumatrix maps your system description against all classification rules — including the Art 6(1) Annex I pathway for automotive safety components — and identifies your exact conformity route, obligation chain, and fine exposure in a cited 8-section report in about 30 seconds.

Check in 30 seconds — 3 free analyses

The Article 6(1) pathway — how it applies to vehicle AI

Most EU AI Act high-risk content focuses on Art 6(2) and Annex III — the eight sector-specific domains. But Article 6(1) is a separate, parallel pathway for AI embedded in products that already undergo mandatory third-party conformity assessment under EU product safety law. Vehicles are covered by two Annex I entries.

Article 6(1) — HIGH-RISK (two conditions must both be met)

(a) The AI system is intended to be used as a safety component of a motor vehicle — or is itself a product — covered by Union harmonisation legislation listed in Annex I of the EU AI Act; AND

(b) that vehicle is required to undergo a third-party conformity assessment (EU type-approval via a designated technical service) under that Annex I legislation.
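The two-limb test above can be sketched as a tiny decision function. This is an illustration only: the field names and the Annex I entry set are hypothetical shorthand, and real classification requires case-by-case legal analysis.

```python
# Hypothetical sketch of the Article 6(1) two-limb test for automotive AI.
# Field names and the Annex I entry set are illustrative, not an official schema.
from dataclasses import dataclass

ANNEX_I_AUTOMOTIVE = {"Reg 2018/858", "Reg 2019/2144"}  # Annex I §18, §19

@dataclass
class VehicleAISystem:
    is_safety_component: bool       # limb (a): safety component of a vehicle
    harmonisation_legislation: str  # which Annex I act covers the product
    requires_type_approval: bool    # limb (b): third-party EU type-approval

def is_high_risk_art_6_1(system: VehicleAISystem) -> bool:
    """Both limbs of Article 6(1) must be met."""
    covered = system.harmonisation_legislation in ANNEX_I_AUTOMOTIVE
    return (system.is_safety_component
            and covered
            and system.requires_type_approval)

# Example: GSR-mandated AEB perception AI in a type-approved passenger car.
aeb = VehicleAISystem(True, "Reg 2019/2144", True)
print(is_high_risk_art_6_1(aeb))  # → True
```

Note how an infotainment assistant fails limb (a) and drops out of scope even though the vehicle itself is type-approved.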

Relevant Annex I entries for automotive AI

  • §18 — Reg 2018/858: Regulation (EU) 2018/858 on approval and market surveillance of motor vehicles and trailers, and systems, components and separate technical units intended for such vehicles. This is the overarching EU type-approval framework.
  • §19 — Reg 2019/2144 (GSR): Regulation (EU) 2019/2144 laying down type-approval requirements for motor vehicles and their trailers regarding general safety and the protection of vehicle occupants and vulnerable road users. The GSR mandates specific ADAS systems — many of which rely on AI.

In scope — typical automotive AI examples

  • Autonomous Emergency Braking (AEB) — AI vision+radar fusion
  • Lane Keeping Assist (LKA) — camera-based lane detection
  • Intelligent Speed Assistance (ISA) — speed sign recognition AI
  • Driver Drowsiness & Attention Warning (DDAW) — driver monitoring AI
  • Advanced Driver Distraction Warning (ADDW) — interior camera AI
  • Pedestrian/cyclist detection AI in reversing detection systems
  • Sensor fusion AI for parking assistance
  • ADAS perception AI embedded in ECUs classified as separate technical units (STUs)

Outside Art 6(1) scope — may still be AI Act relevant

  • Infotainment AI (navigation, voice assistant) — no safety component function
  • Predictive maintenance AI not in vehicle type-approval scope
  • AI in automotive manufacturing/quality control (factory automation)
  • Fleet management AI operating post-sale with no safety role
  • AI in vehicles that do not require EU type-approval (certain off-road, military, historic vehicles)

Note: AI systems that fall outside Art 6(1) may still be subject to other AI Act obligations — for example, AI used to manage road traffic or critical transport network infrastructure may be high-risk under Annex III §2 (critical infrastructure AI). Always check both pathways.

GSR-mandated ADAS: AI systems required by Regulation 2019/2144

Regulation (EU) 2019/2144 (the GSR) requires progressive introduction of safety technologies across new vehicle categories. Most of the mandated systems depend on AI perception, prediction, or decision-making algorithms. The table below lists the GSR systems and their typical AI dependency.

GSR System | Typical AI Component | Art 6(1) Trigger?
Autonomous Emergency Braking (AEB) | Vision/radar fusion + collision prediction model | Yes — if type-approval required
Lane Keeping Assist (LKA) | Camera-based lane marking detection AI | Yes — if type-approval required
Intelligent Speed Assistance (ISA) | Road sign recognition AI + map data fusion | Yes — if type-approval required
Driver Drowsiness & Attention Warning (DDAW) | Steering pattern or eye-tracking AI | Yes — if type-approval required
Advanced Driver Distraction Warning (ADDW) | Interior camera AI monitoring driver head/gaze | Yes — if type-approval required
Reversing Detection | Object/pedestrian detection AI in rear camera system | Yes — if AI-based safety component
Event Data Recorder (EDR) | May use AI for trigger detection; data logging | Only if AI-based trigger

The GSR introduced mandatory ADAS requirements in phases: Phase 1 applied from July 2022 for new type-approvals, and Phase 2 extends the requirements further from 2024 onwards. Vehicles placed on the EU market must meet these requirements, and from 2 August 2027 their AI-based safety components must additionally comply with the AI Act.

Conformity assessment: EU type-approval procedure applies

For automotive AI systems, the AI Act does not prescribe a separate AI-specific conformity procedure. Instead, the EU type-approval framework under Regulation 2018/858 — which already requires third-party assessment by a designated technical service — is the relevant conformity route. The AI Act Chapter III Section 2 requirements (Arts 9–15) apply to the AI system and must be addressed within that process.

How it differs from Section A (medical devices, machinery)

  • Article 43(3) applies only to Annex I Section A products. It explicitly routes AI Act compliance through the sectoral notified body only for Section A legislation (MDR, IVDR, Machinery Directive, etc.).
  • Automotive and aviation fall under Annex I Section B — a distinct category of Union harmonisation legislation not covered by Art 43(3). No equivalent explicit routing provision exists for Section B in the AI Act text.
  • In practice, the vehicle type-approval technical service (TÜV, DEKRA, UTAC, VCA, etc.) will assess AI Act compliance as part of the broader type-approval procedure, since compliance with all applicable Union law is a condition of type-approval under Regulation 2018/858.
  • The full set of Chapter III Section 2 obligations applies to the AI system: risk management (Art 9), data governance (Art 10), technical documentation (Art 11), logging (Art 12), transparency (Art 13), human oversight (Art 14), accuracy and robustness (Art 15).

Practical steps for automotive AI providers

  1. Establish a risk management system (Art 9) for the AI system specific to its automotive safety function.
  2. Document data governance practices for training, validation and test datasets (Art 10), including bias testing relevant to road user demographics.
  3. Prepare technical documentation per Annex IV (Art 11) — this can be integrated into the existing type-approval technical file under Regulation 2018/858.
  4. Ensure automatic logging capabilities (Art 12) are built in — important for incident investigation and post-market monitoring.
  5. Provide deployer-facing instructions for use including expected accuracy metrics and operational constraints (Art 13).
  6. Implement human oversight mechanisms (Art 14) — e.g., override capability, driver take-over warnings.
  7. Register the AI system in the EU database (Art 49) before placing it on the market.
  8. Affix CE marking and draw up EU declaration of conformity (Art 47, Art 48).

Who is the “provider” in the automotive supply chain?

The automotive AI value chain involves multiple tiers. Determining who bears AI Act provider obligations (Art 16) requires careful analysis of who places the AI system on the market and under whose name.

1. Vehicle OEM (integrator + brand owner)

Where the OEM integrates an ADAS AI system and markets it as part of its vehicle under its own brand, the OEM is the AI Act provider. It also holds the type-approval, creating a natural alignment between type-approval holder and AI Act provider obligations.

2. Tier 1 supplier (AI system as separate technical unit)

A Tier 1 supplier (e.g., Bosch, Continental, Mobileye, ZF) that develops an ADAS AI system marketed under its own brand as a separate technical unit (STU) under Regulation 2018/858 is typically the AI Act provider. The Tier 1 must ensure the AI system meets Arts 9–15 and has undergone conformity assessment before placing the STU on the market.

3. OTA update provider (post-market modification)

Any entity (OEM, software supplier, or third party) that issues an over-the-air (OTA) software update that constitutes a substantial modification of a deployed AI system becomes a new provider under Art 25(1)(b) and must trigger a new conformity assessment procedure (Art 43(4)).

Note: Art 25(3) — which automatically deems the product manufacturer to be the AI system provider — applies only to AI systems that are safety components of products covered by Annex I Section A. Motor vehicles fall under Section B, so Art 25(3) does not automatically apply. Provider identity must instead be determined by the standard Art 25(1) rules.

OTA software updates and substantial modification

Over-the-air (OTA) updates are increasingly common in modern vehicles. Under Art 43(4), a high-risk AI system that has already passed conformity assessment must undergo a new conformity assessment in the event of a substantial modification.

OTA updates that trigger new conformity assessment

  • Changes to the AI model’s intended purpose
  • Retraining the AI model on new or expanded datasets
  • Changes that materially alter accuracy, robustness, or safety performance
  • Adding new safety-relevant AI capabilities not declared at initial assessment
  • Modifications that go beyond what was pre-defined in the Annex IV technical documentation

OTA updates that do NOT require new conformity

  • Bug fixes and security patches
  • Changes pre-determined by the provider at initial assessment and documented in the technical file (Annex IV, point 2(f))
  • Incremental learning updates within pre-defined parameters documented at time of assessment
  • Map data updates for ISA (not AI model changes)
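The triage implied by the two lists above might be sketched like this (the flag names are hypothetical; whether a given update is a substantial modification under Art 43(4) is ultimately a case-by-case legal question):

```python
# Hypothetical triage of whether an OTA update is a 'substantial modification'
# under Art 43(4). Flag names are illustrative, not from the Regulation.
def requires_new_conformity_assessment(update: dict) -> bool:
    # Changes pre-determined at initial assessment and documented in the
    # Annex IV technical file (point 2(f)) are exempt even if they would
    # otherwise look substantial.
    if update.get("predetermined_in_technical_file", False):
        return False
    triggers = (
        update.get("changes_intended_purpose", False),
        update.get("retrains_on_new_data", False),
        update.get("materially_alters_performance", False),
        update.get("adds_safety_capability", False),
    )
    return any(triggers)

security_patch = {"bug_fix": True}
model_retrain = {"retrains_on_new_data": True}
print(requires_new_conformity_assessment(security_patch))  # → False
print(requires_new_conformity_assessment(model_retrain))   # → True
```

The exemption check comes first by design: pre-declared change envelopes are the only route by which model updates avoid reassessment.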

The EU vehicle type-approval framework under Regulation 2018/858 already includes processes for managing software updates and cybersecurity (UN Regulation No 156 on Software Update Management Systems). AI Act and vehicle cybersecurity obligations should be managed together in an integrated software update governance process.

Full Chapter III obligations for automotive AI providers

Once an AI system is classified as high-risk under Article 6(1), the provider must comply with the full set of obligations under Chapter III, Section 2 (technical requirements) and Section 3 (provider obligations), both from 2 August 2027.

Art 9 — Risk Management System
Continuous risk management process covering the full AI lifecycle — including foreseeable misuse scenarios such as driver over-reliance on ADAS.

Art 10 — Data & Data Governance
Training, validation and test datasets must be sufficiently representative — including diverse road conditions, lighting, weather, and vulnerable road user demographics.

Art 11 — Technical Documentation
Pre-market documentation per Annex IV. For type-approved vehicles, this can be integrated into the existing type-approval technical file (Art 11(2) enables this for Section A products; best practice for Section B).

Art 12 — Record-keeping (Logging)
Automatic logging of system events — particularly triggering events for AEB, LKA, ISA — stored for post-incident investigation and in-service monitoring.

Art 13 — Transparency
Instructions for use for deployers covering AI capabilities, accuracy metrics, known limitations (e.g., AEB performance in adverse weather), and human oversight requirements.

Art 14 — Human Oversight
Technical measures enabling drivers to override or disable ADAS functions where permitted, and to understand when AI is controlling vehicle functions.

Art 15 — Accuracy, Robustness & Cybersecurity
Consistent performance across diverse operating conditions. Resilience against adversarial inputs (data poisoning, model evasion) — particularly relevant for camera-based perception systems.

Art 16 — Provider Obligations
Quality management system (Art 17), documentation keeping for 10 years (Art 18), post-market monitoring, incident reporting, CE marking (Art 48) and EU database registration (Art 49).

Map your automotive AI against all seven Chapter III Section 2 requirements (Arts 9–15)

Regumatrix generates a full cited 8-section compliance report — risk classification, applicable obligations, conformity route, fine exposure, and a prioritised action list — in about 30 seconds.

Generate your compliance report — 3 free analyses
PROPOSAL — not yet enacted law

COM(2025) 836: what changes for automotive AI

The Digital Omnibus Simplification Regulation COM(2025) 836 proposes significant changes to the AI Act timetable and adds coordination provisions for Annex I Section B products such as motor vehicles.

Amended Art 113(d) — Deadline moved to 2 August 2028

COM(2025) 836 Art 1(31) adds point (d) to Article 113: Chapter III Sections 1, 2, and 3 would apply 12 months after a Commission decision confirming that adequate compliance support measures (harmonised standards, common specifications, guidelines) are available for Annex I Article 6(1) systems. If no such decision is adopted in time, Chapter III obligations apply on the fallback date of 2 August 2028 — 28 months away.

New Art 60a — Real-world testing for Section B products

COM(2025) 836 proposes a new Article 60a creating a dedicated real-world testing route for Annex I Section B products — including automotive type-approval systems — via voluntary written agreements between interested Member States and the Commission. This provides a sandbox alternative to the existing Art 57–63 framework for Section B automotive AI development and testing.

Annex VIII Section B deleted — Simplified documentation

COM(2025) 836 Art 1(32) deletes Section B of Annex VIII (the mandatory post-market monitoring documentation template for AI systems covered by Annex I Section B). This reduces documentation burden for automotive AI providers by removing the templated compliance log format obligation specific to Section B systems.

COM(2025) 836 is a Commission proposal pending European Parliament and Council agreement. It is not yet in force. Plan for 2 August 2027 as the working compliance deadline and treat 2028 as a potential extension contingent on legislative progress.

Frequently asked questions

Which automotive AI systems are high-risk under Article 6(1) of the EU AI Act?

An AI system is high-risk under Article 6(1) where two conditions are both met: (a) the AI is a safety component of a motor vehicle — or is itself a product — covered by Annex I §18 (Regulation (EU) 2018/858 on type-approval of motor vehicles) or §19 (Regulation (EU) 2019/2144, the General Safety Regulation); and (b) that vehicle is required to undergo third-party conformity assessment (EU type-approval via a designated technical service). Systems mandated by the GSR — AEB, lane-keep assist, ISA, driver drowsiness warning, event data recorder — are often AI-driven and will typically satisfy both conditions.

What AI systems does the EU General Safety Regulation (2019/2144) mandate?

Regulation (EU) 2019/2144 (GSR) mandates progressive introduction of several ADAS technologies: Intelligent Speed Assistance (ISA), Lane Keeping Assist (LKA), Autonomous Emergency Braking (AEB), Driver Drowsiness and Attention Warning (DDAW), Advanced Driver Distraction Warning (ADDW), Event Data Recorder (EDR), Reversing Detection (rear-view camera or sensors), and Emergency Stop Signal. Many of these systems rely on AI models — computer vision, sensor fusion, predictive algorithms — making them prospective safety components under Annex I §§18–19.

Which conformity assessment route applies to automotive AI systems?

Automotive AI systems that are high-risk under Article 6(1) must undergo conformity assessment as required under the applicable Annex I legislation — the EU type-approval procedure under Regulation 2018/858. A designated technical service (TÜV, DEKRA, UTAC, VCA, etc.) acts as the third-party assessor. Unlike Section A products (medical devices, machinery) where Article 43(3) explicitly routes the AI Act assessment through the sectoral notified body, Section B products (including automotive) are assessed through the underlying type-approval process, which must incorporate the AI Act Chapter III Section 2 requirements (Arts 9–15). Note: Article 43(3) by its terms applies only to Section A of Annex I.

Who is the 'provider' of automotive AI under the EU AI Act?

In the automotive supply chain: (1) If a vehicle OEM integrates and markets the AI system under its own brand (e.g., as part of its ADAS suite), the OEM is the provider; (2) If a Tier 1 supplier (e.g., Bosch, Continental, Mobileye) places the AI system on the market independently, the Tier 1 is the provider; (3) Any party that substantially modifies an existing AI system and places it back on the market becomes the new provider under Article 25(1)(b). Note: Article 25(3) — which deems the product manufacturer to be the AI system provider — applies specifically to Section A of Annex I, not Section B automotive systems.

Do OTA software updates trigger new conformity assessment for automotive AI?

They can. Under Article 43(4), a high-risk AI system that has already undergone conformity assessment must undergo a new conformity assessment in the event of a 'substantial modification'. For automotive AI, an OTA update that alters the AI model's intended purpose, changes its capability in a material way, or introduces new training datasets beyond what was pre-declared in the technical documentation will constitute a substantial modification requiring reassessment. Only changes pre-determined at the time of initial assessment and documented in Annex IV technical documentation are exempt.

When does the compliance deadline apply, and what does COM(2025) 836 propose for automotive AI?

Article 113(c) of the current EU AI Act sets the compliance deadline for Article 6(1) systems — including automotive AI — at 2 August 2027. COM(2025) 836 (the Omnibus proposal) proposes adding point (d) to Article 113: Chapter III obligations for Annex I Article 6(1) systems would apply 12 months after a Commission decision confirming adequate compliance support (harmonised standards, guidelines) is available. The hard fallback date, applicable if no such Commission decision is adopted in time, is 2 August 2028. Additionally, COM(2025) 836 proposes a new Article 60a creating a dedicated real-world testing route for Annex I Section B products (including automotive type-approval systems) via voluntary written agreements between Member States and the Commission. COM(2025) 836 is a proposal pending Council and Parliament agreement and is not yet law.

Related compliance guides

Medical Device AI (SaMD)

Art 6(1) Annex I §§11–12 — MDR/IVDR dual compliance pathway.

Aviation AI (EASA)

Art 6(1) Annex I §20 — unmanned aircraft and EASA certification.

Conformity Assessment Guide

Self-assessment vs notified body — full breakdown of Art 43.

Is My AI High-Risk?

All Art 6(1) + Annex III pathways — one-page decision guide.

Technical Documentation (Annex IV)

What must go in your AI Act technical file.

COM(2025) 836 — Full Overview

All 33 AI Act amendments and 7 EASA amendments in the Omnibus.