Regumatrix — AI compliance powered by Regulation (EU) 2024/1689

This tool is informational only and does not constitute legal advice.

Grounded in Regulation (EU) 2024/1689 · verified 4 Apr 2026

EU AI Act vs NIS2: Cybersecurity, Critical Infrastructure AI & Incident Reporting

For operators of critical infrastructure — energy grids, water systems, transport networks, digital infrastructure — AI systems used in safety roles are simultaneously subject to both the EU AI Act and the NIS2 Directive. AI Act Article 15 cybersecurity requirements and NIS2 Art 21 security measures pursue the same systems from different angles. COM(2025) 837 provides partial relief: a single-entry ENISA notification platform for NIS2, GDPR, DORA, eIDAS and CER incidents — though AI Act Article 73 serious incident reporting remains a separate obligation.

NIS2 Directive

Directive (EU) 2022/2555 — transposition deadline: 17 October 2024

  • Security of network and information systems at scale
  • Two tiers: essential entities and important entities
  • ICT risk management framework (Art 21)
  • Mandatory incident reporting — 24h/72h/30d (Art 23)
  • Supply chain and ICT procurement security
  • Board-level accountability for cybersecurity
  • ENISA coordination and the cross-border NIS Cooperation Group
  • Sector-specific national competent authorities

EU AI Act

Regulation (EU) 2024/1689 — most high-risk obligations from August 2026

  • Risk tiers for individual AI systems
  • Annex III §2: critical infrastructure AI = high-risk
  • Conformity assessment + CE marking
  • Risk management and data governance
  • Cybersecurity, accuracy, robustness (Art 15)
  • Human oversight (Art 14)
  • Serious incident reporting to the MSA (Art 73)
  • Supply chain documentation (Arts 24–26)

Side-by-side comparison

What it regulates
  • NIS2: security of network and information systems, for entities in sectors critical to society and the economy
  • EU AI Act: AI systems, based on the risk they pose to the health, safety, and fundamental rights of persons

Who it applies to
  • NIS2: essential entities (energy, transport, banking, financial market infrastructure, health, digital infrastructure, public administration, space) and important entities (digital providers, manufacturing, postal, food)
  • EU AI Act: providers (develop or place AI on the market), deployers (use AI in a professional context), importers, and distributors, regardless of sector

Cybersecurity obligation
  • NIS2 Art 21: proportionate technical and organisational measures covering risk analysis, incident handling, supply chain security, encryption, access control, and security testing
  • EU AI Act Art 15: high-risk AI must achieve accuracy, robustness, and cybersecurity, and be resilient against data poisoning, model poisoning, adversarial examples, and confidentiality attacks throughout its lifecycle

Incident reporting
  • NIS2 Art 23: significant incidents: 24h early warning, 72h assessment, 30-day final report to the national NIS2 competent authority
  • EU AI Act Art 73: serious incidents (death, serious harm, significant property/rights breach): the provider notifies the national market surveillance authority immediately, and in any event within 15 days of becoming aware (shorter deadlines apply for widespread incidents or death)

Supply chain / third parties
  • NIS2 Art 21(2)(d): essential entities must address security in the supply chain, including ICT service providers and product manufacturers
  • EU AI Act Arts 24–26: clear allocation of obligations between providers and downstream parties; deployers are entitled to technical documentation, instructions for use, and conformity declarations

Enforcement authority
  • NIS2: national competent authorities (sector-specific); ENISA for coordination; cross-border cooperation under the NIS Cooperation Group
  • EU AI Act: national market surveillance authorities; the EU AI Office for GPAI models and (under COM(2025) 836) VLOP-integrated AI

Penalty maximum
  • NIS2: essential entities: €10M or 2% of global annual turnover (whichever is higher); important entities: €7M or 1.4%
  • EU AI Act: €35M/7% for prohibited AI; €15M/3% for high-risk obligation violations; €7.5M/1% for supplying incorrect information

Applicability date
  • NIS2: national transposition deadline 17 October 2024 (a Directive: Member States implement)
  • EU AI Act: February 2025 (prohibitions); August 2026 (most high-risk obligations); August 2027 (remaining obligations, including high-risk AI embedded in Annex I regulated products)
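Both penalty regimes share one mechanic: the applicable ceiling is whichever is higher of the fixed amount and the turnover percentage. A minimal sketch, using the maxima quoted above; the €2bn turnover figure is a made-up example:

```python
# Penalty ceiling: the higher of a fixed amount and a share of global
# annual turnover. Percentages are expressed in basis points so the
# arithmetic stays exact in integers (200 bp = 2%).
def penalty_ceiling(fixed_eur: int, pct_basis_points: int,
                    turnover_eur: int) -> int:
    """Applicable maximum fine: the higher of the two limbs."""
    return max(fixed_eur, turnover_eur * pct_basis_points // 10_000)

turnover = 2_000_000_000  # illustrative €2bn global annual turnover
nis2_essential = penalty_ceiling(10_000_000, 200, turnover)    # → 40_000_000
ai_act_prohibited = penalty_ceiling(35_000_000, 700, turnover)  # → 140_000_000
```

For a smaller entity the fixed amount governs instead: 1.4% of €100M turnover is €1.4M, so the €7M important-entity floor applies.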

Key overlap zones

Critical infrastructure AI — Annex III §2 ↔ NIS2 essential entities

AI Act Annex III §2 designates as high-risk any AI system used as a safety component in the management and operation of critical digital infrastructure, road traffic management, water supply, gas, heating, or electricity. The organisations operating these systems — water utilities, energy providers, transport operators, internet exchange points — are exactly the NIS2 essential entities required to meet NIS2 Art 21 cybersecurity measures and Art 23 incident reporting. The same AI system deployments trigger high-risk AI Act obligations AND must be covered by the entity's NIS2 ICT risk management framework. Both apply independently.
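The dual trigger described above can be sketched as a simple check. The sector labels here are illustrative shorthand, not the exact wording of Annex III §2 or NIS2 Annex I:

```python
# Hypothetical dual-obligation check: both regimes apply when the
# operator is a NIS2 essential entity AND the AI system is a safety
# component in an Annex III §2 use. Labels are illustrative only.
ANNEX_III_2_USES = {"critical digital infrastructure", "road traffic",
                    "water supply", "gas", "heating", "electricity"}
NIS2_ESSENTIAL_SECTORS = {"energy", "transport", "water",
                          "digital infrastructure", "health", "banking"}

def dual_obligations(operator_sector: str, ai_use: str,
                     is_safety_component: bool) -> dict[str, bool]:
    return {
        "nis2_entity_obligations": operator_sector in NIS2_ESSENTIAL_SECTORS,
        "ai_act_high_risk": is_safety_component and ai_use in ANNEX_III_2_USES,
    }

# A water utility using AI to control pumping stations:
print(dual_obligations("water", "water supply", True))
# → {'nis2_entity_obligations': True, 'ai_act_high_risk': True}
```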

Cybersecurity requirements — AI Act Art 15 ↔ NIS2 Art 21

AI Act Art 15 is an AI system-level cybersecurity requirement applied to providers of high-risk AI: accuracy, robustness, and resilience against adversarial attacks (data poisoning, model poisoning, adversarial examples). NIS2 Art 21 is an entity-level ICT risk management obligation applied to essential and important entities: supply chain security, ICT procurement security, encryption. These are complementary, not duplicative. The AI Act's Art 15 obligations ensure the AI component is secure; NIS2 Art 21 ensures the wider network and information environment in which the AI operates is secure. An AI system with EU Cybersecurity Act certification is presumed to comply with AI Act Art 15 to the extent covered — which can help NIS2 entities demonstrate their ICT procurement applied appropriate security standards.

Incident reporting — Art 73 ↔ NIS2 Art 23

These operate in parallel, applying to different parties for the same incident. AI Act Art 73: the AI system provider must report a serious incident to the national market surveillance authority. NIS2 Art 23: the essential/important entity (often the deployer) must report a significant incident to the NIS2 competent authority within 24h, 72h, and 30 days. If a critical infrastructure operator's AI system malfunctions and causes harm, the AI provider reports under Art 73 and the infrastructure operator reports under NIS2 Art 23. These are not the same report — they go to different authorities and describe different aspects of the same event.
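Because the reporting duties attach to roles rather than to the incident itself, an organisation holding both roles owes both reports. A hedged sketch of that fan-out:

```python
# Sketch of the parallel duties described above: which report(s) an
# organisation owes depends on its role(s) in the same incident.
def reports_due(is_provider: bool, is_nis2_entity: bool) -> list[str]:
    due = []
    if is_provider:
        due.append("AI Act Art 73 -> national market surveillance authority")
    if is_nis2_entity:
        due.append("NIS2 Art 23 -> NIS2 competent authority (24h/72h/30d)")
    return due

# An infrastructure operator that built its own AI system in-house
# holds both roles, so it files both reports:
for report in reports_due(is_provider=True, is_nis2_entity=True):
    print(report)
```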

Supply chain security — AI Act Arts 24–26 ↔ NIS2 Art 21(2)(d)

NIS2 essential entities must assess the security practices of their ICT suppliers — including AI system providers. AI Act provider/deployer obligations create a documentation chain: providers must supply technical documentation, conformity declarations, and instructions for use (Art 26(6) deployer right). NIS2 entities can leverage this documentation when conducting ICT supply chain due diligence. In practice, an AI system provider to a NIS2 essential entity is simultaneously an AI Act provider (obligated to provide documentation under Art 11 / Art 26) and an ICT third-party service provider (subject to NIS2 supply chain risk assessment by the entity).

Incident reporting: who reports what, to whom

  • NIS2 Art 23: reported by the essential/important entity (typically the deployer) to the national NIS2 competent authority; timeline 24h / 72h / 30 days
  • AI Act Art 73: reported by the AI system provider to the national market surveillance authority (MSA); immediately, and no later than 15 days after becoming aware
  • COM(2025) 837 single-entry (when operational): the essential/important entity submits via the ENISA platform, which distributes to the NIS2, GDPR, DORA, eIDAS, and CER authorities; the underlying timelines are unchanged
  • AI Act Art 73 after COM(2025) 837: still the AI system provider, still to the national MSA; not covered by the ENISA platform, it remains a separate obligation
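The NIS2 Art 23 cascade is three deadlines all measured from the same moment of awareness. A small sketch using Python's standard datetime:

```python
from datetime import datetime, timedelta

# Sketch of the NIS2 Art 23 cascade: early warning, assessment, and
# final report all run from when the entity becomes aware of the
# significant incident.
def nis2_deadlines(aware_at: datetime) -> dict[str, datetime]:
    return {
        "early_warning":       aware_at + timedelta(hours=24),
        "incident_assessment": aware_at + timedelta(hours=72),
        "final_report":        aware_at + timedelta(days=30),
    }

d = nis2_deadlines(datetime(2026, 8, 3, 9, 0))
print(d["early_warning"])  # → 2026-08-04 09:00:00
print(d["final_report"])   # → 2026-09-02 09:00:00
```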

Are you a NIS2 essential entity deploying AI in critical infrastructure?

Regumatrix maps your AI systems against Annex III §2, identifies dual obligations under AI Act and NIS2, and generates a combined compliance checklist cross-referencing Art 15 cybersecurity and NIS2 Art 21 measures.

Check my AI Act + NIS2 obligations — 3 free analyses

COM(2025) 837: NIS2 single-entry point

PROPOSAL — not yet enacted law

COM(2025) 837 — NIS2 Art 23a: ENISA single-entry notification point

Today, an entity subject to multiple notification obligations must file separate reports with separate authorities for the same incident. COM(2025) 837 proposes to end this duplication by creating one platform covering all five acts.

New NIS2 Art 23a — ENISA single-entry point

  • ENISA develops and maintains a single-entry notification platform covering NIS2, GDPR, DORA, eIDAS, and CER
  • Entities subject to more than one of these acts submit one notification for a single incident; ENISA distributes it to each relevant competent authority
  • ENISA must have the platform operational and piloted within 18 months of COM(2025) 837 entering into force
  • The Commission publishes a notice in the Official Journal when the platform becomes operational
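The proposed routing can be sketched as a split between acts the platform covers and acts that must still be filed directly. This is a hypothetical illustration of the flow, not the proposal's mechanics:

```python
# Sketch of the proposed Art 23a routing. AI Act Art 73 is deliberately
# absent from the covered set — it stays a direct filing.
COVERED_ACTS = {"NIS2", "GDPR", "DORA", "eIDAS", "CER"}

def route_notification(applicable_acts: set[str]) -> tuple[list[str], list[str]]:
    """Split applicable acts into (via ENISA platform, filed directly)."""
    via_platform = sorted(a for a in applicable_acts if a in COVERED_ACTS)
    filed_separately = sorted(a for a in applicable_acts if a not in COVERED_ACTS)
    return via_platform, filed_separately

# A NIS2 entity whose AI incident also breached personal data:
via, separate = route_notification({"NIS2", "GDPR", "AI Act Art 73"})
print(via)       # → ['GDPR', 'NIS2']
print(separate)  # → ['AI Act Art 73']
```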

NIS2 Art 23(1) and 30(1) amendments

  • Art 23(1) amended: mandatory significant incident notifications must be submitted via the Art 23a single-entry platform once it is operational
  • Art 30(1) amended: voluntary notifications (cyber threats, near misses, incidents below the significant threshold) are also submitted via Art 23a
  • NIS2 essential entities that are also GDPR data controllers can submit one notification for a personal data breach incident via the single-entry point, satisfying both NIS2 Art 23 and GDPR Art 33 simultaneously

What the single-entry point does NOT cover

The single-entry point covers NIS2, GDPR, DORA, eIDAS, and CER. AI Act Art 73 serious incident reporting is not included; it remains a separate obligation to the national market surveillance authority. For a NIS2 essential entity that is also the AI system provider, an AI malfunction may therefore require: (1) a NIS2 Art 23 report via the ENISA single-entry point, and (2) a separate AI Act Art 73 report to the national MSA. Operators should monitor whether a future amendment extends the single-entry point to AI Act Art 73.

Practical compliance map for critical infrastructure operators

1

Confirm NIS2 essential / important entity status

Verify your sector maps to NIS2 Annex I (essential) or Annex II (important). Energy, water, transport, and digital infrastructure operators are essential entities. This triggers enhanced NIS2 obligations including Art 21 security measures, Art 23 incident reporting, and board accountability.
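As a rough illustration of the tier lookup in this step; the real test is NIS2 Annex I/II plus size thresholds, which this sketch omits, and the sector labels are simplified shorthand:

```python
# Illustrative NIS2 tier lookup for step 1. Sector labels are shorthand;
# the Directive's full Annex I/II entries and size-cap rules are omitted.
ANNEX_I_ESSENTIAL = {"energy", "transport", "banking", "health",
                     "drinking water", "digital infrastructure",
                     "public administration", "space"}
ANNEX_II_IMPORTANT = {"postal", "waste management", "manufacturing",
                      "food", "digital providers"}

def nis2_tier(sector: str) -> str:
    """Map a (simplified) sector label to a NIS2 entity tier."""
    if sector in ANNEX_I_ESSENTIAL:
        return "essential entity"
    if sector in ANNEX_II_IMPORTANT:
        return "important entity"
    return "outside NIS2 scope (check national transposition)"

print(nis2_tier("energy"))  # → essential entity
```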

2

Identify Annex III §2 AI systems

Audit AI deployments used as safety components in management and operation of critical digital infrastructure, road traffic, water, gas, heating, or electricity networks. These are automatically high-risk under the AI Act and require conformity assessment, CE marking, technical documentation, and human oversight.

3

Map Art 15 against Art 21 cybersecurity measures

Compare AI Act Art 15 requirements (data poisoning resilience, model robustness, adversarial attack resistance) against your NIS2 Art 21 ICT risk management framework. Identify where a single security measure addresses both. Consider EU Cybersecurity Act certification — certified AI systems are presumed compliant with Art 15 to the extent covered.
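One way to run this mapping exercise is a simple control-by-control matrix. The pairings below are illustrative assumptions for the sake of example, not an official cross-reference:

```python
# Illustrative Art 15 <-> Art 21 control matrix. Which control counts
# toward which obligation is assumed here for demonstration only.
CONTROL_MAP = [
    # (control, counts toward AI Act Art 15?, counts toward NIS2 Art 21?)
    ("adversarial robustness testing",                  True,  False),
    ("training-data integrity checks (anti-poisoning)", True,  True),
    ("encryption of data in transit and at rest",       False, True),
    ("supplier security assessment",                    False, True),
]

# Controls that discharge both obligations at once:
dual_purpose = [name for name, art15, art21 in CONTROL_MAP if art15 and art21]
print(dual_purpose)
# → ['training-data integrity checks (anti-poisoning)']
```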

4

Establish dual incident reporting procedures

Prepare separate incident reporting procedures: NIS2 Art 23 (entity → NIS2 authority: 24h/72h/30d) and AI Act Art 73 (provider → national MSA). If you are both the deployer and the provider, both obligations apply. Once the COM(2025) 837 single-entry point becomes operational, consolidate NIS2/GDPR/DORA notifications there; AI Act Art 73 stays separate.

5

Apply NIS2 supply chain due diligence to AI providers

For each AI system provider whose product is used in NIS2-critical operations, conduct NIS2 Art 21(2)(d) supply chain risk assessment. Request AI Act technical documentation (Art 11), conformity declaration (Art 47), and instructions for use (Art 26(6)) as part of procurement due diligence.
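The document request in this step can be tracked as a small checklist keyed to the articles named above; the helper function is hypothetical:

```python
# Hypothetical checklist for step 5: evidence to request from each AI
# provider during NIS2 Art 21(2)(d) procurement due diligence.
REQUIRED_DOCS = {
    "technical documentation": "AI Act Art 11",
    "EU declaration of conformity": "AI Act Art 47",
    "instructions for use": "AI Act Art 26(6)",
}

def missing_docs(received: set[str]) -> list[str]:
    """Documents still outstanding for the due-diligence file."""
    return sorted(d for d in REQUIRED_DOCS if d not in received)

print(missing_docs({"technical documentation"}))
# → ['EU declaration of conformity', 'instructions for use']
```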

Frequently Asked Questions

Can an organisation be subject to both the EU AI Act and NIS2?

Yes, and this is common. NIS2 applies to essential entities (energy, transport, banking, health, digital infrastructure) and important entities (digital providers, manufacturing, food, postal). If any of those organisations use AI systems that qualify as high-risk under Annex III of the AI Act — for example, AI systems used to manage critical digital infrastructure or road traffic (Annex III §2) — they face dual obligations: AI Act Chapter III requirements (risk management, conformity assessment, technical documentation, human oversight) and NIS2 requirements (ICT risk management framework, incident reporting, supply chain security). Both apply independently and simultaneously.

Does AI Act Article 15 on cybersecurity overlap with NIS2 Article 21?

Yes, significantly. AI Act Article 15 requires that high-risk AI systems achieve appropriate levels of accuracy, robustness, and cybersecurity — including resilience against data poisoning, model poisoning, adversarial examples, and confidentiality attacks. NIS2 Article 21 requires essential and important entities to take risk-proportionate technical and organisational measures addressing supply chain security, ICT security in procurement, and security of network and information systems. For a NIS2 essential entity that deploys high-risk AI, both sets of cybersecurity obligations apply to the same system. The EU Cybersecurity Act provides a helpful bridge: AI systems certified under a Cybersecurity Act scheme are presumed compliant with AI Act Art 15 to the extent the certification covers them.

If an AI system causes an incident, which regulation's reporting obligation applies?

Both can apply, but to different parties. AI Act Art 73 requires the AI system provider to report serious incidents (causing or likely to cause death, serious harm, significant property damage, or fundamental rights breach) to the national market surveillance authority. NIS2 Art 23 requires the essential or important entity deploying the system to report significant incidents to the NIS2 competent authority: 24-hour early warning, 72-hour assessment, and 30-day final report. If the deploying entity is also the AI system provider, it must file both types of reports to two different authorities.

What does COM(2025) 837 change about NIS2 incident reporting?

COM(2025) 837 proposes to insert Article 23a into NIS2, establishing an ENISA-maintained single-entry point for incident notifications. When this platform is operational, entities subject to multiple reporting obligations — NIS2, GDPR, DORA, eIDAS, CER — can submit a single notification covering all applicable obligations for the same incident. ENISA pilots the system within 18 months of the 837 Regulation entering into force, with a Commission OJ notice when it becomes operational. Notably, AI Act Art 73 incident reporting is not included in the single-entry point — AI Act serious incident notifications remain a separate obligation to the national market surveillance authority.

Are AI providers considered ICT third-party service providers under NIS2?

In practice, yes. NIS2 Article 21(2)(d) requires essential and important entities to address supply chain security — including the security practices of their ICT service and product suppliers. An AI system provider whose product controls industrial processes, manages digital infrastructure, or supports healthcare services for a NIS2-essential entity qualifies as an ICT service provider whose practices fall within the NIS2 supply chain security requirement. AI Act Arts 24–26 (provider/deployer obligations) create a parallel information-sharing chain. Deployers can leverage AI Act technical documentation and instructions for use when discharging NIS2 supply chain due diligence on AI system providers.

Related Compliance Guides

High-Risk AI Systems — Chapter III

Annex III §2 — critical infrastructure AI triggers full high-risk obligations.

EU AI Act vs GDPR

Critical infrastructure AI may process personal data — GDPR and AI Act overlap.

EU AI Act vs DORA

Banks and financial infrastructure face dual DORA + AI Act obligations — also covered by 837 single-entry point.

AI Act Market Surveillance & Enforcement

AI Act Art 73 serious incident reporting to national MSA — not covered by ENISA platform.

Digital Omnibus 837 — Full Overview

Complete overview of COM(2025) 837 including the ENISA single-entry notification platform.

Data Breach Notification Changes (837)

NIS2 + GDPR notifications via a single platform — benefits for critical infrastructure operators.

Map your AI Act and NIS2 obligations in one workflow

Regumatrix identifies every AI system in your critical infrastructure operations, maps them against Annex III §2, and generates a combined AI Act + NIS2 compliance checklist covering cybersecurity, incident reporting, and supply chain security.

Start free analysis