EU AI Act Standard Delay Pushed to 2026

gr.euronews.com

Development of the EU AI Act's technical standards, initially planned for August 2025, has been delayed until 2026 because of the complexity of the process and the need for broad stakeholder consensus. The delay affects companies' ability to prove that their AI products comply before the Act's full enforcement in 2027.

Greek
United States
Technology, European Union, Artificial Intelligence, EU Regulation, AI Act, Compliance, Standardization
CEN-CENELEC, European Commission, AI Office of the Commission, Dutch Data Protection Authority
Sven Stevens
What is the impact of the delay in developing technical standards for the EU AI Act on businesses?
The development of technical standards for EU AI Act compliance has been delayed until 2026, limiting companies' ability to demonstrate conformity. The standards, initially slated for August 2025, are crucial for proving product safety and compliance with EU regulations, so the delay pushes back the timeline for businesses to adapt to the new Act.
What factors contributed to the delay in the development of these crucial AI Act compliance standards?
CEN and CENELEC, the main European standardization bodies, cited the complexity of aligning technological advancements with regulatory needs as a reason for the delay. The process involves multiple rounds of revisions, Commission assessments, and consultations before finalization, extending the timeline beyond the initial 2025 target and affecting businesses aiming for timely compliance.
What are the potential implications of this delay in standardization for the enforcement of the AI Act and its effectiveness?
The postponed standardization process may create uncertainty and leave companies non-compliant by the 2027 deadline for full enforcement of the AI Act. The delay highlights how difficult it is to create robust, future-proof standards for rapidly evolving technologies, and it puts added pressure on businesses and regulators alike to ensure timely compliance.

Cognitive Concepts

Framing Bias: 2/5

The article frames the delay as a significant obstacle, highlighting the concerns of the standardization bodies and of an official at the Dutch Data Protection Authority. This emphasis on challenges may inadvertently downplay the progress being made and possible alternative solutions.

Language Bias: 1/5

The language used is relatively neutral and factual. However, phrases like 'time is running out' could be considered slightly loaded, creating a sense of urgency and potential concern. More neutral phrasing could be 'the timeline is constrained'.

Bias by Omission: 3/5

The article focuses on the delay of AI standards development and the challenges faced by standardization bodies, but omits discussion of potential alternative approaches to demonstrating AI compliance or the views of companies that might be affected by the delay. It also does not delve into the specifics of what constitutes 'high-risk' AI applications under the AI Act.

False Dichotomy: 2/5

The article presents a somewhat simplified view by focusing primarily on the delay and the challenges faced by standardization bodies. It doesn't explore alternative methods of demonstrating AI compliance or the potential impact of the delay on different stakeholders.

Sustainable Development Goals

Industry, Innovation, and Infrastructure: Positive (Direct Relevance)

The delay in developing technical standards for the EU AI Act, while it initially seems negative, allows for more thorough development and ensures alignment with technological advancements and stakeholder consensus. This contributes to a more robust and effective regulatory framework for AI, fostering innovation within a safe and reliable environment. The Act itself promotes responsible development and deployment of AI, which can drive innovation across sectors.