ISO/IEC 42005
Information technology — Artificial intelligence — AI system impact assessment
Edition 1
Status: Under publication — final production steps (up to seven weeks).

What is ISO/IEC 42005?

ISO/IEC 42005 provides guidance for organisations conducting AI system impact assessments. These assessments focus on understanding how AI systems — and their foreseeable applications — may affect individuals, groups, or society at large. The standard supports transparency, accountability and trust in AI by helping organisations identify, evaluate and document potential impacts throughout the AI system lifecycle.

Why is ISO/IEC 42005 important?

AI technologies are rapidly reshaping industries, economies and daily life — offering immense benefits, but also raising ethical, social and environmental concerns. ISO/IEC 42005 plays a crucial role in ensuring these impacts are responsibly addressed. By guiding organisations through structured impact assessments, it enables them to align AI development with values such as fairness, safety, and human-centred design. It also supports broader governance and risk management practices, reinforcing trust and societal acceptance of AI systems.

Benefits

  • Strengthens stakeholder trust through transparent impact documentation
  • Supports responsible innovation by addressing social and ethical risks
  • Enhances alignment with governance, risk and compliance frameworks
  • Improves internal decision-making and accountability across the AI lifecycle
  • Encourages consistency and clarity in AI-related impact reporting


FAQ

Who is ISO/IEC 42005 for?

Any organisation developing, providing or using AI systems — regardless of sector or size — that wants to assess and manage the potential impacts of its AI systems on people and society.

How does it relate to other AI standards?

It complements standards such as ISO/IEC 42001 (AI management systems), ISO/IEC 38507 (AI governance) and ISO/IEC 23894 (AI risk management) by focusing specifically on the societal and human impacts of AI.

When should an impact assessment be performed?

The standard recommends performing assessments throughout the AI system lifecycle — from design and development to deployment and post-market monitoring — and updating them as needed.

General information

  •  : Under development
    : International Standard under publication [60.00]
  •  : 1
  • ISO/IEC JTC 1/SC 42
    35.020 
  • RSS updates
