ISO/IEC 23894 provides guidelines for managing risks associated with the development and use of AI systems.
ISO/IEC 23894:2023 applies the ISO 31000 risk management framework to the specific context of artificial intelligence. It addresses the technical, ethical, social, and governance risks inherent in AI systems throughout their lifecycle.
ISO/IEC 23894 is not certifiable. It is a guidance standard that complements ISO/IEC 42001 (which is certifiable): it supplies risk management methodology, not auditable requirements.
The standard provides a structured framework for meeting the risk assessment obligations the EU AI Act imposes on high-risk AI systems.