Independent AI governance assessment. System inventory, traceability, algorithmic risks, and evidence-based remediation plan. Not opinion: auditable evidence.

"Auditing AI is not about opining on algorithms. It is about verifying governance, risks, and traceability with evidence."Fernando Arrieta — Lead Auditor ISO/IEC 42001
Any organization that uses, develops, or contracts artificial intelligence systems and needs to demonstrate control, compliance, or accountability.
They need evidence that AI is governed: risks identified, controls implemented, and human oversight operational.
AI introduces new risk vectors: data access, vendor dependency, Shadow AI. They need a verifiable map.
Regulation (EU) 2024/1689 requires documentation, risk classification, and oversight. The audit generates compliance evidence.
AI policy, roles and responsibilities (RACI), human oversight, accountability, and alignment with organizational strategy.
Risk identification and assessment: bias, opacity, explainability, vendor dependency, impact on fundamental rights.
Training data quality, lineage, versioning, ownership, consent, anonymization, and access controls.
Production monitoring, incident management, rollback, logging, alerts, and escalation procedures.
AI vendor evaluation, contracts, SLAs, portability, dependency, and concentration risk.
PDCA cycle applied to AI: performance metrics, management review, corrective actions, and documented lessons learned.
ISO/IEC 42001:2023 — Artificial Intelligence Management System (AIMS). Requirements, controls, and Annex A control objectives. The international reference standard for auditable AI governance.
ISO/IEC 23894:2023 — AI risk management guidance. Complements ISO 31000 with specific criteria for algorithmic, data, and social impact risks.
Regulation (EU) 2024/1689 — European AI Regulation. Risk classification, documentation obligations, human oversight, and transparency. A global regulatory benchmark.
ISO/IEC 27001 + 27701 — Information security and privacy. AI does not operate in a vacuum: ISO 27001 access controls, encryption, and traceability are prerequisites for governed AI.
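The EU AI Act's risk classification mentioned above can be made concrete with a small sketch. This is illustrative only: the tier names follow Regulation (EU) 2024/1689, but the one-line obligations are heavily simplified summaries, and the `obligations_for` helper is a hypothetical name, not part of any standard tooling.

```python
# Simplified mapping of the EU AI Act (Regulation (EU) 2024/1689) risk tiers
# to their headline obligations. The real obligations are far more detailed;
# consult the Regulation's text for each tier.
RISK_TIERS = {
    "unacceptable": "prohibited practices (Art. 5)",
    "high": "conformity assessment, documentation, human oversight",
    "limited": "transparency obligations (e.g. disclosing AI interaction)",
    "minimal": "no mandatory obligations; voluntary codes of conduct",
}

def obligations_for(tier: str) -> str:
    """Return the headline obligation for a risk tier, or raise if unknown."""
    if tier not in RISK_TIERS:
        raise ValueError(f"unknown risk tier: {tier}")
    return RISK_TIERS[tier]

print(obligations_for("high"))  # → conformity assessment, documentation, human oversight
```

In an audit context, a mapping like this is typically the starting point for classifying each inventoried system and deriving the documentation it must produce.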
Each deliverable is a governance tool, not a document to file away.
Complete registry of all AI systems in use: purpose, data, vendors, owners, and risk level.
Risk assessment per system: bias, opacity, vendor dependency, data quality, and regulatory exposure.
Findings prioritized by severity with documentary evidence, evaluation criteria, and remediation timelines.
Summary for leadership and boards: governance status, critical risks, and recommended actions.
Prioritized action plan with owners, timelines, resources, and closure verification criteria.
AI management system evolution roadmap: from current state to ISO/IEC 42001 conformity.
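The first deliverable above, the AI systems inventory, can be sketched as a minimal record structure. The field names here are illustrative assumptions, not prescribed by ISO/IEC 42001; a real registry would carry more attributes (data lineage, contract references, review dates).

```python
from dataclasses import dataclass

# Hypothetical sketch of one AI-system inventory record. Field names are
# illustrative; ISO/IEC 42001 does not mandate a specific schema.
@dataclass
class AISystemRecord:
    name: str
    purpose: str
    data_categories: list   # e.g. ["customer PII", "transaction logs"]
    vendor: str             # "internal" for in-house systems
    owner: str              # accountable role, not an individual's name
    risk_level: str         # e.g. "minimal" | "limited" | "high"

inventory = [
    AISystemRecord(
        name="credit-scoring-v2",
        purpose="Loan application pre-screening",
        data_categories=["customer PII", "credit history"],
        vendor="internal",
        owner="Head of Risk",
        risk_level="high",
    ),
]

# A simple governance check: every record must name an owner and a risk level.
ungoverned = [s.name for s in inventory if not (s.owner and s.risk_level)]
print(ungoverned)  # → []
```

Even this minimal shape makes the audit question testable: any system without an accountable owner or an assigned risk level surfaces immediately as a finding.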
Take this quick assessment to understand your organization's exposure to AI risks under ISO/IEC 42001 & EU AI Act.
It is a systematic and independent evaluation of the use of artificial intelligence in an organization. It reviews AI systems inventory, governance, algorithmic risks, data quality, bias, transparency, and controls. The primary reference framework is ISO/IEC 42001 (AIMS).
ISO/IEC 42001 establishes the requirements for an Artificial Intelligence Management System (AIMS). It defines how an organization should govern, manage risks, and continuously improve the use of AI. It is the international reference standard for auditable AI governance.
The European AI Regulation classifies systems by risk level and requires documentation, human oversight, transparency, and risk management. An audit under ISO/IEC 42001 generates evidence compatible with these regulatory requirements.
AI systems inventory, algorithmic risk map, findings matrix with severity, executive report for leadership, prioritized remediation backlog, and AIMS maturity roadmap.
Yes. The audit can function as a gap assessment: it evaluates the current state against ISO/IEC 42001 requirements and generates a risk-prioritized implementation roadmap.
The preliminary diagnosis is delivered within 72 business hours. A complete AI governance audit takes between 3 and 8 weeks, depending on scope, number of systems, and organizational maturity.
If your organization needs to assess its AI systems against ISO/IEC 42001, this is the channel to discuss scope and methodology. All inquiries are handled under confidentiality.
The consulting and implementation services described on this site are provided independently. Certification audits and decisions are the exclusive responsibility of accredited certification bodies. In accordance with ISO/IEC 17021-1 §5.2, impartiality restrictions and cooling-off periods apply.