Independent Audit of AI Systems

Building an Infrastructure of Trust for all autonomous systems that impact humans in the areas of Ethics, Bias, Privacy, Trust and Cybersecurity.


IAAIS is designed to build an infrastructure of trust into all artificial intelligence and autonomous systems which impact humans in the areas of Ethics, Bias, Privacy, Trust, and Cybersecurity. We build audit rules to codify law and establish best practices that mitigate risks to humans. 

Proportionality and a risk-based approach are built into the process, allowing small and medium enterprises (SMEs) and other lower-risk processors to benefit from a risk-adjusted approach to compliance. This ensures that excessive compliance requirements do not bar access to the market. However, as the law applies to a company, so will the audit rules.


IAAIS aims to cover all algorithms, artificial intelligence, autonomous systems, and data processing in the public and private sectors which impact humans in any of the following areas: Ethics, Bias, Privacy, Trust, and Cybersecurity. The audit rules are applicable to both public and private entities, but the comprehensive process is targeted at publicly traded companies. It is understood that many public sector applications may be sourced from the private sector.

IAAIS has compliance elements that require engagement from management across the whole organisation. While individual audits may be performed on single systems, certain compliance elements apply equally to a single-system audit and to an exchange-listed entity.

Code of Conduct

Read our Code of Conduct.

ForHumanity is a mission-driven 501(c)(3) registered Public Charity. We are developing Independent Audit of AI Systems through a crowd-sourced, iterative, and collaborative process. That requires collegial behavior and a reasonable code of conduct.

Any person participating in the process may be excluded at the sole discretion of the Executive Director, subject to appeal to the Executive Director and/or Board of Directors.

ForHumanity Fellows exist to guide discussions and help contributors integrate into the community. The Fellows are volunteers who have demonstrated an exceptional commitment to furthering the efforts of Independent Audit of AI Systems.

Unbecoming behavior of a Contributor includes:

o Failure to deliver on a promised action in a timely manner.

o Slurs of gender, ability, race, sexual orientation, national affiliation, or any other identity or Protected Category.

o Inappropriate or offensive language or images.

o Failure to yield – many people may want to be included in the process; participants must be considerate of their peers.

o Failure to be considerate – mocking or demeaning employees of ForHumanity, Fellows, or Contributors.

Fellows and Contributors agree to donate their time, efforts and editorial comments to ForHumanity, unless otherwise contracted. ForHumanity will use that Intellectual Property to further the mission of Independent Audit of AI Systems and maintain the audits once established. This includes licensing the Audit rules and standards to auditors and other entities engaged in the business of satisfying audit compliance through consulting, systems and technology.


Audit Rules


IAAIS Audit Rules have the following characteristics:

[Figure: ForHumanity Audit Rules]

These characteristics are vital for several reasons. Ambiguous audit criteria encourage auditors to take a more risk-averse approach and presume noncompliance when faced with non-binary choices. Good audit rules must give the auditor binary criteria, such that each element is either compliant or not compliant. The Auditor remains liable for the final report, which will either certify compliance or indicate noncompliance. No entity can be certified by an Auditor as partially compliant.

All of these rules must be implementable. Industry can feed into the creation of the rules to ensure that they can be followed. In fact, these rules will likely be built into systems over time, enabling compliance-by-design.

Understanding SHALL/SHOULD/MAY

SHALL – is a MUST, a requirement. There is no compliance without sufficient satisfaction of the requirements of the criterion. A criterion is a SHALL because it is a legal requirement, a regulatory requirement, or a non-negotiable imperative for the protection of individuals or the management/mitigation of a risk to individuals, and compliance has been determined to be feasible. Strictly from a risk perspective, failure to comply with a SHALL criterion unequivocally exposes the organisation to risk.

SHOULD – is a recommendation. It is within the power and judgment of an organisation to decide if it will comply or not. However, SHOULD identifies the recommended option; therefore, if the organisation chooses not to comply, it must recognize and acknowledge that a risk is present and has been accepted. Audit compliance for a SHOULD statement can therefore take one of two forms: either documented compliance with the SHOULD statement, or documented acceptance of the risk taken and “why” the risk was tolerable and non-compliance with the criterion accepted. Strictly from a risk perspective, the choice not to comply with a SHOULD statement likely exposes the organisation to risk, but the organisation may determine the subsequent risk to be tolerable, unlikely to occur, or mitigated in some other fashion. This subsequent risk assessment should be documented.

MAY – is a choice without prejudice to the options. It has been determined that compliance or non-compliance with the criterion is, by itself, inherently neither positive nor negative for humanity. MAY statements will often lead to documented risks that trigger further compliance requirements based upon the choice. MAY statements exist to clarify for the organisation that it does, in fact, have a choice. For audit compliance purposes, the target of evaluation should document the choice it makes, and this documentation should also reflect the pros and cons of the choice. Audit compliance is satisfied by this documentation. The choice made in response to a MAY question does NOT mean there is no inherent risk; both choices likely carry risks, so regardless of the choice, the organisation will need to manage the risks stemming from it.
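The SHALL/SHOULD/MAY semantics above can be sketched in code. The following is a minimal illustrative model, not part of the IAAIS rules themselves; the names `Level`, `Criterion`, `criterion_passes`, and `audit` are hypothetical and chosen only to show how each requirement level maps to a binary pass/fail outcome, and how certification remains all-or-nothing.

```python
from dataclasses import dataclass
from enum import Enum


class Level(Enum):
    SHALL = "SHALL"    # mandatory: non-compliance fails the criterion outright
    SHOULD = "SHOULD"  # recommended: non-compliance needs documented risk acceptance
    MAY = "MAY"        # optional: the choice itself must be documented either way


@dataclass
class Criterion:
    level: Level
    complied: bool
    risk_documented: bool = False  # documented risk acceptance or choice rationale


def criterion_passes(c: Criterion) -> bool:
    """Return True if a single criterion satisfies audit compliance."""
    if c.level is Level.SHALL:
        # No alternative path: the requirement must be met.
        return c.complied
    if c.level is Level.SHOULD:
        # Either comply, or document why the accepted risk is tolerable.
        return c.complied or c.risk_documented
    # MAY: neither choice is inherently right; documentation of the
    # choice (and its pros and cons) is what satisfies the audit.
    return c.risk_documented


def audit(criteria: list[Criterion]) -> str:
    # Certification is binary: compliant only if every criterion passes.
    # There is no "partially compliant" outcome.
    return "compliant" if all(criterion_passes(c) for c in criteria) else "noncompliant"
```

For example, an organisation that skips a SHOULD criterion but documents its risk acceptance still passes that criterion, whereas any unmet SHALL makes the entire audit noncompliant.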


Anyone may join ForHumanity. When someone joins ForHumanity, they are given access to the Audit Rules for critique and comment. The Audit Rules may not be used by an individual for commercial purposes without a proper license.

Any member of the ForHumanity community may comment on or critique, word by word, any audit question, definition, audit backup, or explanatory note. The crowd may also suggest additional audit questions via comment or critique. ForHumanity will consider all suggestions and make changes on an as-needed basis until the audit is “locked”. See below for an explanation of a “locked” audit.

Comments and critiques are registered publicly and individuals are encouraged to seek support for improvements that they believe will benefit humanity.