KS OWNER – Alice Rangel Teixeira & Sundar Narayanan
KS DEPUTY –
LAST UPDATED – May 2023
Committee Governance Assessment
The Committee Governance Assessment (CGA) is an overarching review of all interactions between risk management and compliance functions, ensuring that there are no gaps or silos in risk mitigation or regulatory requirements. The assessment examines the interaction between committees to ensure that each one receives all the knowledge and outputs required for its own compliance and for overall compliance with the Independent Audit of AI Systems criteria. The Committee Governance Assessment is the primary oversight function: it reviews the second line of defence and is conducted by the third line of defence in the ForHumanity Risk Management framework.
ForHumanity’s Risk Management Framework covers Governance, Risk Management and Compliance with Ethics, Bias, Privacy, Trust and Cybersecurity as key pillars.
This Framework and its processes have two primary objectives:
Maximising the mitigation of risk
Fairly, accurately and transparently displaying Residual Risk for natural persons impacted by AI, algorithmic and autonomous systems.
AI, algorithmic, and autonomous (AAA) systems are socio-technical systems that present many risk categories beyond traditional enterprise ones. These systems impact individuals, society, and the environment in different ways, creating unique, specialised, and multidisciplinary challenges.
These challenges are addressed in the operational process of Risk Management by duly designated teams of experts who are trained in understanding specific and multidisciplinary risks. Each duly designated team forms a committee with a specific role and expertise.
The operational process is localised in the Functional Risk Management of ForHumanity’s Risk Management Framework and presents three layers of defence to maximise risk mitigation. Committee Governance sits in the third line of defence and is the second-to-last stage, coming before the Internal Audit.
The Committee Governance Assessment (CGA) is designed to examine the interrelationships between committees, experts, specialty committees, and front-line design, development and data science teams for AI, Algorithmic and Autonomous (AAA) Systems. The report is produced by Operational Risk Management, which reviews and has oversight of the committees that are formed, and it is included in the comprehensive AI Risk Evaluation (cAIRE) report, which collects all outputs from the aforementioned analyses along with Residual Risks.
Each of the committees and associated risk assessments is insular, meaning it is self-contained within the governance and oversight structures. By design, this arrangement limits the risk of missed assignments or miscommunications about accountability and responsibility. However, interrelationships are unavoidable in multidisciplinary, socio-technical systems such as AAA systems. The CGA is designed to examine, analyse, track and record the interfaces between committees.
A critical objective of the CGA is to fill all gaps and eliminate miscommunications in the management of risk for AAA systems. To achieve this, the report covers the following:
Logging all duties, responsibilities and accountabilities for a specific AAA System and keeping the log current
Tracking all Duty Designation Letters
Tracking all specialty committees
Identifying gaps, inconsistencies or issues in the alignment to the mandates for the specific committees/duty designation letters
Identifying all audit criteria that transit from one committee or duly designated officer to another committee or duly designated officer
Identifying cross-communications, sharing of risk inputs, and consultations with specific committees (including the Ethics Committee or the Children’s Data Oversight Committee), and gaps that exist in such communications or interactions
Considering, through consultation between the third line of defence and the Ethics Committee, inadequacies associated with the diversity or inclusion of adequately and appropriately skilled resources in the committees
Validating the existence of an adequate process to ensure that all risk treatments and Residual Risks, regardless of the source risk assessments, are logged, deployed and collected for the cAIRE report.
Identifying Residual Risk per organisation procedures and then, if duly designated, examining and analysing external risk treatment options and deploying mitigations documented with Traceability
Identifying any process gaps, inefficiencies or delays in remedial actions or Residual Risk management on the part of committees arising from the process
Validating the existence of an adequate procedure for adjudicating interactions and potential conflicts between two or more specialty committees (e.g., Algorithmic Risk Committee and the Ethics Committee)
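As an illustration only, the logging and gap-identification duties above could be backed by a simple record structure. This is a minimal sketch under our own assumptions: the class names (`DutyDesignation`, `CommitteeInterface`, `CGARegistry`) and fields are hypothetical and are not part of the ForHumanity framework or any certified tooling.

```python
# Hypothetical sketch of a CGA-style registry: track Duty Designation
# Letters and the audit criteria that transit between committees, and
# surface handoffs that were never acknowledged (candidate gaps).
# All names and fields are illustrative assumptions, not ForHumanity artefacts.
from dataclasses import dataclass, field


@dataclass
class DutyDesignation:
    officer: str      # duly designated officer or committee
    duty: str         # duty, responsibility or accountability
    aaa_system: str   # the specific AAA system covered


@dataclass
class CommitteeInterface:
    source: str                 # committee handing off an audit criterion
    target: str                 # committee receiving it
    criterion: str              # the audit criterion that transits
    acknowledged: bool = False  # has the target confirmed receipt?


@dataclass
class CGARegistry:
    designations: list = field(default_factory=list)
    interfaces: list = field(default_factory=list)

    def unacknowledged_gaps(self):
        """Return interfaces whose receiving committee has not confirmed
        receipt -- candidate communication gaps for the CGA to examine."""
        return [i for i in self.interfaces if not i.acknowledged]
```

For example, logging a handoff from the Algorithmic Risk Committee to the Ethics Committee and then calling `unacknowledged_gaps()` would flag that handoff until the Ethics Committee confirms receipt; the CGA report would record such flags with Traceability.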
Linked Knowledge Stores and Content