The Information Commissioner’s Office (ICO) called for a UK GDPR certification scheme; ForHumanity answered the call.


ForHumanity designed this certification scheme for data processors, controllers, and joint controllers (Auditees) of any size. It may be applied to one or more specific data controllers/joint-controllers/processors, including, but not limited to, all algorithmic systems, artificial intelligence, or autonomous systems when Personal Data is involved. It can cover any system, data controller, or data-processing instance where Personal Data is present. AI systems and autonomous systems that do NOT process Personal Data fall outside the scope of this certification scheme, as do the exceptions noted in the DPA 2018. Auditors may apply the scheme, as agreed by contract, to one or more data-processing systems. Please see Independent Audit of AI Systems USER GUIDE v1.2 for more details on auditors, auditees, and the creation and management of the audit rules and certification schemes.


The Auditee determines the system(s) to which the Auditor will apply the scheme and memorialises this agreement via contract. The Target of Evaluation (ToE) shall be defined by that contract between the Auditor and the organisation (Auditee). The Auditee shall contract for any or all data controller/joint-controller/processor ToEs that are in scope, including, but not limited to, algorithmic systems, artificial intelligence, or autonomous systems that include Personal Data.


The Auditor will perform an audit only to the breadth and scope of the ToE determined by the contract with the Auditee. The Auditee bears the responsibility of ensuring that all necessary systems undergo an Audit. The contract must specify one of the following:

  1. For an individual system certification — the Auditee shall identify the ToE by name/identifier, specifically noting the boundaries of the system. Associated systems or adjacent processing must be clearly delineated as “in scope” or “out of scope” by the contract and have defined beginnings and endings. In-scope functions can include Data Processors acting on behalf of a Data Controller.
  2. For multiple named systems — the same criteria as #1 apply to each system, or to the sum of the combined systems, and the Auditee must delineate the beginnings and endings.
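As a purely illustrative sketch (the field names below are hypothetical and not prescribed by the scheme or by any contract template), a ToE delineation of the kind described above might be recorded like this:

```python
# Hypothetical sketch of how an Auditee might delineate a ToE in the
# audit contract. All names and values here are illustrative only.
toe = {
    "name": "CustomerRecommendationEngine",   # name/identifier of the system
    "role": "data controller",                # controller, joint-controller, or processor
    "in_scope": [
        "recommendation model and its training pipeline",
        "third-party Data Processor acting on the controller's behalf",
    ],
    "out_of_scope": [
        "adjacent web-analytics system (separately contracted)",
    ],
    # Defined beginnings and endings of the processing under evaluation.
    "boundaries": {
        "begins": "ingestion of Personal Data",
        "ends": "deletion from production stores",
    },
}
```

The point of the sketch is only that each system is named, its role stated, adjacent processing explicitly placed in or out of scope, and its beginning and ending defined.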


A firm can demonstrate compliance with UK GDPR through a UK GDPR Certification scheme. Accordingly, only firms that process Personal Data will undertake the rigorous process of achieving a UK GDPR Certification.


There are certain audit criteria that apply only to Data Controllers. As a result, an organisation that functions solely as a Data Processor will be unable to comply with the audit criteria specified for Data Controllers; this will not prevent it from achieving certification. Data Processors need not comply with criteria that apply only to Data Controllers and are marked “Controller-only”. A Data Controller or Joint-Controller must comply with all criteria for certification.


Data controllers/joint-controllers/processors that include the Personal Data of children in their systems may be in scope for this UK GDPR certification scheme insofar as Article 8 has compliance criteria related to consent. However, with the passage of the Age Appropriate Design Code (Children’s Code) and the introduction of ForHumanity’s Children’s Code Certification (FH3C) Scheme, we anticipate that most, if not all, data controllers/joint-controllers/processors who operate with children’s data in the UK, or relating to UK Data Subjects who are children, will need to comply with this certification scheme AND FH3C.

Accountability and Trust Lead to Governance and Oversight

A critical aspect of ForHumanity’s certification scheme is robust governance, oversight, and accountability at the Officer and Board of Directors level to ensure trust. It is not sufficient to require “the organisation” to comply with aspects of this certification scheme. Simply requiring an organisation to comply with certification criteria, without defining specific accountability in governance structures, creates ambiguity regarding delegation of authority and/or segregation of duties. Further, organisational compliance alone does not eliminate conflicts of interest, and its general nature could offer ‘cover’ to persons designing, developing, and maintaining the data controlling/joint-controlling/processing system without the requisite experience to satisfy all criteria. The same ambiguity is often the excuse offered when non-compliance occurs and blame is shifted. Clarity of responsibility and accountability are hallmarks of ForHumanity certification schemes.


The certification scheme requires the establishment of an Algorithmic Risk Committee, an Ethics Committee, and a Children’s Data Oversight Committee to ensure conflict-free, objective decision-making. This approach mitigates any individual gaps in expertise or training. Examples of expertise include: 1) awareness of instances of Ethical Choice; 2) discipline to execute Ethical Choice; 3) awareness of the special needs of children; 4) bias mitigation techniques, including for cognitive biases; and 5) establishment of Key Performance Indicators to manage Concept Drift. This design addresses the expanse of skills needed to effectively manage risk in data controller/joint-controller/data processor work, and acts as a check on the analysis, outputs, and/or decisions rendered, including by algorithmic systems, artificial intelligence, or autonomous systems.


The Board endorses the formation of these committees and holds them accountable for systemic failure in the organisation, which enhances the likelihood of robust systems of compliance. Liability at the CEO level, to whom each of these committees reports, ensures responsible structural compliance across the entire organisation. The result is a culture of compliance – from design to decommission – with clearly delineated responsibilities, reporting lines, and final accountability that increases transparency and advances governance.


The most tangible expression of a changed culture can be measured in resource allocation.  Resource allocation (time, investment, expertise) is a strong signal to both internal and external stakeholders that a culture of governance, oversight and accountability is valued by the Board and Officers.  The committee structure, a proven model of risk management, further ensures that checks and balances exist when these critical decisions and reviews are executed. Committees are responsible for having trained members and appropriate documentation to meet the demands of the audit criteria.


Our criteria also suggest a Data Control Committee (DCC), in line with the aforementioned benefits of committees: increased accountability and mitigation of the risks associated with a single point of failure and an excess burden of responsibility. While this scheme recognises that the UK GDPR calls for a Data Protection Officer (DPO), we encourage a higher form of governance, represented by a DCC led by the DPO, in order to comply with the law. The designation of DPO must be assigned to an individual, and that individual may be a Chief Data Officer.

Controllability of AI and Autonomous Systems

ForHumanity audit criteria are designed to ensure human agency is embedded in all algorithmic systems, artificial intelligence, and autonomous systems. Control is an important building block of trust. Any system that is out of human control will never earn trust; it should be considered a systemic risk and decommissioned.


Specifically, in the context of GDPR, entities and operators need to be able to control a system in order to prevent it from processing data, or to ensure that it deletes Personal Data as requested. Any degree of autonomy that does not support this cannot be certified.
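As a minimal sketch only (the class and method names are hypothetical, not part of the certification scheme), the two controls described above — a human-operated halt on processing, and erasure of Personal Data on request — could look like this:

```python
class ControllableProcessor:
    """Hypothetical processing system exposing the two human controls
    discussed above: halting processing and honouring erasure requests."""

    def __init__(self):
        self.halted = False
        self.store = {}  # subject_id -> Personal Data records

    def halt(self):
        # A human operator can stop all further processing at any time.
        self.halted = True

    def process(self, subject_id, data):
        # Processing is refused once the system has been halted.
        if self.halted:
            raise RuntimeError("processing halted by operator")
        self.store[subject_id] = data

    def erase(self, subject_id):
        # Honour a data subject's erasure request by deleting their data.
        self.store.pop(subject_id, None)
```

A system that offered no equivalent of `halt()` or `erase()` — however those controls are actually implemented — would exhibit exactly the degree of autonomy the scheme declines to certify.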