Age Assurance

We welcome all feedback and recommendations for improvement.

ICO Opinion on Age Assurance (10/2021)

On 14 October 2021, the Information Commissioner’s Office (ICO) issued an Opinion on Standard 3 (Age-Appropriate Application) of the Code, specifically relating to Age Assurance. The Opinion applies to in-scope ISS and to Age Assurance Providers (collectively referred to as “Organisations”) of products, services and applications that covered ISS may use to comply with the Code.

This Opinion aims to provide the Commissioner’s current view on three (3) key challenges expressed by stakeholders arising from the operationalisation of Age Assurance:

  1. the levels of risk arising from different types of data processing and the commensurate level of age certainty required to identify Child Service Users and mitigate the risks;

  2. the level of certainty that various age assurance solutions provide, and confirmation of which providers or types of solutions comply with data protection requirements; and

  3. how to collect the additional personal data required for age assurance while complying with the data minimisation principle.

The ICO currently expects ISS to meet the Code’s Age-Appropriate Application standard, including conformance with the seven (7) principles of the UK GDPR, and outlines an approach to applying Age Assurance that is appropriate and proportionate to an Organisation’s context and use of Children’s personal data.

Clarity is provided on children’s risk levels and risk criteria:

  • HIGH RISK arises from ISS processing of personal data beyond the delivery of its core services which is likely to place the rights and freedoms of children at high risk.

  • MEDIUM or LOW RISK arises from ISS processing of personal data beyond the delivery of its core services which is likely to place the rights and freedoms of children at medium or low risk.

The Commissioner expects continued organisational maturity in the adoption and conformance of Age-Appropriate Application of the Code, including:

  • evidencing and recording the assessment of risks and decisions taken, including on the age-appropriate application standard;

  • ensuring accountability for decisions taken;

  • enabling organisations to demonstrate their approach, even if it is evolving; and

  • creating evidence the ICO can consider if a complaint is brought or a violation otherwise comes to its attention.

An advantage of utilising appropriately certified (e.g. UKAS-accredited) Age Assurance Providers is that Age Assurance can be determined without the ISS having to collect additional personal data solely for this purpose. For example, the Age Check Certification Scheme (ACCS) provides an independent check against the current industry standard, PAS 1296:2018.

There is little evidence of the effectiveness and accuracy of current Age Estimation techniques and products, most of which use Artificial Intelligence (AI) algorithms to automate the interpretation of data. Because these techniques are often coupled with biometric data, voice analysis, hand geometry, natural language processing (NLP), behavioural analysis and profiling, ISS using Age Estimation are likely to be classified as HIGH RISK at this nascent stage.

Continued market stimulation for the development of innovative Age Assurance methods and Age Estimation techniques underpinned by Data Protection by Design and Default (DPbDD) principles is planned, as is approval of certification schemes to facilitate Code compliance.

Additional Age Assurance guidance is provided to 1) facilitate Organisational determination of applicability (Annex 1), 2) inform Organisations on current and emerging approaches (Annex 2), and 3) provide a summary of expected economic costs and benefits (Annex 3).

The Commissioner is scheduled to review the Code in September 2022 and will continue to engage with key stakeholders, including Ofcom, which has assumed the role of regulator for video sharing platforms (VSPs) and will be the future regulator of online safety.

Information about emerging Age Assurance techniques and Age Estimation approaches that evidence preservation of the intended ISS user experience (UX), while facilitating optimal solutions and protections for children, is welcome and encouraged.

Although this IAAIS Supplemental Guidance is limited to the Code, this Opinion relates more broadly to:  

A final note—

“Action will be taken in the event that personal data is misused under the guise of or during processing for Age Assurance.” (p. 9)

Key Definitions

AAA System is any artificial intelligence (AI), algorithmic or autonomous system, or hybrid thereof, defined as the Target of Evaluation (ToE) referenced in the contract between the ISS and the Board Certified Independent Auditor of AI Systems.

Account Confirmation is a process that relies on the account holder, typically a parent, carer or legal guardian, to confirm that a Service User is over or under 18, or to confirm the user’s age. The ISS can then provide the user with an age-appropriate version of the ISS application.

Age is the period of time someone has been alive, or something has existed.

Age-Appropriate Application is a risk-based approach to recognising the age of individual users and ensuring you effectively apply the standards in this code to Child Service Users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing, or apply the standards in this code to all of your users instead.

Age Assurance refers collectively to approaches used to provide Assurance that children are unable to access adult, harmful or otherwise inappropriate content when using an ISS, and to estimate or establish the age of the Service User so that the ISS can be tailored to their needs with protections appropriate to their Age. In some cases, Age Guidance or Age Restrictions may be lower than 18; Age Assurance can still be applied.

Age Assurance Providers are third-party providers of Age Assurance products, services and applications that ISS may use to conform with the CC.

Age Certainty is a method of establishing age in a manner appropriate and proportionate to the risks that arise from your data processing. The CC is not prescriptive about exactly what methods you should use, nor what level of certainty different methods provide.

Age Estimation (AE) is a process that establishes that a Service User is likely to be of a certain age, to fall within an age range, or to be over or under a certain age. AE methods include automated analysis of behavioural and environmental data; comparing the way a Service User interacts with a device or with other Service Users of the same age; metrics derived from motion analysis; or testing the user’s capacity or knowledge.

Age Gating is a technical measure used to restrict or block access to an ISS by Service Users who do not meet its Age Guidance or legal Age Restrictions.

Age Guidance is a classification process and system of audiovisual content advice to help children and families choose what’s right for them and avoid what’s not. Developed by the British Board of Film Classification (BBFC), certain age guidance also includes Age Restrictions. 

Age-Verification (AV) is a process of determining a person’s Age by checking against trusted, verifiable records of data, particularly when a high level of certainty is required.

Assurance is a positive declaration or promise intended to give confidence. In this context, the word ‘assurance’ refers to the varying levels of certainty that different ISS offer in establishing an age or age range.

Child or Young Person means every human being below the age of eighteen (18) years unless under the law applicable to the child, majority is attained earlier.

Children’s Code Harms Framework is a tool developed by the ICO to enable ISS to reflect on, consider and take future action on risks and harms that sit at the nexus of service design, children’s data processing, and legal standards within scope of the CC.

Child Service Users are children who use the ISS.

Connected Toys and Devices are physical products that are connected by Internet protocol(s) or network connectivity and often considered part of the Internet of Things (IoT). If you provide a connected toy or device, ensure you include effective tools to enable conformance to this code.

Data Protection by Design and Default (DPbDD) is a legal requirement under the UK GDPR that underscores the fundamental data protection principles of data minimisation and purpose limitation. Data protection by design considers privacy and data protection issues at the design phase of any system, service, product or process and then throughout the lifecycle. Data protection by default limits the processing of data to that which is necessary to achieve the specific purpose of the ISS. DPbDD incorporates the seven (7) foundational principles expressed in Privacy by Design (PbD).

Data Protection Impact Assessment (DPIA) is a defined process to help you identify and minimise the data protection risks of your service – and in particular the specific risks to children who are likely to access your service which arise from your processing of their personal data.

Estimation is a guess or calculation about the cost, size, value or extent of something. 

Hard Identifiers in this context are ways in which you can confirm age using solutions which link back to formal identity documents, or ‘hard identifiers’, such as a passport.

Identification (ID) in this context is the process of establishing the identity of a Service User and is likely to include some form of AV.

Independent Audit of AI Systems (IAAIS) is a process whereby an outside third party, e.g. a Licensed ForHumanity Certified Auditor (FHCA), is charged with review of the AAA System deployed by the ISS to validate compliance against the subject certification scheme, e.g. the Children’s Code. The Independent Auditor’s charge is one of public interest, whereby the IAAIS is conducted in an objective, robust, binary and uninfluenced manner.

Information Society Services (ISS) means any ‘service’, that is to say, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.

Self-Declaration occurs when a Service User indicates their age without credible evidence. 

Service Users are people who receive services provided in the carrying on of a regulated activity. Please note: The regulations refer to ‘service users’ and where we quote the regulation directly we use this phrase. Elsewhere in the guidance we have used the terms ‘people who use services’ or ‘people’.

Target of Evaluation (ToE) is the processing operation that is defined to be certified. 

Verification is the act of verifying something (proving or checking that it exists or is true or correct).

AADC Age Assurance Background

The AADC is a statutory data protection code of practice and applies to providers of Information Society Services (ISS) likely to be accessed by children. This code is designed to protect and promote the best interests of the child, a concept from Article 3 of the United Nations Convention on the Rights of the Child (UNCRC).

“In all actions concerning children, whether undertaken by public or private social welfare institutions, courts of law, administrative authorities or legislative bodies, the best interests of the child shall be a primary consideration.”

ISS include, but are not limited to, mobile apps, online games, social media platforms, educational websites, search engines, online messaging or Internet based voice telephony services, online marketplaces, content streaming services (e.g. video, music or gaming services), and other websites or tools related thereto.

The code contains fifteen (15) flexible standards of age-appropriate design to ensure ISS are designed with the protection of children in mind. In the application of the code, a risk-based approach is utilised to protect children as they learn, explore and play as Service Users of ISS to reduce or eliminate harm children may encounter during their digital user experience (UX).

The accountability and governance obligations of ISS under the UK GDPR make Data Protection by Design and Default (DPbDD) a legal requirement. The seven (7) foundational principles of Privacy by Design (PbD) underpin any approach the ISS may take and include:

  1. ‘Proactive not reactive; preventative not remedial’

  2. ‘Privacy as the default setting’           

  3. ‘Privacy embedded into design’

  4. ‘Full functionality – positive sum, not zero sum’

  5. ‘End-to-end security – full lifecycle protection’

  6. ‘Visibility and transparency – keep it open’

  7. ‘Respect for user privacy – keep it user-centric’

One suggestion for operationalizing DPbDD involves developing guidelines for pragmatic implementation framed by a Data Protection Impact Assessment (DPIA) (AADC Standard 2). This frame can also be used to develop policies and procedures across your organisation in order to meet the DPbDD expectations of the ICO. 

Standard 3: Age-Appropriate Application requires ISS to apply the code in a manner appropriate to the age and developmental stages of children likely to access its services, including disabled children.

To this end, in the next section we offer an AADC Age Assurance Framework to assist ISS in the operationalization of this frame so as to achieve the best interests of the child while complying with the AADC. 

Reminder! Children have the same rights as adults under UK GDPR. Even if a child is too young to understand the implications of ISS impact to their rights and freedoms, they are nevertheless entitled to their rights and freedoms.

For your convenience, Age Assurance key concepts and their relationship to one another are summarised in Figure 1. 

Figure 1: Four Components of Age Assurance—Age Confirmation, Age Verification, Age Certainty, and Age Estimation.

 

AADC Age Assurance Framework

The AADC Age Assurance Framework (Framework) is designed to be flexible, scalable and to fit with existing ISS approaches and processes to managing risks. It contains the key elements of the AADC, as of the date of this publication.

The scope of the Framework is limited to Age Assurance governance and accountability per the AADC. The Framework assumes that the Standard 1: Best Interests of the Child assessment has already been performed with a positive determination as its outcome. A negative determination is not supported by this documentation.

AADC Standard 3: Age-Appropriate Application requires that ISS consider the age and developmental stage(s) of its Child Service Users, including consideration of disabled children, consent, risk and an appropriate level of age certainty.

This Framework operationalizes the DPIA (Standard 2) to provide ISS with a process of pragmatic implementation of AADC Age Assurance per the key priority areas in a five-step process, specifically:

  • Step One: Assessing ToE Risk Level

  • Step Two: Level of Age Certainty Required Based on ToE Risk Level

  • Step Three: Age Assurance Solutions by Level of Age Certainty

  • Step Four: Best Practices for Deployment of Age Assurance Solutions

  • Step Five: Independent Verification of Compliance

To ensure that an ISS likely to be accessed by children addresses the developmental needs of Child Service Users at a given Age, credible evidence must be documented that 1) children are unable to access adult, harmful or otherwise inappropriate content and 2) an age-appropriate ISS application has been applied.

Methods for Age Assurance

The ISS has two options for implementing age assurance:

  • Apply age assurance based on level of certainty proportionate to the risks in each age range and developmental stage; or

  • Apply the standards of the code to all users of the ISS uniformly, thereby meeting age assurance requirements by applying the highest standard of the AADC required for all users. (Please note this may not be a panacea and may still require the ISS to perform certain age-assurance requirements for compliance under the AADC.)

The AADC application and overview provides further explanation and detail on this method.

The levels of age assurance are:

  1. Age Verification (or Age Determination)

  2. Age Estimation

  3. Account Confirmation

  4. Self-Declaration

Methodology for Framework Adaptation

The ICO recommends building an Age Assurance Accountability Programme with standards and controls that have been vetted and widely accepted in industry. In addition, it is the recommendation of the ICO that an independent verification of compliance be completed to ensure the best interests of the child.

This framework is adapted utilising the following guidance, accepted standards and controls:

  1. ICO Opinion on Age Assurance for the Children’s Code (10/2021); 

  2. AADC Data Protection Impact Assessment (DPIA); and

  3. Age Check Certification Scheme (ACCS), PAS 1296:2018 Code of Practice for Online Age Verification Service Providers, developed by the British Standards Institution (BSI) and the Digital Policy Alliance.

Performing an AADC DPIA 

The UK GDPR outlines seven key processes for the completion of an Age-Appropriate DPIA (Standard 2). Ideally, a DPIA should begin early in the system development process, in particular, prior to data processing activities. 

The DPIA should not be a one-and-done exercise; rather, it is recommended that ISS return to the DPIA process on a periodic schedule, or prior to any change in purpose, features, data processing activities, or other modifications likely to present a risk to children not yet examined in the DPIA process.
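As a minimal sketch of that re-trigger rule (the field names and the annual cadence below are our illustrative assumptions, not ICO requirements):

```python
# Sketch: decide whether the DPIA should be revisited, either because the
# periodic review date has passed or because the service has changed in a
# way not yet examined by the last DPIA.

from datetime import date, timedelta

def dpia_refresh_due(last_dpia: date,
                     unexamined_changes: bool,
                     cadence: timedelta = timedelta(days=365)) -> bool:
    """True if the ISS should return to the DPIA process now."""
    return unexamined_changes or (date.today() - last_dpia) >= cadence

# e.g. a new feature added since the last assessment triggers a refresh:
print(dpia_refresh_due(date(2021, 11, 1), unexamined_changes=True))  # True
```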

A generic AADC DPIA template is included in Appendix A. Specific use case DPIA templates are available for online retail (Appendix C), mobile gaming apps (Appendix D), and connected toys (Appendix E). Certain processing activities may signal high-risk system potential, triggering the DPIA requirement, see Appendix B for examples and guidance.

ISS may need to perform separate DPIAs for each of the five age ranges and developmental stages as defined by the AADC (a minimal mapping sketch follows the list):

  • 0–5: pre-literate and early literacy

  • 6–9: core primary school years

  • 10–12: transition years

  • 13–15: early teens

  • 16–17: approaching adulthood
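For reference, the bands reduce to a small lookup. The sketch below is illustrative; the band boundaries come from the AADC list above, while the function name and the adult fallback are our own choices.

```python
# Hypothetical helper mapping an age to the AADC age range and
# developmental stage listed above. Bands are from the AADC; the
# function itself is an illustrative sketch, not a prescribed method.

AADC_STAGES = [
    (0, 5, "pre-literate and early literacy"),
    (6, 9, "core primary school years"),
    (10, 12, "transition years"),
    (13, 15, "early teens"),
    (16, 17, "approaching adulthood"),
]

def developmental_stage(age: int) -> str:
    """Return the AADC band for a child's age, or an adult fallback."""
    if age < 0:
        raise ValueError("age must be non-negative")
    for low, high, stage in AADC_STAGES:
        if low <= age <= high:
            return f"ages {low}-{high}: {stage}"
    return "adult (18+): child-specific standards do not apply"

print(developmental_stage(11))  # ages 10-12: transition years
```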

ALERTS

  • Organisations should evidence and record their assessment of risks and decisions taken, including how the age-appropriate application standard is applied, in their DPIAs.

  • This will ensure accountability for the decisions taken and enable organisations to demonstrate their approach, even if it is evolving. 

  • This may serve as evidence that the ICO can consider if a complaint is brought about an ISS or a related violation comes to the ICO’s attention.

Determining Risk Level and Age Assurance Expectations

It should be noted that ISS must first determine risk in relation to children’s rights and freedoms, i.e. centring the child in the context of their use of the ToE.

The ICO Opinion clarifies that risk should then be considered based on processing activities of personal data beyond the delivery of core services which are likely to place the rights and freedoms of children at high risk. Figure 2 provides a high-level summary.

Figure 2: Guidance on Risks to Children’s Rights and Freedoms 

High Risk

Risk criteria: ISS activities which are likely to result in high risk to children’s rights and freedoms. Processing of personal data beyond core services is considered likely to place the rights and freedoms of children at high risk. If any high risks cannot be mitigated, the ISS should consult the ICO prior to commencing the activities, in line with Article 36 of the UK GDPR.

Age Assurance expectations: Organisations should either:

  1. Apply all relevant code standards to all users to ensure risks to children are mitigated; or

  2. Introduce age assurance measures that give the highest possible level of certainty on the age of users.

This may take into account the products currently available in the market, and the potential risk to children.

Medium or Low Risk

Risk criteria: ISS activities which are likely to result in medium or low risks to children’s rights and freedoms.

Age Assurance expectations: Organisations should either:

  1. Apply all relevant code standards to all users to ensure risks to children are low; or

  2. Introduce age assurance measures that give a level of certainty on the age of child users that is proportionate to the potential risks to children.

Reproduced from Information Commissioner’s Opinion: Age Assurance for the Children’s Code (10/2021), p. 7.
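Figure 2 reduces to a small lookup. The sketch below is illustrative only; the function name and condensed phrasing are ours, not the ICO’s.

```python
# Condensed restatement of Figure 2 as a lookup table (illustrative).

EXPECTATIONS = {
    "high": ("apply all relevant code standards to all users, or introduce "
             "age assurance giving the highest possible level of certainty; "
             "consult the ICO (UK GDPR Article 36) if high risks cannot be "
             "mitigated"),
    "medium": ("apply all relevant code standards to all users, or introduce "
               "age assurance with certainty proportionate to the potential "
               "risks to children"),
}
EXPECTATIONS["low"] = EXPECTATIONS["medium"]  # same expectation per Figure 2

def age_assurance_expectation(risk_level: str) -> str:
    """Return the Figure 2 expectation for a 'low'/'medium'/'high' rating."""
    return EXPECTATIONS[risk_level.lower()]

print(age_assurance_expectation("High"))
```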

Step One: Assessing ToE Risk Level

For each of the five age ranges and developmental stages of children listed in Figure 3, ISS must examine the likelihood of harm, the severity of harm, and the overall potential for risk, with particular scrutiny of the impact on Children’s rights and freedoms per the outcome of the DPIA(s).

The Children’s Code Risk Assessment Toolkit provides a comprehensive guide for this examination, utilising the data protection risk statements contained in each of the fifteen (15) AADC standards, including the risk area, exemplar risk activities, and potential impact / harm to children. 

In addition, Figure 3 also serves as an overall risk matrix dashboard, providing ISS with a system-wide anticipated residual risk rating, a necessary prerequisite for Step Two: Level of Age Certainty Required Based on ToE Risk Level.

Figure 3: AADC Age Assurance Framework, Step One: Assessing ToE Risk Level for Each Age Range and Developmental Stage

The matrix columns, to be completed for each age range and developmental stage, are:

  • Type of potential harm(s), if any (Children’s Rights & Freedoms)

  • Likelihood of harm? (low, medium or high)

  • Severity of harm? (low, medium or high)

  • Mitigating Risk Factors (practical steps to reduce harm)

  • Inherent Risk Rating (low, medium or high)

  • Anticipated Residual Risk Rating (after mitigation action(s))

The matrix rows are the five age ranges and developmental stages:

  • Ages 0–5: pre-literate and early literacy

  • Ages 6–9: core primary school years

  • Ages 10–12: transition years

  • Ages 13–15: early teens

  • Ages 16–17: approaching adulthood

Adapted from IEEE Levels of Age Assurance, the ICO Guide to Data Protection: ICO Codes of Practice, Age appropriate design: a code of practice for online services, Annex B: Age and developmental stages definitions, and the Children’s Code Risk Assessment Toolkit
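The AADC does not prescribe a formula for combining the Figure 3 columns. As one illustrative reading, the sketch below takes the worse of likelihood and severity as the inherent rating and assumes that effective mitigating steps reduce it by one level; both rules are our assumptions, not ICO guidance.

```python
# Illustrative scoring for one row of the Figure 3 matrix.

LEVELS = ["low", "medium", "high"]

def inherent_risk(likelihood: str, severity: str) -> str:
    """Conservative rule: inherent risk is the worse of the two inputs."""
    return LEVELS[max(LEVELS.index(likelihood), LEVELS.index(severity))]

def residual_risk(inherent: str, mitigations_effective: bool) -> str:
    """Assumed rule: effective mitigation lowers the rating by one level."""
    idx = LEVELS.index(inherent)
    return LEVELS[max(0, idx - 1)] if mitigations_effective else inherent

# Example row for ages 10-12 (transition years):
rating = inherent_risk(likelihood="medium", severity="high")  # 'high'
print(residual_risk(rating, mitigations_effective=True))      # 'medium'
```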

In addition, ISS must also consider risk in the context of its governance, oversight and accountability capacity so as to ensure that it:

  • Meets the child’s needs as they change over time; 

  • Views the AADC as an innovation challenge in addition to just a legal obligation; and

  • Incorporates AADC as part of its ongoing lifecycle, including R&D and new features.

 

Presently, there is not a one-size-fits-all rating system. However, the ICO provides additional guidance on identifying impact to children’s rights and quantifying magnitude of impact.

Step Two: Level of Age Certainty Required Based on ToE Risk Level

Building upon the residual risk rating for each age range and developmental stage, as determined in Step One, the ISS should next identify the appropriate level of Age Certainty required for the ISS.

Categories of Age Assurance by Level of Certainty Required

When determining the level of certainty required as a function of the anticipated residual risk of ISS, consider assessing solutions in the dimensions of:

  • Accuracy: Is the age determined by the verification process exact, or is it an estimation?  If it is an estimate, how wide is the margin for error?

  • Authenticity: Does the proof of age belong to the person who is claiming it?

  • Currency: How recently was the age verified? You may think – “but people don’t get any younger” – but outdated checks may have been conducted when technology was less accurate.

  • Reliability: Different sources of evidence offer varying levels of reliability – a passport may be close to 100% reliable (if it has been authenticated) while a students’ union card may offer less confidence.

Based on this assessment, four (4) levels of Age Certainty are produced for each of the three risk categories (low, medium, or high):

  • Basic Age Certainty;

  • Standard Age Certainty;

  • Enhanced Age Certainty; and

  • Strict Age Certainty.

Each of these categories of Age Certainty is generally applied with a measure of frequency, for example one-time or every time (i.e. persistently) the ISS is accessed by the Child Service User.
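As a minimal sketch of how the four assessment dimensions above might be rolled up into one of these categories: the 0–3 scale and thresholds below are invented for illustration, and neither the ICO nor PAS 1296:2018 prescribes them. Strict Age Certainty is omitted because, as noted below, it is not a stand-alone category.

```python
# Illustrative roll-up of accuracy, authenticity, currency and reliability
# into a certainty category. The scale and cut-offs are assumptions.

from dataclasses import dataclass

@dataclass
class SolutionAssessment:
    accuracy: int      # 0-3: exact age (3) vs. wide estimation margin (0)
    authenticity: int  # 0-3: proof of age belongs to the claimant
    currency: int      # 0-3: how recently the age was verified
    reliability: int   # 0-3: trustworthiness of the evidence source

    def certainty_level(self) -> str:
        score = (self.accuracy + self.authenticity
                 + self.currency + self.reliability)
        if score >= 10:
            return "Enhanced Age Certainty"
        if score >= 7:
            return "Standard Age Certainty"
        return "Basic Age Certainty"

# A hard-identifier check scores high on all four dimensions:
print(SolutionAssessment(3, 3, 2, 3).certainty_level())  # Enhanced
```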

Basic Age Certainty

Frequency: One time.
Level of Age Certainty: Age Confirmation 

Standard Age Certainty

Frequency: One time.
Level of Age Certainty: Simple Age Verification

Enhanced Age Certainty

Frequency: One time and persistent.
Level of Age Certainty: Hard Age Verification, Parental/Carer/Guardian Consent

Strict Age Certainty

Frequency: Persistent.
Level of Age Certainty: n/a*

*This category is not stand-alone; rather, it is meant to be deployed to harden other Age Certainty services.

An example of a Strict Age Certainty enabling technology is Multi-factor Authentication (MFA). MFA is a security enhancement that requires you to present two or more pieces of evidence (your credentials) when logging in to an account.
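As a concrete illustration of a second factor, the sketch below derives a time-based one-time password (TOTP, RFC 6238) using only the Python standard library. It sketches the mechanism rather than a production implementation, and the example secret is arbitrary.

```python
# TOTP sketch: server and authenticator app derive the same short-lived
# code from a shared secret, giving a second piece of evidence beyond a
# password.

import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // interval        # current 30s time step
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

SHARED_SECRET = "JBSWY3DPEHPK3PXP"  # arbitrary example key (base32)
print(totp(SHARED_SECRET))          # e.g. '492039'; changes every 30s
```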

The level of age certainty employed can be one-time, persistent, or consistent. Consistency in this context means that the ISS applies all relevant code standards to all users at all times to ensure risk to children remains low.

Age Estimation can be used to categorise Child Service Users into age range and developmental stage categories for further Age Assurance determination, when required. When higher levels of anticipated residual risk are estimated, Age Estimation must be further mitigated by adjusting the service’s accuracy in proportion to the mean average variance and mean absolute error.
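A minimal sketch of that calibration step, assuming access to a labelled validation set (the sample figures below are invented): compute the service’s mean absolute error and error variance, which then inform how much accuracy headroom to demand at a given risk level.

```python
# Sketch: measure an age-estimation service's error profile from labelled
# validation data. The sample ages are invented for illustration.

from statistics import mean, pvariance

true_ages      = [12, 13, 15, 17, 18, 21]
estimated_ages = [13, 12, 16, 18, 17, 23]

errors = [est - true for est, true in zip(estimated_ages, true_ages)]
mae = mean(abs(e) for e in errors)      # mean absolute error, in years
var = pvariance(errors)                 # variance of the signed errors

print(f"MAE = {mae:.2f} years, error variance = {var:.2f}")
# Higher anticipated residual risk => demand a lower MAE from the provider,
# or widen the buffer around age thresholds (see the audit alerts below).
```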

Figure 4 provides a general guide and examples of solutions for each level of age certainty required based on ToE risk level for each age range and developmental stage of children as set forth by the ICO. 

Figure 4: AADC Age Assurance Framework, Step Two: Level of Age Certainty Required Based on ToE Risk Level

For each of the five age ranges and developmental stages (Ages 0–5: pre-literate and early literacy; Ages 6–9: core primary school years; Ages 10–12: transition years; Ages 13–15: early teens; Ages 16–17: approaching adulthood), the required level of Age Certainty by ToE risk level is:

  • Low Risk: Basic Age Certainty (one time), e.g. Age Confirmation. Age Estimation, as a standalone, is currently not recommended by the ICO and is forbidden by the EDPB.

  • Medium Risk: Standard Age Certainty (one time), e.g. Simple Age Verification. Age Estimation, as a standalone, is currently not recommended by the ICO and is forbidden by the EDPB.

  • High Risk: Enhanced Age Certainty (one time or persistent), e.g. Hard Age Verification and Parental/Carer/Guardian Consent. Age Estimation, as a standalone, is currently not recommended by the ICO and is forbidden by the EDPB.

  • Enhanced Risk Mitigation: Strict Age Certainty (persistent), e.g. Multi-factor authentication (MFA).

Alternatively, apply all relevant code standards to all users at all times to ensure risks to children are low.

Adapted from IEEE Levels of Age Assurance, the ICO Guide to Data Protection: ICO Codes of Practice, Age appropriate design: a code of practice for online services, Annex B: Age and developmental stages definitions, and the Children’s Code Risk Assessment Toolkit

ALERTS

  • When determining Level of Certainty and data protection requirements based on the anticipated residual risk rating of the ISS, it should be noted that in some contexts certain vulnerable groups, including children, may be among the estimated one billion people worldwide who do not own a legally-recognised form of identification.

  • Weighing the anticipated residual risk rating of the ISS against the estimated residual risk of the digital divide may be considered. In the UK, more than 3.5 million people do not hold any form of photo ID, according to Electoral Commission estimates.

  • In the UK, the Age Verification (AV) sector currently works to BSI PAS 1296:2018. The IEEE and ISO are both developing complementary standards and controls for the AV industry.

Step Three: Age Assurance Solutions by Level of Age Certainty

Once the appropriate level of Age Certainty has been determined for the ISS, a review and determination of solutions for this process must be conducted to identify those that are of best fit and conformity. Figure 5 provides a guiding matrix with exemplar solutions for each level of Age Certainty. 

According to the ICO, Age Assurance through a third-party service provider is a sound practice only if the ISS conducts proper due diligence, e.g. a DPIA and Age Assurance risk assessment on the third-party service provider. This due diligence must be conducted prior to engagement to ensure the third-party solution is appropriate and compliant with the AADC.

Age Check Certification Scheme (ACCS) standard PAS 1296:2018 is a Code of Practice for Online Age Verification service providers, developed by the British Standards Institution (BSI) and the Digital Policy Alliance. This is the current prevailing standard against which products and services are assessed for Age Assurance fit and conformity.

Figure 5: AADC Age Assurance Framework, Step Three: Age Assurance Solutions by Level of Age Certainty

Basic Age Certainty (One Time):

  • Age Confirmation

  • Self-declaration (age-gating): user-entered birthdate to access the ISS

  • Technical design measures

  • Passive algorithmic or artificial intelligence (AI) verification

  • Other contraindications

Standard Age Certainty (One Time):

  • Simple Age Verification

  • Simple age verification: an open question to the user for their age

  • Online age check systems

  • Active age estimation

Enhanced Age Certainty (One Time or Persistent):

  • Hard Age Verification

  • Parental/Carer/Guardian Consent

  • ID validation (hard identifiers): use of government documentation (e.g. driver’s licence)

  • Tokenised age checking using third parties

  • Liveness detection

  • Presentation attack detection

Strict Age Certainty (Persistent):

  • Multi-factor authentication (MFA)

Alternatively, apply all relevant code standards to all users at all times to ensure risks to children are low.

Adapted from IEEE Levels of Age Assurance, the ICO Guide to Data Protection: ICO Codes of Practice, Age appropriate design: a code of practice for online services, Annex B: Age and developmental stages definitions, and the Children’s Code Risk Assessment Toolkit

Technical design measures: 

Operationalizes the AADC technical guidelines to support implementation, including the adoption of controls by the ISS (e.g. preventing the user from immediately attempting to re-register if they are denied access on first declaration, or closing the accounts of users discovered to be underage).
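A minimal sketch of the first control named above, assuming an in-memory store and an illustrative 24-hour cool-down (neither is prescribed by the AADC):

```python
# Sketch: deny an under-age self-declaration and block an immediate
# re-registration attempt from the same device or session.

import time

DENIED_UNTIL: dict[str, float] = {}   # device/session id -> retry-after time
COOL_DOWN_SECONDS = 24 * 60 * 60      # illustrative cool-down period

def age_gate(device_id: str, declared_age: int, minimum_age: int) -> bool:
    now = time.time()
    if DENIED_UNTIL.get(device_id, 0) > now:
        return False                  # still locked out after a prior denial
    if declared_age < minimum_age:
        DENIED_UNTIL[device_id] = now + COOL_DOWN_SECONDS
        return False                  # deny and start the cool-down
    return True

print(age_gate("device-1", declared_age=11, minimum_age=13))  # False
print(age_gate("device-1", declared_age=14, minimum_age=13))  # False (locked)
```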

Passive algorithmic or artificial intelligence (AI) verification: 

The use of algorithmic or artificial intelligence systems to aid in the verification process without requiring active input from the user.

Other contraindications: 

Active age estimation: 

Online age check systems:

Tokenised age checking using third parties: 

A third-party-issued token, or artefact, that verifies a user has been authenticated.
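As an illustration of the data-minimisation benefit, the sketch below verifies a provider-signed ‘over-age’ claim without the ISS handling any identity documents. The token format, field names and shared key are invented for illustration; real schemes vary.

```python
# Sketch of tokenised age checking: the provider signs an attestation and
# the ISS verifies only the signature and expiry, never the identity data.

import hashlib, hmac, json, time

PROVIDER_KEY = b"shared-secret-with-age-check-provider"  # assumption

def issue_token(claim: dict) -> str:                     # provider side
    payload = json.dumps(claim, sort_keys=True)
    sig = hmac.new(PROVIDER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_token(token: str) -> dict | None:             # ISS side
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(PROVIDER_KEY, payload.encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None                                      # forged or tampered
    claim = json.loads(payload)
    return claim if claim.get("expires", 0) > time.time() else None

token = issue_token({"over_age": 13, "expires": time.time() + 3600})
print(verify_token(token))  # the bare claim, with no identity data attached
```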

Liveness detection:

Whereby the use of live video helps ensure the user in the session is real and their identity has not been manipulated.

Presentation attack detection: 

The formal name given to anti-spoofing measures taken by Age Check Providers to prevent systems being tricked into giving false results (e.g. pseudo identities, mannequins, masks, false identity documents, false instruments, tamper-evident instruments, genuine instruments that have been amended, disfigured instruments).



As stated previously, Age Estimation, as a standalone, is currently not recommended by the ICO (and forbidden by the EDPB).

Limitations of PAS 1296:2018

In a review of the PAS 1296:2018 standard and awarded Certificates of Conformity for Age Assurance, the authors note that, as of the publishing date of this KB, all of the certificates for Age Assurance awarded by the ACCS are based on accurate estimation of the age of persons aged 18 or older.

ALERTS

  • Third-party due diligence is a prerequisite to meeting the standards set forth in the Scheme in order to pass the IAAIS on a stand-alone basis.

  • Of the seven (7) third-party ACCS certifications, none provided sufficient evidence to substantiate that the solution certified is capable of providing reliable Age Assurance for Child Service Users under the age of 18. See Appendix G for more detailed information. 

Step Four: Best Practices for Deployment of Age Assurance Solutions

Should an ISS resolve to perform Age Assurance, this step provides guidance to aid compliance with the AADC. The purpose of an Age Assurance programme is to balance the collection of additional personal data required for the service with compliance with the Data Minimisation Principle of DPbDD. The ISS should utilise DPbDD throughout the design lifecycle of its Age Assurance service, including the AADC recommendations relating to the UNCRC.

AADC Recommendations

The ICO offers code recommendations to ensure the ISS does not negatively impact children’s rights under the United Nations Convention on the Rights of the Child (UNCRC).

As set out in the Background section above, the seven (7) foundational principles of Privacy by Design (PbD) underpin any approach the ISS may take.


ISS must also operationalize and conform to the ACCS 4:2020 Technical Requirements for Age Check Systems standard (which implements PAS 1296:2018).

Step Five: Independent Verification of Compliance

After the operationalization of Age Assurance by the ISS, independent verification must be conducted to certify the ISS as compliant with the AADC under the Scheme.

Independent Audit of AI Systems (IAAIS) is a process whereby an outside third party, e.g. a Licensed ForHumanity Certified Auditor (FHCA), is charged with review of the AAA System deployed by the ISS to validate compliance against the subject certification scheme, e.g. the UK Children’s Code. The Independent Auditor’s charge is one of public interest, whereby the IAAIS is conducted in a rigorous manner that is objective and uses binary criteria.

Other Frameworks Worth Considering

EU AI Act

The European Union Artificial Intelligence Act (EU AI Act) centres the ISS (system-centric) in risk assessment, whereas the UK Children’s Code centres the person, specifically the Child Service Users (user-centric) first and foremost as required by PbDD principles. 

The latter is not ISS-dependent in its initial assessment criteria and can therefore apply to nearly any emerging or iterative technology, as risk is measured based on impact to Children’s rights and freedoms. The former is concerned with risks as measured by ISS practices, techniques and approaches.

Nevertheless, the standard set forth in the EU AI Act provides additional insight and considerations when assessing the ISS and arriving at a determination of anticipated residual risk level the ISS may produce. 

How is the EU AI Act useful in IAAIS against the UK Children’s Code? 

As supplementary information during Step One: Assessing ToE Risk Level, to help arrive at a determination of the appropriate level of AADC Age Assurance.

Figure 6 below provides a secondary guide for ISS-centric risk determination, summarising the risk levels and associated ISS obligations, from low and minimal risk, which carries no compliance obligations, up to unacceptable risk, where the AI practice itself is prohibited.

Figure 6: European Union Artificial Intelligence Act (EU AI Act) Pyramid of Risk and AI Practices

Unacceptable Risk: Prohibited AI Practices 

  • AI systems that deploy harmful manipulative ‘subliminal techniques’;  

  • AI systems that exploit specific vulnerable groups (physical or mental disability);  

  • AI systems used by public authorities, or on their behalf, for social scoring purposes; and

  • ‘Real-time’ remote biometric identification systems in publicly accessible spaces for law enforcement purposes, except in a limited number of cases.

High Risk: Regulated High-Risk AI Systems

  • High-risk AI systems used as a safety component of a product or as a product falling under Union health and safety harmonisation legislation (e.g. toys, aviation, cars, medical devices, lifts). 

  • High-risk AI systems deployed in eight specific areas identified in Annex III, which the Commission would be empowered to update as necessary by way of a delegated act (Article 7):  

  • Biometric identification and categorisation of natural persons;  

  • Management and operation of critical infrastructure;  

  • Education and vocational training;  

  • Employment, worker management and access to self-employment;  

  • Access to and enjoyment of essential private services and public services and benefits;  

  • Law enforcement;  

  • Migration, asylum and border control management;

  • Administration of justice and democratic processes.

Limited risk: Transparency obligations 

AI systems presenting ‘limited risk’, such as systems that interact with humans (e.g. chatbots), emotion recognition systems, biometric categorisation systems, and AI systems that generate or manipulate image, audio or video content (e.g. deepfakes), would be subject to a limited set of transparency obligations (Title IV).

Low or minimal risk: No obligations 

All other AI systems presenting only low or minimal risk could be developed and used in the EU without conforming to any additional legal obligations. However, the proposed AI Act envisages the creation of codes of conduct to encourage providers of non-high-risk AI systems to voluntarily apply the mandatory requirements for high-risk AI systems (Title IX).

ALERT

  • Whereas the ICO discourages the use of biometric identification (or biometric verification) systems, the EU Commission prohibits the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for law enforcement purposes, except in a limited number of cases.

ForHumanity Inclusion, Disability and Accessibility Audit Scheme

ALERT

The ICO requires assessment of risk related to disability, specifically citing the following under Article 23: Children with Disabilities,

“This right is at risk where services do not provide privacy information and community standards in accessible formats. This is unfair to children with disabilities and does not follow UK equalities law.”

 

However, the toolkits and other exemplars do not contain a unit of analysis for this variable. For that reason, we suggest utilising the ForHumanity Inclusion, Disability and Accessibility audit criteria for this portion of the DPIA and risk analysis.

Age Assurance and the Scheme

This section provides clarity on the Scheme per the ICO’s Opinion of 14 October 2021 and industry best practices. In addition to the legal requirements of the Code, certain other recommendations are included for ISS desiring to do more to protect the best interests of the child.

Please note that:

  • SHALL statements denote criteria that are legally required. 

  • SHOULD statements denote criteria that have been identified as recommended best practice.

  • MAY statements denote criteria that extend beyond the required criteria, without judgement or prejudice.

Criteria other than SHALL statements exist to clarify for the ISS that it does, in fact, have a choice beyond the legal “floor” of compliance. It should be noted that the latter choices can lead to the documentation of risks which are likely to lead to further compliance requirements; they can also lead to better outcomes ForHumanity.

High risk ISS with Service Users under the age of 18 must take a High-Privacy by Default approach to the Age-Appropriate Design Code, unless a compelling reason for a different default setting can be demonstrated.

AADC Audit Alerts

Age Determination Using AI Facial Analysis Technology

  • The ICO permits (see FAQ note below) the use of AI facial analysis to triage the age of users. ForHumanity does NOT PERMIT this on a standalone basis for age-verification precision, e.g. where a jurisdiction states that it is illegal for a person under 18 to purchase or access a certain good or service. Given that this technology carries a Mean Absolute Error (MAE), an age buffer is needed: current AI lacks the validity, accuracy and reliability to determine age to better than about 1.5 years, so it cannot reliably distinguish, say, ages 12 and 13, or 17 and 18, where that difference may be a vital concern. For users who are close to a crucial age threshold, a platform SHALL offer additional methods where the AI has deemed that they are too young (a minimal buffer sketch follows this list).

  • AI (facial analysis) for Age Verification MAY be very useful on a standalone basis to triage users in certain circumstances, such as where young people do not own or have access to a government-issued identity document, or where bulk processing can pinpoint areas for further investigation or verification. Further, AI (facial analysis) may be very useful in a bulk context to ensure that adults (e.g. ages 21+) are excluded from an ISS (when desired by the ISS), e.g. to deter grooming. Accuracy is greatest for ages 13-24 (to within 1.5 years), but decreases for older ages, as people look after themselves more or less well as they age.

  • An FHCA SHALL ensure that any use of third-party verification details its methodology and precision studies as the basis of Age Assurance. This due diligence process is to be performed by the CDOC of the auditee; the FHCA is best placed to verify that it has been completed.
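A minimal sketch of the buffer rule described in the first alert above, using its 1.5-year MAE figure (illustrative, not a certified tolerance):

```python
# Sketch: only trust a standalone facial-analysis estimate when it clears
# the legal threshold by at least the MAE; near-threshold users are routed
# to additional age assurance methods, per the alert above.

MAE_YEARS = 1.5

def facial_estimate_decision(estimated_age: float, threshold: int) -> str:
    if estimated_age >= threshold + MAE_YEARS:
        return "allow"       # clearly above threshold plus buffer
    if estimated_age <= threshold - MAE_YEARS:
        return "deny"        # clearly below threshold minus buffer
    return "escalate"        # too close to call: offer AV / hard identifiers

for age in (20.0, 18.4, 16.0):
    print(age, facial_estimate_decision(age, threshold=18))
# 20.0 allow / 18.4 escalate / 16.0 deny
```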

ICO FAQ’s on the 15 Standards of the Children’s Code

What about using facial images to estimate age?

Using AI to estimate a user’s age from an image of their face may be, in principle, a reasonable way to establish age with a level of certainty appropriate to the risk. However, we recognise that currently much of the work in this area is still in a research and development phase. This means there are few products commercially available. Therefore, if you use this technology you must ensure that it provides an appropriate level of certainty. The processing must also be compliant with the UK GDPR and the code where appropriate.