
Innovating in children’s best interests for a ‘fair’ digital world

By Ayça Atabey

The Digital Futures Commission aims to make children’s best interests a primary consideration in the design of the digital environment. We keep a lookout for good practices and guidelines to help digital innovators embed children’s best interests in their products and services. The Age Appropriate Design Code (the Code) is the first statutory Code of Practice for children’s data protection. Matching the Code’s child rights focus are UNICEF’s Manifesto on Good Data Governance for Children and Policy Guidance on AI for children. Common to all three is the concept of ‘fairness.’ But what is meant by fairness in today’s digital world, and why does it matter?

All 15 standards of the Code reflect data protection principles set out under the UK General Data Protection Regulation (UK GDPR), particularly the fairness principle. This principle should lie at the heart of all processing activities involving children’s data, because the standards provide

“practical measures and safeguards to ensure processing under the GDPR can be considered ‘fair’ in the context of online risks to children.”

Exploring the child rights implications of the fairness principle has become more urgent since the COVID-19 pandemic, which drove a dramatic increase in the use of AI-driven technologies and made children’s lives ever more digital by default. Although AI-driven technologies create many opportunities for children, the increasing use of algorithmic analytics and data collection also creates significant risks. Notably, there have been considerable concerns over AI and discrimination, since

“Predictive analytics can amplify existing discrimination and bias. Artificial Intelligence is increasingly used to make critical decisions for children, such as allocation of welfare benefits or where schools should be built. When these systems use biased data sets, discrimination can result.”

(UNICEF Manifesto on Good Data Governance for Children)

Important as this is, the fairness principle in data protection law has a broader scope than ‘non-discrimination’. Even when data processing is not discriminatory, it might still be ‘unfair’ if it does not prioritise children’s best interests. For example, commercially exploitative data processing linked to adverse effects on children, or processing data in ways children wouldn’t ‘reasonably expect’, might not be ‘discriminatory’, but it would still infringe the fairness principle and children’s best interests. For a prevalent example, consider EdTech where, as Hillman notes

Data collection and algorithmic modeling propel user profiling and control in ways students and even their teachers may not be aware of or understand.

UNICEF’s Policy Guidance on AI for children also underscores the importance of ‘prioritising fairness and non-discrimination for children’, requiring that

 “since there is no one optimal technical definition of fairness to prevent bias, developers need to consider the trade-off of multiple fairness definitions.”

How can developers do this? The Code and the recently published IEEE Standard for an Age Appropriate Digital Services Framework provide good guidance for weighing this trade-off.
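To make the trade-off concrete, here is a minimal, purely illustrative Python sketch (not drawn from the Code or the IEEE framework) comparing two widely used technical fairness definitions, demographic parity and equal opportunity, on made-up predictions for two groups of children. It shows why no single metric suffices: the same model can look ‘fair’ under one definition and ‘unfair’ under another.

```python
def demographic_parity_gap(preds_a, preds_b):
    """Difference in positive-prediction rates between two groups."""
    rate_a = sum(preds_a) / len(preds_a)
    rate_b = sum(preds_b) / len(preds_b)
    return abs(rate_a - rate_b)

def true_positive_rate(preds, labels):
    """Share of actual positives that the model correctly flags."""
    flagged = [p for p, y in zip(preds, labels) if y == 1]
    return sum(flagged) / len(flagged)

def equal_opportunity_gap(preds_a, labels_a, preds_b, labels_b):
    """Difference in true-positive rates between two groups."""
    return abs(true_positive_rate(preds_a, labels_a)
               - true_positive_rate(preds_b, labels_b))

# Hypothetical predictions (1 = e.g. 'allocate support') and true outcomes
# for two groups of children; all values are invented for illustration only.
preds_a, labels_a = [1, 1, 0, 1, 0, 0], [1, 1, 0, 1, 0, 0]
preds_b, labels_b = [1, 0, 0, 0, 1, 1], [1, 1, 0, 1, 0, 0]

print(f"Demographic parity gap: {demographic_parity_gap(preds_a, preds_b):.2f}")   # 0.00
print(f"Equal opportunity gap:  {equal_opportunity_gap(preds_a, labels_a, preds_b, labels_b):.2f}")  # 0.67
```

In this toy example, both groups receive positive predictions at the same rate (so demographic parity holds), yet children in group B who genuinely qualify are far less likely to be correctly identified. This is the kind of tension developers must weigh against children’s best interests, rather than resolve with a single formula.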

Fairness and transparency

The fairness principle is linked to other principles such as ‘transparency’. As the Information Commissioner’s Office notes, “on a wider level transparency is also intrinsic to the fairness element of Article 5(1).” With children’s best interests in mind, the child-centred fairness and transparency rules mean that organisations must reflect the needs of different groups of children when communicating information to them. This approach also promotes accessibility and inclusiveness, which are key if all children are to benefit from data-driven technologies equally.

Again, compliance with transparency rules alone is insufficient to guarantee that data processing is fair. Here, it is important to note that data protection principles apply cumulatively: a violation of any one of them puts organisations in breach of the UK GDPR, even if they have demonstrated compliance in all other areas. Organisations must therefore comply with the fairness principle, and the Code offers a practical route to doing so, explicitly stating that online services should follow it to help them “process children’s data fairly”.

What does ‘fairness’ look like in the digital world?

The fairness principle of the UK GDPR and the child’s best interests standard of the Code are aligned with UNICEF’s Policy Guidance on AI for children and the UNICEF Manifesto. The Guidance on AI for children requires digital innovators to focus on fairness and non-discrimination principles, while the Manifesto calls on innovators to prioritise children’s best interests, bearing in mind children’s evolving capacities and their diverse identities and circumstances.

Aligned with UNICEF’s inclusive approach, the Code promotes data protection by design and by default through a children’s best interests lens. Data protection by design requires embedding privacy and data protection principles (including the fairness principle) into the design of data processing activities and business practices. This can be achieved in several ways, including but not limited to giving information to children in a straightforward way they can understand and avoiding deceptive/manipulative language and design at all levels. This is highly relevant to the Digital Futures Commission’s work on creating frameworks and resources for digital designers.

The Digital Futures Commission offers digital innovators a holistic approach through the lens of (child) rights to re-think and re-design the digital environment in two key parts of children’s lives – play and education. For example, digital innovators can deploy a Children’s Rights Impact Assessment (CRIA) as a tool to put children’s best interests at the heart of the design of the digital environment. To embed children’s rights in data-driven education systems, the Digital Futures Commission has also explored ways to bridge data governance gaps so that data processed from children in educational contexts respects children’s rights. To enhance children’s playful opportunities, we offer Playful by Design principles, underpinned by academic research and informed by children’s voices. We are now workshopping these with game designers. Next, we will integrate our different workstreams to develop a comprehensive and accessible innovators’ toolkit grounded in children’s rights, accompanied by resources and practical steps for creating digital products and services in children’s best interests, for the ‘fair’ digital world that children deserve. So, stay tuned!

This blog is part of the Guidance for Innovators series. You can view all our blogs here.