
The education data governance vacuum: why it matters and what to do about it

By Emma Day

What steps are needed to secure the future of children’s data-driven learning? Baroness Beeban Kidron OBE and Professor Sonia Livingstone OBE chaired a panel to discuss answers to this question with Jacob Ohrvik-Stott, acting head of domestic regulatory strategy at the Information Commissioner’s Office (ICO), Bill Thompson, principal research engineer at BBC Research & Development, and me – Emma Day, author of the report Governance of data for children’s learning in UK state schools.

The report marks the Digital Futures Commission’s first step towards a pathway for rights-respecting and beneficial use of education data, unpacking the use of EdTech in UK state schools, how this is currently regulated, and who is responsible for protecting children’s rights in the education sector.

Why focus on education data?

We know data is collected about children all the time, even before they are born. But in the education setting in particular, children are a captive audience with little choice over the learning platforms or apps they use to access lessons or complete homework. Children can rarely opt out of EdTech services, especially during prolonged periods of remote learning in the wake of Covid-19.

The EdTech sector in the UK is estimated to be worth around £3.4 billion, and much of its value lies in the data that companies process for product testing and development. Yet the government appears to be failing in its role as duty bearer for children’s rights, allowing the EdTech sector to operate in a governance vacuum.

What is a child rights-based approach to education data?

From a child rights perspective, the national government is the overall duty bearer, responsible for ensuring that all the rights children have under the UN Convention on the Rights of the Child, the Human Rights Act 1998, the Equality Act 2010 and the UK GDPR are respected, promoted and fully implemented in accordance with the rule of law, including in relation to the digital environment.

As part of the government, the Information Commissioner’s Office is also a duty bearer, responsible for protecting children’s data rights in particular, while the Department for Education is a duty bearer responsible for protecting children’s right to education; as public authorities, state-funded schools are also duty bearers. EdTech companies also have responsibilities to respect, protect and remedy children’s rights following the UN Guiding Principles on Business and Human Rights and the Children’s Rights and Business Principles.

Key ingredients of the rule of law are that laws must be certain, clear and known; that there must be equality before the law and respect for children’s rights; and that there must be access to a remedy where children’s rights are violated. We found that these key ingredients are not currently in place for the EdTech sector in the UK. The law governing EdTech is murky, with disagreement among experts, for example, over how the GDPR applies to schools and EdTech companies. There is little enforceable law or policy governing the EdTech sector, leaving the industry largely to regulate itself with little oversight or transparency.

There is an urgent need for the government to fulfil its duty to protect children’s rights through clear and enforceable regulation of the EdTech sector in the UK. Our report lays out the key governance challenges for the ICO and the Department for Education (DfE) as government duty bearers and offers alternatives for addressing these challenges over the short and medium term.

Focusing on EdTech for teaching, learning and assessment (“Learning EdTech”), the immediate steps for government include:

  1. Develop ICO guidance on how the UK GDPR and the DPA 2018 apply to the education data processed by Learning EdTech companies in schools.
  2. Review the procedures for accessing National School Data from the DfE and ensure strict adherence to the Five Safes framework as required by the Digital Economy Act 2017.
  3. Produce DfE guidance for EdTech companies, grounded in an independent evidence base and setting out the criteria of educational purposes that Learning EdTech should fulfil.
  4. Direct BESA’s LendED library to develop an alternative to product ratings, based on formal evidence rather than anecdotal opinions – this is vital since what works in one context may not work in another.
  5. Develop mandatory rules for schools’ procurement of Learning EdTech to ensure credible improvements in teaching and learning, and compliance with data protection regulation.
  6. Create joint oversight mechanisms giving both the DfE and the ICO formal roles in ensuring Learning EdTech’s compliance with the law.
  7. Direct the DfE’s Schools Commercial Team to develop specific rules for schools’ procurement of Learning EdTech services, including those offered ‘free of charge’, to ensure children’s rights are respected.
  8. Create standard contractual clauses for use by Learning EdTech companies in relation to data processing (ICO) and standard commercial clauses for pricing (DfE).
  9. Develop standard contractual clauses for contracts between schools and Learning EdTech companies (ICO), detailing the kinds of data that can be processed from children under the legitimate interests lawful basis.
  10. Encourage compliance with the best international standards on data protection and child rights to increase the UK’s competitiveness in the global education marketplace.

At the launch, it was encouraging to hear from Jacob Ohrvik-Stott that the ICO recognises the problems in the report. Focusing on processes and systemic intervention, he promised new work from the ICO in the coming weeks, including on how the Age Appropriate Design Code (AADC) applies to schools and EdTech. We learned that, while the AADC does not apply to schools, it sets a high bar to which schools should aspire. However, Jacob Ohrvik-Stott stressed that the GDPR does still apply to schools, adding that the AADC articulates how the principles of the GDPR apply to children. He also said the ICO found the idea of procurement rules for EdTech tools used by schools compelling.

The recommendations were well received by the panellists, giving us hope that the immediate steps we propose will result in concrete improvements. We look forward to hearing more from the DfE and the Office for National Statistics.

Data sharing for the public interest

The Digital Futures Commission’s interest lies in identifying beneficial uses of education data. One of our medium-term recommendations invites exploration of how companies can best provide data to the government for use in the public interest. Bill Thompson noted synergies with BBC R&D work on new models of data management – for example, creating a personal data store that gives users more power over their own data, so they know what data is held about them and have control over what it is used for. The idea is that transparency from data controllers, coupled with agency for data subjects, encourages better behaviour; can this become a model for a public service data ecosystem?
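To make the personal data store idea more concrete, here is a minimal, purely illustrative sketch in Python. The class, field and method names are hypothetical and are not drawn from BBC R&D’s actual work; the sketch simply shows how transparency (the owner can see what is held and who asked for it) and agency (the owner decides which purposes are permitted) might fit together.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class PersonalDataStore:
    """Hypothetical sketch of a data store controlled by the data subject."""
    owner: str
    records: Dict[str, str] = field(default_factory=dict)       # data held about the owner
    permissions: Dict[str, bool] = field(default_factory=dict)  # purpose -> allowed?
    audit_log: List[str] = field(default_factory=list)          # transparency trail

    def what_is_held(self) -> Dict[str, str]:
        # Transparency: the owner can always see exactly what data exists about them.
        return dict(self.records)

    def set_permission(self, purpose: str, allowed: bool) -> None:
        # Agency: the owner decides which purposes their data may be used for.
        self.permissions[purpose] = allowed
        self.audit_log.append(f"permission for '{purpose}' set to {allowed}")

    def request_access(self, requester: str, purpose: str) -> Optional[Dict[str, str]]:
        # A data controller's request succeeds only if the owner has allowed that purpose;
        # every request is logged, whether or not it is granted.
        self.audit_log.append(f"{requester} requested data for '{purpose}'")
        if self.permissions.get(purpose, False):
            return dict(self.records)
        return None


# Example: a pupil's store where homework analytics is permitted but advertising is not.
store = PersonalDataStore(owner="pupil-123", records={"reading_level": "stage 4"})
store.set_permission("homework-analytics", True)
store.set_permission("advertising", False)
print(store.request_access("edtech-vendor", "advertising"))  # None: purpose not permitted
```

In this sketch the audit log carries the transparency side of the model and the permissions carry the data subject’s agency; whether something along these lines could scale into a public service data ecosystem is exactly the open question the panel raised.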

There is a lot going on in this fast-moving space. Panellists agreed that clear and robust governance of children’s education data would help EdTech companies to innovate and minimise their risks whilst doing so, as well as being in the best interests of children in the UK.


Emma Day is a human rights lawyer specialising in children’s rights and technology. Emma also works as a consultant for the UNICEF East Asia and Pacific Regional Office. Emma has previously worked for several NGOs and UN agencies on a range of human rights issues in Eastern Africa, Asia, Canada, and the UK. She holds an LLM in international human rights law from the University of London (2006) and an LLM in Law & Technology from UC Berkeley (2020). Emma is an Affiliate at the Berkman Klein Center for Internet & Society, and an Edmund Hillary Fellow.