Navigating education data governance in UK state schools: A continued conversation

By Emma Day

The Digital Futures Commission report on education data governance set out a child rights-based analysis of the use of EdTech in UK state schools. The report revealed a highly complex governance landscape: education governance is devolved to the different UK nations, while data governance is overseen centrally, primarily by the Information Commissioner’s Office (ICO), whose expertise is not in education. No government body oversees both education and data governance together, and partly as a consequence of this, the law and policy applicable to EdTech are fragmented, with unclear lines of accountability. Our report sets out key recommendations for bringing greater clarity, accountability, and child rights protections to the EdTech space in the UK.

We launched the report at a webinar on July 1st, where Sonia Livingstone and I engaged in a lively debate on education data governance and beneficial uses of education data with Bill Thompson, principal research engineer at BBC Research & Development, and Jacob Ohrvik-Stott, acting head of domestic regulatory strategy at the ICO. We answered as many questions from the audience as we could live at the event, and we continue the conversation on this sizzling topic below.

  1. The report has a strong focus on the DfE, and not the use of EdTech in devolved governments. To what extent does the report reflect the other Home Nations?

It’s true that this report focused on the DfE, partly out of a need to limit the scope of our analysis to begin with. The education governance landscape in the UK is already very complex because responsibility for education is devolved to the different nations, and on top of this, the governance of data involves several distinct areas of law and policy. For clarity of the legal analysis, which involves law and policy on both education (which is nation-specific) and data (which applies UK-wide), we therefore narrowed the focus to one nation. However, the UK GDPR and the DPA 2018 apply across all the UK nations, so much of the analysis is still applicable to Scotland and Wales. It would be very interesting to see future deep dives into education data governance in the other nations, and a comparative report assessing the differences between them, but this is beyond the scope of our current study.

  2. At the ICO conference in April, it was clear from the breakout meetings that there was no clarification on whether EdTech was going to have to comply with the Children’s Code – has a definitive answer been given on this?

We understand that the ICO is planning shortly to publish a clarification of when and where the Age Appropriate Design Code (AADC) applies, and that it is likely to conclude that the code does not apply to schools where the school is the data controller, but does apply to EdTech companies where the EdTech company is the data controller. What we would like to see is much more clarity about when EdTech companies should be considered independent data controllers, and more detail about the circumstances under which they can rely on the legal bases of legitimate interests or contract to process children’s education data.

  3. FERPA in the US has similar provisions to the proposed code with regard to using data only for educational purposes, and that’s been around for some time… much research has shown that this provision leaves grey areas that can be – and are – exploited by EdTech providers. How are these loopholes addressed in the code/the report?

The report raises exactly this point as an issue that needs government attention. We say that ‘educational purposes’ should be more clearly defined by the government, and that there should be oversight of the evidence of educational benefits provided by different EdTech tools. Under the UK GDPR, processing of children’s data must also be ‘necessary’ to meet a defined educational purpose, and we need to see much more objective evidence of ‘necessity’ as part of school data protection impact assessments (DPIAs). Better still, the government could vet EdTech products at a national level so that the burden of making these kinds of assessments does not fall on schools.

  4. Is there a case for stopping the use of EdTech (or aspects of EdTech) until appropriate regulations are in place?

There may be. At the moment nobody – including the government – has an overview of which EdTech products are being used in schools in the UK, or of how children’s data are being processed. Rather than simply stopping the use of EdTech, I would rather see the ICO launch an investigation into the EdTech sector, similar to the investigation it previously carried out into data brokers, so that we can shine some light on what is happening in the sector and where regulatory changes are needed. Ideally, the ICO would work with the DfE to do this.

  5. Many, perhaps most, of the EdTech companies working with UK schools also operate in global markets. As we know from GDPR, data privacy regulations tend to have extraterritorial reach. What can the panellists say about the implications of the report’s findings and recommendations for education systems in other countries, especially where privacy regulations and their enforcement might be weaker?

Our report highlights the current lack of regulation and enforcement in relation to EdTech, especially in England. However, we also note that the DfE data strategy aims to position the UK as a leader in the global EdTech market. As highlighted by the UNICEF Manifesto on Good Data Governance for Children (which I also worked on this year), companies operating across multiple jurisdictions should apply the highest data protection and child rights standards to their services everywhere, and must act responsibly and according to child rights principles even when this is not strictly required by law.

  6. Thank you so much for the amazing contributions. I have a quick question about using the notion of EdTech as an umbrella term. Do you think we need to start differentiating between high-risk and low-risk data collection in education? I am particularly concerned about the introduction of more AI-based tech in schools, such as facial and emotion recognition, proctoring, and other profiling machines.

We agree that different kinds of technology used in education carry different risks. As part of our next steps for the education data governance workstream, we will be developing child rights-based principles or guidance that can be used when completing a DPIA for EdTech products intended for use in schools. We plan to include in this guidance consideration of how different kinds of risk, such as those associated with AI, should be assessed and mitigated.

Our work on education data governance began with the report, which highlights the challenges in realising the full potential of education data and sets out recommendations for how these problems could be addressed. Next, we will test and refine our recommendations and see how they apply to some of the EdTech tools used in UK schools.

You can view the rest of the blog series here.

Emma Day is a human rights lawyer specialising in children’s rights and technology. Emma also works as a consultant for the UNICEF East Asia and Pacific Regional Office. Emma has previously worked for several NGOs and UN agencies on a range of human rights issues in Eastern Africa, Asia, Canada, and the UK. She holds an LLM in international human rights law from the University of London (2006) and an LLM in Law & Technology from UC Berkeley (2020). Emma is an Affiliate at the Berkman Klein Center for Internet & Society, and an Edmund Hillary Fellow.