By Sarah Turner
The Digital Futures Commission launched its report, Education Data Reality, at its breakfast briefing on 29 June 2022. The launch featured an introduction to the Education Data work stream of the Digital Futures Commission by Sonia Livingstone, a presentation of the key findings by Sarah Turner, the lead author, and a response from Al Kingsley, CEO of NetSupport and Chair of two Multi-Academy Trusts. The event concluded with a vibrant Q&A session that left many interesting questions for further discussion.
The report, Education Data Reality, explores the relationship that schools in the UK have with educational technology (EdTech): how they use it, where they value it – or find it problematic – and what they understand about how such systems use personal data. It reveals a complex ecosystem of technologies, all with very different purposes, collecting various types of data about students.
The report highlights four priorities for change, with detailed recommendations for government, technology providers and schools themselves:
- Schools expect and want support in choosing EdTech solutions that are effective, rights-respecting and safe.
- Schools expect EdTech providers to act in children’s best interests in their use of data – and providers should be held to account by the DfE and the ICO.
- Schools need more support to ensure they have the infrastructure, resources and staff experience to manage children’s education data properly.
- Students need equitable technology access at home to facilitate their education as society moves increasingly towards a reliance on online learning.
Al Kingsley noted his general agreement with the report’s findings, including that data protection can be challenging for schools to manage. He emphasised that one size does not fit all when it comes to the most appropriate course of action for EdTech businesses, just as one size does not fit all when it comes to EdTech solutions for schools.
The briefing was well attended by highly engaged participants, who raised many thoughtful questions. We answer a few of these below.
What is the role of parents? Aren’t they ultimately responsible?
Yes and no. Of course, it is hoped that parents take an interest in, and provide feedback on, the general running of the school. In data protection terms, however, schools do not require parental consent (or student consent, if the student is old enough) when they are acting to provide an education. Couple this with the lack of knowledge or opportunities for learning about data protection on the part of many parents, and it becomes very difficult to hold parents responsible.
If we can’t rely on government to provide effective guidance for data protection, who can we reasonably expect to do so?
In the UK, the Information Commissioner’s Office (ICO) has published substantial guidance for EdTech firms on adopting the Age Appropriate Design Code (AADC). Data Protection Officers interviewed for the report found the ICO helpful when approached directly on data protection issues. It remains to be seen what the ICO’s remit may be in the future, given the plans for modernisation under the recently announced Data Reform Bill, but one must hope that useful guidance will continue to be provided. However, the report highlights that schools want greater engagement from the Department for Education (DfE) to navigate complex decision-making around safe, rights-respecting and suitable technology procurement.
Should the Chartered College of Teaching have a role in supporting schools around best understanding the use of data, and the responsibilities schools have?
Academic research suggests that those going through teacher training today – not to mention those trained in earlier times – do not get the support they need to understand and reflect upon digital technology use and the role of personal data. But if teachers are not empowered to understand how personal data can be used, they cannot pass this knowledge on to their students. At worst, they will model practices that do not require children to engage with decisions about how their data are processed. More needs to be done to incorporate personal data processing and its consequences into initial teacher training and ongoing learning for educators.
How do we manage the tidal wave of Chromebooks and similar devices that have been provided to schools since the beginning of the pandemic? These devices were accepted when schools were in crisis mode, without much thought as to the longer-term safe use.
This is an important question. In the UK, the DfE provided many laptops to students who needed them, through their schools, during the pandemic. From our interviewees, we learned that support is now being withdrawn and these laptops will no longer receive remote updates, creating a cyber security concern. This may put children’s data at risk – as well as provide a route into wider school systems. We know from the DfE’s recent EdTech report that cyber security ranks low among schools’ priorities for technology spend.
Do we need a special EdTech watchdog?
It is tempting to say yes, but it is hard to know where such a body would comfortably sit, and with what authority. What schools seem to be asking for is trustworthy guidance on how to access technology that has been shown to be valuable, without having to worry that they are compromising their students’ data in the process. Furthermore, as Al Kingsley mentioned at the breakfast briefing, with such a heterogeneous school system it is hard to see how a watchdog could avoid a one-size-fits-all approach.
The government and the regulator (the ICO), as the primary duty bearers for realising children’s rights and data subject rights, are responsible for meeting the needs of some 30,000 schools individually. As Sonia Livingstone observed, it creates a ‘David and Goliath’ problem to task each of those schools with negotiating complex contracts with major EdTech providers that safeguard children’s rights and best interests. The report supports the voices of schools in calling on the DfE to conduct and publish its own Data Protection Impact Assessments (DPIAs) or other forms of data rights and risk assessment on the EdTech platforms recommended or funded for use in UK schools.
At the same time, the ICO should be given sufficient resources to investigate and clarify the applicable scope of the AADC, and to apply the Code accordingly. In combination, these measures would stand a chance of ensuring that schools contract only with rights-respecting EdTech in future.
The Digital Futures Commission is committed to ensuring that data processed from children in education is used in ways that are proportionate to the evidenced benefits, safe and child-rights-respecting. We will release our collection of emerging child-rights-respecting alternatives for the processing and use of education data later in September 2022.