By Louise Hooper, Kruakae Pothong, Sonia Livingstone
The Digital Futures Commission launched its report, Problems with Data Governance in UK Schools: the cases of Google Classroom and ClassDojo, on 31st August 2022 at an event chaired by Baroness Beeban Kidron OBE, Founder and Chair of 5Rights. After Professor Sonia Livingstone OBE outlined the problems posed by data-driven digital technologies in schools, lead author Louise Hooper, barrister at Garden Court Chambers, set out the report’s key findings from a socio-legal analysis of two widely used EdTech products.
Further highlights from the launch included a response from Michael Veale, Associate Professor in Digital Rights and Regulation in the Faculty of Laws at UCL, and a lively Q&A with webinar participants that opened up new questions about the future of digital classrooms.
Watch the launch here.
The findings highlight four key problems with the use of EdTech in British schools:
- It is nearly impossible to discover what data is collected by EdTech: Governance policies are often spread across multiple documents, use inconsistent terms and require multiple ‘clicks’ to uncover the full extent of data collection.
- EdTech profits from children’s data while they learn: EdTech blurs the boundary between core and additional services and nudges children from more private into more commercial environments, usually without highlighting the safety, privacy and rights consequences.
- EdTech privacy policies and/or legal terms do not comply with data protection regulation: The complexity of EdTech platforms and apps leads to a lack of transparency that is likely to breach the UK GDPR and to invalidate any consent for data processing or data transfer to the USA.
- Regulation gives schools the responsibility but not the power to control EdTech data processing: In several cases, the EdTech provider’s contract describes the school as the data controller even though the school lacks the power, and the technical knowledge of the product, to direct the data processing.
Subsidised by the government, schools played a key role in the huge expansion of the EdTech market during the COVID-19 lockdown and continue to do so. The report therefore argues that government – and the ICO – should take the lead in finding solutions to these problems, mitigating the risks posed by EdTech to children’s education, privacy and other rights.
Michael Veale welcomed the report, noting that schools are becoming reliant on a few large companies, which are having a real, tangible impact on pedagogy and on what school is, does and could be. He remarked that the involvement of business in education must be critically examined before schools and teachers lose control of teaching.
Webinar participants came from many countries, and the Q&A broadened the discussion by pointing to the need for global action on the dominance of a few large EdTech providers. Some of the questions couldn’t be answered live, so we answer them here.
How do we ensure that teachers are not made scapegoats for problems with data?
There was a real concern that teachers would be blamed for the consequences of using technologies in the classroom, including the scraping and commercial use of children’s data. Tech companies tend to put the responsibility for ensuring children’s rights and privacy onto the shoulders of parents or teachers. We argue that a basic framework of privacy-protecting contracts and standard terms should be negotiated by the government, on behalf of schools and families, to redress the power imbalance currently skewed in favour of the tech companies. Teachers need better tools, guidance and proper resourcing to ensure that EdTech benefits children’s education and does not harm their privacy and other rights.
Can data collected from children in education be used in a more responsible way?
This question generated a discussion about anonymising data for public purposes, while recognising concerns about possible reidentification. As Michael Veale noted, simply removing children’s names and identifiers is insufficient, because telemetry data such as click patterns, the ways children type and the websites they visit can enable reidentification. So, we argue that strict adherence to the principles of data minimisation and purpose specification is vital and must be embedded throughout the design and development of any data system. Similarly, consideration should always be given to whether gathering such data is necessary and proportionate to the benefit the technology provides. Determining what is in children’s best interests is not a tick-box exercise; it requires a careful balancing act, putting children’s best interests at least on a par with the interests of the other stakeholders involved. In this way, the benefits of education data use can be distributed more evenly and realised responsibly.
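To make the reidentification and data minimisation points concrete, here is a minimal, purely illustrative Python sketch. All field names and figures are invented for illustration and are not drawn from any real EdTech product; it simply shows why stripping direct identifiers is not true anonymisation when rich behavioural telemetry remains, and what collecting only what a stated purpose requires might look like.

```python
# Illustrative sketch only: hypothetical field names, not taken from any real EdTech product.
from collections import Counter

pupil_records = [
    {"name": "A. Pupil", "typing_speed_wpm": 34, "avg_click_interval_ms": 410,
     "sites_visited": 87, "quiz_score": 7},
    {"name": "B. Pupil", "typing_speed_wpm": 52, "avg_click_interval_ms": 230,
     "sites_visited": 19, "quiz_score": 9},
    {"name": "C. Pupil", "typing_speed_wpm": 34, "avg_click_interval_ms": 405,
     "sites_visited": 85, "quiz_score": 6},
]

# "Anonymised" release: names removed, but behavioural telemetry kept.
released = [{k: v for k, v in r.items() if k != "name"} for r in pupil_records]

# Each telemetry fingerprint is unique, so anyone who can observe a child's
# behaviour, or link it from another dataset, can re-identify the "anonymous" rows.
fingerprints = Counter(
    (r["typing_speed_wpm"], r["avg_click_interval_ms"], r["sites_visited"])
    for r in released
)
print("unique behavioural fingerprints:", sum(1 for c in fingerprints.values() if c == 1))

# Data minimisation: collect and release only what the stated purpose needs,
# e.g. aggregate attainment for a class rather than per-child telemetry.
PURPOSE_FIELDS = {"quiz_score"}
minimised = [{k: v for k, v in r.items() if k in PURPOSE_FIELDS} for r in pupil_records]
class_average = sum(r["quiz_score"] for r in minimised) / len(minimised)
print("class average quiz score:", round(class_average, 1))
```

Even in this toy example, the three “anonymised” rows remain uniquely distinguishable by their behavioural fingerprints, which is precisely the reidentification risk described above.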
What more could be done internationally?
Real concerns were raised by participants about the relative lack of attention to children in policies and practices around education data and EdTech outside high-income countries. The challenges faced by small nations, and those in the global South, are especially pressing, underlining the importance of international cooperation and of sharing best practice. It seems plausible that data collected from or about children will be used in ways that are not in their best interests unless national data protection measures are robust or international cooperation is strengthened. There are also concerns that the datasets and analytic models underpinning influential EdTech products are unrepresentative of global communities. Although the report we launched centres on the UK, these concerns are valid and warrant greater international effort.
What about the future?
Looking to the future, there is a series of known unknowns and unknown unknowns that require careful consideration where children in schools are concerned.
Firstly, new technologies such as the metaverse and augmented, extended and virtual reality create risks to children that have not yet been assessed, arguably calling for the precautionary principle. Secondly, the use of data in large language models (algorithms trained to recognise associations between words and phrases) to develop profiles of children, together with the development of behaviour profiling and other inferential models, creates risks of discrimination and social scoring, both of which have the potential to interfere with children’s rights and futures. Cumulatively, new technologies and data collection can change the nature and direction of education and teaching, and place this in the hands of commercial entities rather than schools and teachers.
The report notes emerging developments in relation to EdTech and calls on the government to use the Data Reform Bill as an opportunity to provide clear, accessible and relevant child rights-respecting regulation. We believe this would lead to a genuinely pro-innovation approach, enabling companies operating in the UK to maximise the benefits of data processed from children in educational contexts for all, while minimising the risks to children’s safety, privacy and life prospects. It could also create an attractive data regime that, because it is founded on children’s best interests, is trustworthy.
The Digital Futures Commission is now working on a blueprint for the beneficial uses of children’s educational data. We invite you to continue the conversation with us by signing up for our mailing list here.