The Digital Futures Commission aims to put children’s best interests at the centre of the design of the digital world. To inform our work, we keep a lookout for regulatory trends, good practice and the latest research internationally. The CNIL (French Data Protection Authority) held its first Privacy Research Day international conference on 28 June 2022, connecting researchers and regulators to promote multistakeholder discussion among legal experts, computer scientists, designers and social scientists. These lively discussions covered the following topics:
- The economy and privacy, covering business models and commercial interests in data as an economic resource.
- Smartphones and apps, focusing on consent mechanisms and tracking practices.
- User perspectives and perceptions of privacy and data rights, including how these have changed as a result of the GDPR.
- AI and explanation, considering how to evaluate machine learning algorithms not only on performance but also on indicators such as ethics and explainability.
- Organisational challenges regarding emerging digital technologies and innovative designs that are lawful and privacy-respecting.
- Innovative tools for Data Protection Authorities to identify risks and personal data breaches.
We found five research studies especially thought-provoking for our work:
1 – Privacy, Data and Competition: The Case of Apps for Young Children. This research found that company size can affect the privacy protections offered to children: an empirical analysis of apps targeted at young children showed that larger companies protect children’s privacy better than small companies do, and that self-certification programmes can help reduce data collection from children. This leads us to think that regulatory efforts should not be restricted to larger companies.
2 – A Fait Accompli? An Empirical Study into the Absence of Consent to Third-Party Tracking in Android Apps. This study analysed consent to tracking in a set of Google Play apps, finding that their practices often violate EU and UK privacy laws. Yet if software intermediaries like Facebook made even simple improvements to their privacy options, users’ privacy would be better protected. The research identified Apple’s iOS 14.5 as an instance of good practice because it standardises the process of obtaining user consent and makes data collection practices more transparent to users. We agree with the researchers that stringent guidelines for implementing consent to tracking in program code are needed for browser developers, device manufacturers and platform gatekeepers (a minimal sketch of consent-gated tracking follows this list).
3 – The Price to Play: A Privacy Analysis of Free and Paid Games in the Android Ecosystem. This research found that games’ privacy practices differ little by monetisation strategy; interestingly, paying for a game doesn’t necessarily protect users from data collection. These findings suggest that, while regulators need to pay attention to overt, covert or deceptive monetisation strategies, especially to protect child users, attention is also needed to the overall business models of digital game companies, to ensure children are not commercially exploited.
4 – DPMF: A Modelling Framework for Data Protection by Design (DPbD). The authors identified the translation of legal principles and the requirements of Article 25(1) of the GDPR into the technical operation of digital products and services as a design problem which, if poorly handled, can undermine the accuracy of Data Protection Impact Assessments. Since Article 25(1) requires both technical and legal expertise, this research adopts an interdisciplinary approach that brings technical and legal perspectives together to ensure legal requirements are accounted for in digital products and services (see the illustrative modelling sketch after this list).
5 – Web Tracking Under the New Data Protection Law: Design Potentials at the Intersection of Jurisprudence and Human Computer Interaction (HCI). The research suggests that an autonomous agent system is a promising technical way of embedding legal requirements in design. The system could learn users’ privacy preferences and apply them to the other digital products and services they use across the different contexts of their daily lives, sparing users from being asked to set their privacy preferences every time they interact with a digital product or service (a minimal sketch of this idea also appears below).
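The call in the second study for consent to be implemented consistently in program code is easier to picture with a small sketch. The Python below uses hypothetical names (ConsentStore, TrackingSDK) that are ours, not the researchers’; the point it illustrates is simply that a third-party tracking SDK is only initialised after an explicit, recorded opt-in for that purpose, and that the default is no tracking.

```python
# A minimal, hypothetical sketch of consent-gated tracking initialisation.
# ConsentStore and TrackingSDK are illustrative stand-ins, not real SDK APIs.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    purpose: str         # e.g. "third_party_advertising"
    granted: bool        # an explicit opt-in, never assumed
    timestamp: datetime  # when the choice was made


class ConsentStore:
    """Keeps per-purpose consent decisions; the default is 'no consent'."""

    def __init__(self) -> None:
        self._records: dict[str, ConsentRecord] = {}

    def record(self, purpose: str, granted: bool) -> None:
        self._records[purpose] = ConsentRecord(purpose, granted,
                                               datetime.now(timezone.utc))

    def has_consent(self, purpose: str) -> bool:
        rec = self._records.get(purpose)
        return rec is not None and rec.granted


class TrackingSDK:
    """Stand-in for a third-party analytics or advertising SDK."""

    def start(self) -> None:
        print("tracking started")


def initialise_tracking(consent: ConsentStore, sdk: TrackingSDK) -> bool:
    """Only boot the tracker once the user has opted in for that purpose."""
    if consent.has_consent("third_party_advertising"):
        sdk.start()
        return True
    return False  # no consent: the SDK never starts, so no data is sent


if __name__ == "__main__":
    store = ConsentStore()
    assert initialise_tracking(store, TrackingSDK()) is False  # default: no tracking
    store.record("third_party_advertising", granted=True)      # explicit opt-in
    assert initialise_tracking(store, TrackingSDK()) is True
```

The design choice worth noting is the default: absent a recorded opt-in, the tracking code path is never reached, so nothing is collected in the first place.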
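DPMF has its own modelling notation, which we do not reproduce here. The sketch below is only an illustration, with hypothetical names and values, of what it means to describe a processing activity in a form that legal and technical reviewers can inspect together, including a toy data-minimisation check of the kind a Data Protection Impact Assessment might draw on.

```python
# An illustrative sketch (not DPMF's actual notation) of describing a processing
# activity in a machine-checkable form. All names and values are hypothetical.
from dataclasses import dataclass, field


@dataclass
class ProcessingActivity:
    name: str
    purpose: str                # why the data is processed
    legal_basis: str            # e.g. "consent", "legitimate_interest"
    data_categories: list[str]  # which personal data are involved
    retention_days: int         # how long the data are kept
    recipients: list[str] = field(default_factory=list)  # third-party recipients


def minimisation_warnings(activity: ProcessingActivity,
                          needed: set[str]) -> list[str]:
    """Flag data categories collected beyond what the stated purpose needs."""
    surplus = set(activity.data_categories) - needed
    return [f"'{category}' is collected but not needed for '{activity.purpose}'"
            for category in sorted(surplus)]


if __name__ == "__main__":
    reading_app = ProcessingActivity(
        name="in-app progress tracking",
        purpose="show a child their reading progress",
        legal_basis="consent",
        data_categories=["pages_read", "device_id", "precise_location"],
        retention_days=30,
    )
    for warning in minimisation_warnings(reading_app, needed={"pages_read"}):
        print(warning)
```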
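The autonomous agent proposed in the fifth study can likewise be sketched in a few lines: the user records their privacy preferences once, and the agent answers each new service’s consent request on their behalf, refusing anything the user has not decided on. The service names and preference categories below are hypothetical.

```python
# A minimal sketch of the 'privacy agent' idea: preferences are stated once and
# then applied to every service's consent request. All names are hypothetical.
DEFAULT_DENY = False  # anything the user has not decided on is refused


class PrivacyAgent:
    def __init__(self, preferences: dict[str, bool]) -> None:
        # e.g. {"functional": True, "analytics": False, "advertising": False}
        self.preferences = preferences

    def answer(self, service: str, requested: list[str]) -> dict[str, bool]:
        """Apply the stored preferences to one service's consent request."""
        decision = {category: self.preferences.get(category, DEFAULT_DENY)
                    for category in requested}
        print(f"{service}: {decision}")
        return decision


if __name__ == "__main__":
    agent = PrivacyAgent({"functional": True, "analytics": False,
                          "advertising": False})
    # Each new product asks the agent instead of showing another consent banner.
    agent.answer("news-site.example", ["functional", "analytics", "advertising"])
    agent.answer("game-app.example", ["functional", "advertising"])
```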
In the Digital Futures Commission’s work, we have already considered the issues raised by freemium business models, data protection and design in building the rights-respecting digital world that children deserve. A major message delivered at the conference was the need for interdisciplinary research and multistakeholder collaboration to bridge the gap between what is legally required and what is technically possible.
Aligned with the research findings and methods presented at CNIL’s conference, at the Digital Futures Commission we put interdisciplinary research and multistakeholder collaboration at the heart of our work. We run consultations and workshops with technical, legal and design experts and with professionals working with children across different sectors and disciplines, and, of course, we always listen to the voices of children themselves.
This blog is part of the Guidance for Innovators series. You can view all our blogs here.