
When are commercial practices exploitative? Ensuring child rights prevail in a digital world

By Ayça Atabey, Kruakae Pothong and Sonia Livingstone.

Almost every digital interaction is an exchange. Often this is underpinned by an invisible transaction in which the currency is data rather than cash. How do we distinguish between benign commercial practices and commercial exploitation in the digital environment? From a child rights perspective, why do we need to draw the line?

Clearly, businesses operate on a commercial basis, and this can include a commercial relationship with children and commercial uses of their data. But some practices are surely exploitative – examples include kidfluencers, loot boxes in games, dark patterns and risky designs. In the Digital Futures Commission’s Guidance for innovators workstream, we are exploring ways to embed children’s rights and best interests in the design and development of digital products and services.

The UN Convention on the Rights of the Child (UNCRC) calls for children to be protected from economic exploitation (Article 32) among other forms of exploitation (Article 36). But in different contexts, different laws may apply, which can complicate drawing the line between commercial ‘use’ and ‘exploitation.’


What is commercial (or economic) exploitation?

Exploitation is using something or someone (unfairly) for your own advantage (Cambridge English Dictionary). The UN Committee on the Rights of the Child explains exploitation as “taking unjust advantage of another for one’s own advantage or benefit.” 

General Comment 25, the authoritative UN guidance for implementing children’s rights in a digital world, addresses economic and commercial exploitation:

“Children should be protected from all forms of exploitation prejudicial to any aspects of their welfare in relation to the digital environment… By creating and sharing content, children may be economic actors in the digital environment, which may result in their exploitation.” (para 112)

“Standards for digital educational technologies should ensure that the use of those technologies is ethical and appropriate for educational purposes and does not expose children to violence, discrimination, misuse of their personal data, commercial exploitation or other infringements of their rights.” (para 103, emphasis added)

Prioritising commercial interests over children’s interests doesn’t alone make a practice ‘exploitative’, but taking ‘unfair’ or ‘unjust’ advantage of children or their data does. Such an advantage can be anything of value (not only monetary value or profit). For example, users’ feedback can have value for a business in improving its services and so can be an advantage. We explore below what can make such an advantage unfair or unjust.

How does commercial exploitation manifest in the digital world?

Business models (e.g. freemiums) affect design choices, and design choices affect children’s rights. Our consultation with children shows that they are sensitive to commercial exploitation and want to be heard on how the digital world should be designed.

Key policy documents call for limits on commercial exploitation in digital contexts, as illustrated below.

  • UNICEF, Children’s Rights and Business in a Digital World (2019): stealth advertising, such as product placement, sponsorship of streamers or use of other influencers; excessive data collection for profiling child consumers; and forms of in-app purchases.
  • OECD (2021): risk of being excluded or discriminated against, or of suffering future bias, because of … the way in which services are designed.
  • OECD (2016): deceptive practices related to the collection and use of consumers’ personal data; permitting others acting on businesses’ behalf to engage in deceptive, misleading, fraudulent or unfair practices.
  • OECD (2012): embedded ads; privacy-invasive practices; age-inappropriate content (e.g., for age-restricted products such as alcohol); exploitation of credulity and inexperience, resulting in economic risks (e.g., overspending, online fraud) and other potentially non-financial risks (e.g., identity theft); exploitation of personal data that may result in false credit records.
  • Council of Europe, Guidelines to respect, protect and fulfil the rights of the child in the digital environment (2018): exposure to age-inappropriate forms of advertising and marketing, or unfair commercial practices.
  • Norwegian Consumer Council, Deceived by design (2018): failure to make design privacy rights-enabling; ‘discouraging’ users from exercising their privacy rights through nudges that may work against the user’s own interest.
  • Digital Futures Commission, Problems with data governance in UK schools: the cases of Google Classroom and ClassDojo (2022): commercial exploitation through the abuse of power, using children’s data for developing new products, marketing and advertising unfairly, in ways that children don’t reasonably expect and without their knowing what happens to their data.
  • CMA, Online choice architecture: How digital design can harm competition and consumers (2022): hiding crucial information, setting default choices that don’t reflect users’ preferences, or exploiting attention being drawn to scarce products.

Commercial exploitation is generally prohibited under consumer protection, advertising and gaming laws and frameworks. Examples include the UK Age Appropriate Design Code (AADC) (Standard 5 “Detrimental use of data”) and the Office of Fair Trading’s Principles for online and app-based games.

But Van der Hof et al. argue that current laws aren’t adequate to protect children against novel forms of commercial exploitation (e.g., eSports) and “specific measures against these forms of economic exploitation of children in the digital world are urgently needed.” We highlight two common features of commercial exploitation: unfairness and power imbalance.

‘Unfairness’

The European Data Protection Board’s Guidelines on data protection by design and by default give non-exploitation as a ‘design’ element of fairness, stating that “The controller should not exploit the needs or vulnerabilities of data subjects.” They identify diverse forms of ‘unfair’ practice (such as misleading information, deception and manipulation), adding that failure to deploy reasonable safeguards to prevent unfair outcomes, including by omission or ‘negligent’ acts, can constitute commercial exploitation. Advertising to children without considering their vulnerabilities, for example, is an unfair practice that amounts to commercial exploitation.

Importantly, ‘unfairness’ doesn’t require that actual detriment or harm has occurred. A loss of opportunity (for the child) or an imbalance in the benefits gained (by children versus companies) can be ‘unfair,’ making a practice exploitative. Further, transparency doesn’t in and of itself make a commercial use ‘fair’ if it is against children’s best interests.

‘Power imbalances’ between children and businesses

In the context of children’s rights, commercial exploitation usually involves a ‘power imbalance’ between businesses and children – for instance, when a child isn’t given an alternative or has no way to understand how the business impacts their rights.

Children’s vulnerabilities (e.g., learning disabilities, mental health problems, age-based vulnerabilities, gender-related vulnerabilities) can increase the power imbalance. Other vulnerability factors include a lack of knowledge, a weak position from which to negotiate the terms of the ‘exchange’, limited bargaining power, or a lack of access to remedies.

How can businesses uphold children’s right to protection against commercial exploitation?

General Comment 25 sets out some parameters for this:

  • There should be no “use of a child’s personal information or location to target potentially harmful commercially driven content” (para 40).
  • Also, no “profiling or targeting of children of any age for commercial purposes on the basis of a digital record” (para 42).
  • Nor should businesses “prioritize paid content with a commercial or political motivation over children’s choices or at the cost of children’s right to information” (para 53).
  • Nor should they “target children using [data-driven] or other techniques designed to prioritize commercial interests over those of the child” (para 110).

Given the multiple issues involved, General Comment 25 urges States to “require the business sector to undertake child rights due diligence, in particular to carry out child rights impact assessments and disclose them to the public, with special consideration given to the differentiated and, at times, severe impacts of the digital environment on children” (para 38).

Other general dos and don’ts include:

  1. Make children’s best interests your primary consideration by ensuring that your business interests do not trump children’s best interests.
  2. Comply with data protection, privacy, consumer protection and other relevant laws and standards, bearing in mind that different laws intersect and are applied together in today’s digital world. (Examples include the FTC’s Epic Games case and Italy’s Facebook decision.)
  3. Facilitate children’s exercise of their rights in relation to your digital product or service (e.g., through features such as a moderation system or a help chat function for Q&A).
  4. Don’t discriminate! Make sure that everybody can use your product or service and feels welcome when using it.
  5. Give children meaningful control over their data and don’t pressure or nudge them in ways they don’t understand or cannot avoid. 
  6. Don’t lock them into certain options in ways that interfere with their agency and choices by exploiting their emotions to influence their behaviour. For example, avoid confirmshaming (guilting or discouraging users to stop them from choosing certain options), such as showing messages like “Please don’t go!” when users try to unsubscribe. Also avoid triggering statements like ‘JOIN NOW’, ‘Members are going to be super popular’ and ‘DOSH Top Up’, which the ASA found put pressure on children to buy membership subscriptions and in-game ‘currency’.
  7. Design your service/product in ways that anticipate and mitigate risk for different (vulnerable) groups of children – you can consult them on this! (See the Equality Act 2010 and AADC).
  8. Ensure value is created for children in their interaction with your product/service, and that the exchange is fair, with mutual and proportionate benefits for both sides.
  9. Ensure your relationship with children is fair and that they are aware of the risks associated with your commercial practices; tell them how they can seek help or exercise their rights if needed.

This list might look like a tall order, and businesses may not know where to begin. Worry not! The Digital Futures Commission is putting together Child Rights by Design – guidance for digital innovators to protect and respect children’s rights and stay on the right side of the law. We’ll launch it in April 2023.