How To Ensure Children's Data Protection During Product Design

Updated: October 2025

Key Takeaways

• All online services likely accessed by UK children must follow 15 risk-based standards, such as default high-privacy settings, data minimisation, clear child-friendly transparency, and bans on manipulative “nudge” designs, to put children’s best interests first.

• Effective compliance demands early DPIAs, robust multi-layered age-assurance, isolated children’s data pipelines, and ongoing audits to align with both ICO benchmarks and evolving statutory obligations.

Introduction

Nowhere is data protection more vital than in products aimed at or used by children. That’s why the age-appropriate design code was introduced, but what does it mean for product and service designers?

In 2020, the Information Commissioner’s Office (ICO) introduced a code of practice designed to protect children in the ever-more digital world. Affected companies should have been complying with the code since 2021.

Effective June 19, 2025, the Data (Use and Access) Act embeds the Code’s principles into UK GDPR Article 25(1), upgrading child-centric privacy by design into a binding legal requirement.

Meanwhile, from July 25, 2025, the Protection of Children Codes under the Online Safety Act 2023 impose complementary statutory safety duties, such as harm risk assessments and age-appropriate features, that work alongside the ICO’s Code.

What is the age-appropriate design code of practice?

The code aims to ensure that any organisation providing online services likely to be accessed by children in the UK, and processing their personal data, takes into account the best interests of the child while complying with the General Data Protection Regulation (GDPR) and the Privacy and Electronic Communications Regulations (PECR).

It has been designed to help companies develop services that provide the additional protections children need when using their personal data, while also ensuring they can still enjoy their digital experiences.

Who does the age-appropriate design code of practice apply to?

This code is for providers of information society services (ISS). The ICO defines ISS as “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.”

Electronic services within the scope of the ISS definition include apps, programs, websites, games or community environments, and connected toys or devices with or without a screen. The code applies to any ISS likely to be accessed by children in the UK. It’s important to note that this is a wider definition than ‘services aimed at children’.

An internet-enabled/connected cuddly toy, for example, will clearly fall within the scope of the code. But a phone app which enjoys widespread appeal among all demographics may also be caught if it is likely that children will form part of its audience.  

The code sets out 15 standards of age-appropriate design. The standards take a risk-based ‘default setting’ approach to ensuring children enjoy the best possible access to online services while minimising data collection and use. 

While the code sets out the standards to be achieved, it does not define how companies should implement them, leaving developers and creatives to find their own route to compliance.

1. Best interests of the child: This is the primary consideration, with what constitutes ‘best interests’ defined by the United Nations Convention on the Rights of the Child (UNCRC) and encompassing elements including:

“Needs for safety, health, wellbeing, family relationships, physical, psychological and emotional development, identity, freedom of expression, privacy and agency to form their own views and have them heard.”

2. Data protection impact assessments (DPIA): The point of a DPIA is to understand the data protection risks inherent in your service. Done early in the design phase, it’s a way to ‘lock in’ compliance, ensuring that the best interests of any likely child users are incorporated from the outset.

3. Age-appropriate application: Having identified that children may be likely users of your product, this stage is about understanding the ages of those children and ensuring that the data protection measures you put in place are appropriate to them. Your measures should be appropriate not only to the age groups you believe are likely to use your product or service, but also to how certain you can be of that assessment.

If, for example, you ask users to self-certify their age, your confidence in their responses is likely to be low. This means you should apply the entire code to all users and assume access by any age. Incorporating third-party verification or AI-based age estimation can raise that confidence, allowing you to apply the standards in a more age-appropriate way.
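
The confidence-based approach described above can be sketched as a simple decision rule. This is an illustrative model only: the assurance tiers, the age threshold and the returned labels are assumptions for the sketch, not definitions from the code.

```python
from enum import Enum

class AgeAssurance(Enum):
    """Hypothetical tiers of age-assurance confidence (illustrative only)."""
    SELF_DECLARED = 1          # user self-certifies: low confidence
    AI_ESTIMATED = 2           # AI-based age estimation: medium confidence
    THIRD_PARTY_VERIFIED = 3   # third-party verification: high confidence

def standards_to_apply(assurance: AgeAssurance, declared_age: int) -> str:
    """Decide how to apply the code's standards given how confident we
    can be in the user's stated age. With low confidence we must assume
    children may be present and apply the full code to everyone; higher
    confidence lets us tailor protections to the declared age band."""
    if assurance is AgeAssurance.SELF_DECLARED:
        return "apply full code to all users"
    if declared_age < 18:
        return "apply standards for the child's age band"
    return "apply adult experience"
```

The key design point is that the weakest assurance method, not the declared age, drives the default: a self-certified "35" still gets the full code.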

4. Transparency: When dealing with children, transparency requires more than publishing terms and conditions. It means making that information clear and prominent, and delivering it in bite-sized, child-friendly chunks where it is relevant.
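
Bite-sized, age-appropriate delivery might look like selecting a short notice for the user's age band at the point where it is relevant. The age bands and wording below are hypothetical examples, not text prescribed by the code.

```python
# Hypothetical age-banded, bite-sized privacy notices (illustrative only).
NOTICES = {
    "6-9": "We keep your name safe and never show it to strangers.",
    "10-12": "We store your username and scores so you can keep playing.",
    "13-17": "We process your username, scores and device type; "
             "see the full notice for details.",
}

def notice_for(age: int) -> str:
    """Pick the child-friendly notice matching the user's age band."""
    if age <= 9:
        return NOTICES["6-9"]
    if age <= 12:
        return NOTICES["10-12"]
    return NOTICES["13-17"]
```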

5. Detrimental use of data: In many ways, this standard is inherent in protecting the ‘best interests’ of children. It requires that children’s personal data should not be used in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice.

6. Policies and community standards: This standard requires that you adhere to your own published terms and policies, working on the basis that when you tell children that you will act a certain way, you should deliver on that promise.

The standard extends to community rules you set in your forums, chatrooms and similar, where you are expected to actively uphold and enforce your own rules.

7. Default settings: In general, any default settings used by in-scope products or services should be set to ‘high privacy’ unless there’s a compelling ‘best interests’ reason to do otherwise.
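
A minimal sketch of the ‘high privacy by default’ principle: every optional data use starts off, and only what is needed to deliver the core service starts on. The field names are hypothetical, chosen for illustration.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Illustrative defaults for a service likely to be accessed by
    children: all optional data uses are OFF unless there is a
    compelling best-interests reason to do otherwise."""
    personalised_ads: bool = False
    profile_visible_to_public: bool = False
    share_data_with_partners: bool = False
    geolocation: bool = False
    essential_service_data: bool = True  # needed to deliver the core service
```

Encoding the defaults in one place like this makes the ‘high privacy’ baseline auditable: a test can assert that no optional setting ships enabled.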

8. Data minimisation: This standard requires you to collect only the personal data needed to deliver the service elements used or engaged with by the child. You should offer children the option to approve the use of additional data if they wish to engage more deeply or broadly with your services.

9. Data sharing: A fundamental element of the GDPR is a fundamental element of the age-appropriate design code too. You should not disclose the data of children unless, as the ICO notes, you can “demonstrate a compelling reason to do so, taking account of the best interests of the child”.

10. Geolocation: Subject to the same ‘compelling reason’ caveat seen at 9 above, geolocation options should be switched off. If location tracking becomes active because the child approves it, its activation should be obvious, and it should default back to ‘off’ at the end of the session.
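
The session-scoped behaviour described above, off by default, obvious while active, and reset at session end, can be sketched as follows. The class and its messages are illustrative assumptions, not prescribed by the code.

```python
class GeolocationSession:
    """Sketch of session-scoped geolocation consent: off by default,
    visibly signalled while on, reset to off when the session ends."""

    def __init__(self) -> None:
        self.tracking_enabled = False  # default: off

    def enable_with_consent(self) -> str:
        """Turn tracking on after the child approves it, returning an
        obvious indicator to display while tracking is active."""
        self.tracking_enabled = True
        return "Location sharing is ON"

    def end_session(self) -> None:
        # The code expects the option to default back to 'off'
        # at the end of each session.
        self.tracking_enabled = False
```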

11. Parental controls: If your online service allows a parent or carer to monitor their child’s online activity or track their location, you should give the child a clear and obvious signal when they are being monitored.

12. Profiling: Profiling options should be switched off unless there’s a compelling ‘best interests’ reason to do otherwise.

13. Nudge techniques: The code expressly warns against using so-called ‘nudge techniques’ to encourage children to part with more personal data than is necessary, or to weaken privacy protections. The ICO gives the example of a ‘Would you like to proceed?’ scenario where the ‘yes’ response is bold, green and much larger than the ‘no’ option.

14. Connected toys and devices: Recognising that connected toys and devices, especially those without screens, may present compliance challenges, the code makes clear that manufacturers remain responsible for including tools that enable conformance with the code.

15. Online tools: Tools that help children to exercise their data protection rights and report concerns should be made prominent and accessible.

Age appropriateness and the GDPR

The age-appropriate design code has been designed to ensure that compliance with it will support (although not guarantee) compliance with a number of articles within the GDPR relating to (among many other things):

• Fairness, lawfulness and transparency (Article 5(1)(a))
• The right to be informed (Articles 12, 13 and 14)
• The rights of data subjects (Articles 15 to 20)
• Profiling and automated decision-making (Article 22)

How to apply the code

As with any set of standards, the challenge inevitably comes in their application and their inter-relationships with other codes. Service design is challenging enough without having to carry the various requirements of the code, GDPR and PECR with you every step of the way.

That’s why our data protection advice service exists. Not only can it make GDPR compliance easier; it can also ensure that building the code, the GDPR and PECR into your design process doesn’t become a burden.

For age-appropriate design code questions, and for broader GDPR support and advice, feel free to contact us.