
Why age matters: Shaping a better and safer digital world that protects and empowers children

A seven-year-old playing their first online game and a seventeen-year-old campaigning for climate action share the same internet — yet their needs, capacities and vulnerability to risks could not be more different. Still, they are treated the same on many online platforms. The EU’s Digital Services Act (DSA) is changing that, requiring online platform providers to consider children's protection in the design and governance of their platforms. Understanding why age matters means aligning digital environments with how children grow — cognitively, emotionally, socially, and legally.

[Image: Four people, from child to adult, in an evolution-like pose]

Achieving meaningful and effective child online safety measures starts with recognising and respecting children's rights. Protecting children’s well-being, enabling their participation, and upholding their freedom of expression, freedom of thought, right to privacy, right to play and right to protection against economic exploitation are inseparable goals in governing for a better and safer internet. The best interests of the child, one of the four cornerstone principles of the United Nations Convention on the Rights of the Child (UNCRC), elaborated in General Comment No. 25 (GC 25), provide a guiding core value for how these rights should be realised in the digital environment. The European strategy for a better internet for kids (BIK+ strategy) embodies this rights-based approach, which is firmly anchored in the UNCRC, the Charter of Fundamental Rights of the European Union (CFREU), the EU Strategy on the Rights of the Child and the European Child Guarantee.

"In all actions concerning children, whether undertaken by public or private social welfare institutions, courts of law, administrative authorities or legislative bodies, the best interests of the child shall be a primary consideration." (UNCRC, Art. 3(1)). 

From principle to practice: the role of age for interpreting the best interest concept 

The Digital Services Act (DSA) underlines that children’s best interests must guide very large online platforms in taking measures to mitigate the systemic risks posed by their services (Recital 89, DSA). But what best serves a seven-year-old is not the same as what empowers a seventeen-year-old. Recognising these differences — the way children’s capacities and independence develop with age — is key to making the 'best interests principle' real in practice. And this is precisely where the recently adopted guidelines on the protection of minors online under the DSA come in. The guidelines also spotlight children’s rights, explaining that "to ensure that measures to achieve a high level of privacy, safety and security for minors on an online platform are appropriate and proportionate, all children’s rights should be considered and their best interests taken as a primary consideration" (DSA Art. 28 guidelines, sect. 4, para. 4(b)).

The European Commission recently launched the 'DSA for YOUth toolkit', which helps children and young people, parents and caregivers, and teachers and educators learn about how the DSA protects minors by design. Discover the toolkit here on the BIK portal.

The DSA guidelines on the protection of minors translate the legal obligation of Article 28 of the DSA to ensure a high level of privacy, safety and security for minors into practical expectations for providers of online platforms accessible to minors, and clarify that they must take 'appropriate and proportionate measures' to ensure a high level of protection for children. The guidelines also adopt the principle of age-appropriate design, meaning that safeguards must align with children's developmental, cognitive and emotional needs.

In practical terms, concrete recommendations in the guidelines include, for instance, making minors' accounts private by default, so that their personal information, data and content are hidden from people they are not connected with, and limiting platform operators' use of manipulative or addictive design features such as streaks or auto-play, which can cause excessive use. These, and other measures proposed in the DSA guidelines on the protection of minors, emphasise that the rules are not a matter of 'one-size-fits-all', but must instead be tailored to the different risks that children encounter as well as their vulnerabilities, capacities and needs as they evolve according to their developmental stage. 

How age shapes children's online experiences

Online risks and harms are not universal and can only be assessed in relation to the age, developmental stage, and maturity of the individuals in question. In the case of children and teenagers, evidence suggests roughly three important age ranges or developmental milestones that are critical to consider when seeking to implement age-appropriate online platforms, services and governance:  

"In broad terms, childhood development moves from a state of high dependency on parents or carers for security and guidance (infancy to 5 years), towards increased independence and self-care (6-11 years), through to adolescence which is a time of increasing autonomy and growing reliance on peers for approval and support (12-18 years) (…)" (Kidron & Rudkin, 2023, p. 11).

Research has demonstrated that evolving capacities, skills and literacy develop continuously, not in clear-cut stages. For instance, a 15-year-old will typically have greater capacity to assess the reliability of online information than a 12-year-old (EU Kids Online, 2020), despite both falling into the same developmental stage as described above. Findings like these further support the need to be 'age-aware' and 'age-sensitive' when creating and governing better digital environments for children and young people, while also taking into account evolving capacities, skills and literacies that are not bound exclusively to an individual's age.

In other words, even if skills and maturity vary, age still matters and remains a necessary anchor in online safety governance. It provides the primary and most practical basis for structuring rights and responsibilities online. It is essential for setting default protections, determining what content or interactions are appropriate, and implementing age-assurance measures that shield younger users from harm while preserving older adolescents’ autonomy.

Towards age-aware and age-appropriate child online safety

Because age shapes how children engage, learn, and are vulnerable to risks online, it should be taken into account when governing digital spaces. This notion is increasingly reflected in the EU policy landscape, from the foundational provisions of the BIK+ strategy to the recent guidelines on the protection of minors online adopted under the DSA. Both translate the recognition of the importance of age into concrete regulatory practice. Another clear indicator of the move towards age-aware, age-appropriate child online safety is the growing focus on age assurance as an important component of policies and approaches that seek to create better, safer digital environments for children and young people. The European Commission recently published the age verification blueprint, a software solution with technical specifications providing an accessible, privacy-preserving, user-friendly, and interoperable framework for implementing age-based protections across the EU Member States. This means that providers of online platforms and services can apply, where necessary, age-based access restrictions to protect children online across the EU. The importance of the protection of minors online was once more emphasised by President of the European Commission Ursula von der Leyen during the 2025 State of the Union Address.

If you would like to learn more about the EU approach to age verification, you can read about it here on the European Commission's website and find out further details in this previous Knowledge hub insights article.

In line with these developments, this year's Safer Internet Forum zooms in on the 'age factor' and the role it plays in governing for safer and better online experiences for children and young people. This year's theme, "Why age matters: Protecting and empowering youth in the digital age", explores how age-appropriate online experiences can be ensured while respecting children's rights. Registration for online participation is now open.

Altogether, these developments and the collective effort behind them underline how much of a priority children's and young people's online safety is in the EU. The building blocks are now in place for an internet that recognises children not as passive users but as active rights-holders whose needs and capacities evolve with age. The challenge ahead is to ensure that these instruments and rules are implemented in ways that help children grow up protected, empowered and respected in the digital world.

Interested in more? 

If you are interested in more policy insight, explore the BIK Knowledge hub, including the BIK Policy monitor, the Rules and guidelines and Research and reports directories, where we draw together relevant policy instruments, reports and research informing the implementation of the BIK+ strategy at the national level.  

Find more BIK Knowledge hub insights articles here.

Finally, don't forget to read the BIK Age assurance guide on our platform for more information on how different methods of age assurance work.
