How the Digital Services Act protects minors on social networks

The Digital Services Act (DSA) obliges large social networks to provide greater security and transparency on the internet. The Safer Internet Centre Austria provides an overview of what this means in Austria and Europe for the protection of children and young people on Instagram, TikTok and other platforms.

[Image: a smartphone screen displaying the DSA, surrounded by the stars of the EU flag]

What is the Digital Services Act?  

The Digital Services Act (DSA) is a regulation of the European Union. The aim of the regulation is to create a safe and trustworthy online environment. The DSA is intended to ensure that very large online platforms such as Instagram, Snapchat, TikTok and YouTube protect the rights of users and prevent the dissemination of prohibited or inappropriate content. The DSA obliges social networks, for example, to be more transparent about recommendation algorithms, to quickly remove harmful content and to have uniform complaint procedures throughout Europe.  

The DSA came into force in November 2022. The measures had to be implemented by February 2024. The effects for users should gradually become noticeable. 

How does the DSA protect minors online?  

Social networks must ensure that minors are not exposed to inappropriate content, that their data and privacy are protected, that they do not receive personalised advertising, that information is clear and not misleading, and that they are better protected from online dangers overall. What this means in practice is described below: 

Protection from inappropriate content  

Children and young people should always feel safe online and be protected from inappropriate content and contacts. Inappropriate content or contacts are, for example, those that could cause anger, sadness, worry or fear. 

The Digital Services Act (DSA) aims to ensure that online content is appropriate for the age and interests of children and young people. Online platforms are therefore obliged to identify risks at an early stage and take appropriate measures. Platforms should also ensure that social networks can only be used by users above the permitted age. 

In practice, this means: 
  • Social networks must design their recommendation algorithms in such a way that children and young people do not come into contact with harmful content (a simplified filtering sketch follows this list).
  • Platforms must offer easily accessible reporting functions, for example via a clearly visible report button. Platforms are obliged to respond promptly and provide feedback to users. 
  • If a platform does not respond quickly enough or if the content is particularly dangerous, so-called trusted flaggers can offer support. Trusted flaggers are independent organisations that are in direct contact with the platforms and can speed up the removal of problematic content. In Austria, Rat auf Draht and the Internet Ombudsstelle take on this role as trusted flaggers. 
  • Social networks offer settings that minimise or, in the best case, prevent the risk of seeing inappropriate content. Examples include requiring consent before users can be added to chat groups, and allowing users to reset the recommendation algorithm or restrict the content they see. 
  • Platforms must verify the age of their users. Currently, this is done through self-disclosure and the evaluation of user behaviour (such as language, content, usage times) to obtain clues about the actual age.  
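The following Python sketch illustrates the kind of pre-filtering described in the first bullet above. It is a simplified illustration only, not any platform's actual recommender; the Item fields, risk labels and age thresholds are assumptions made for the example.

```python
# Minimal sketch (not real platform code) of excluding age-inappropriate items
# from a minor's feed before the ranking step. All field names are illustrative.
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    risk_labels: set[str]   # e.g. {"violence", "self_harm"} assigned by content moderation
    min_age: int            # minimum recommended age for this item

HARMFUL_FOR_MINORS = {"violence", "self_harm", "gambling", "adult"}

def filter_feed_for_minor(candidates: list[Item], user_age: int) -> list[Item]:
    """Drop items that are age-restricted or carry risk labels unsuitable for minors."""
    return [
        item for item in candidates
        if user_age >= item.min_age and not (item.risk_labels & HARMFUL_FOR_MINORS)
    ]

# Example: a 14-year-old's candidate list is filtered before ranking.
feed = filter_feed_for_minor(
    [Item("a", set(), 0), Item("b", {"gambling"}, 0), Item("c", set(), 18)],
    user_age=14,
)
print([item.item_id for item in feed])  # ['a']
```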

Protection from online dangers  

Children and young people in particular should be protected from dangers and risks online. These include, for example, harassment, cyberbullying, misinformation and people pretending to be someone else. 

In addition, platforms must explain the dangers that exist online and take measures to prevent them. 

In practice, this means:  
  • Platforms offer settings and tools that parents can use to supervise their children's use or block certain functions. 
  • To protect minors, certain settings are provided by default, such as "private" accounts to prevent contact from strangers, restricted contact options or a deactivated livestream function (see the sketch after this list). 
  • Platforms offer settings to mute or block users. 
  • Platforms work with fact-checking organisations and provide warnings for questionable or false content. 
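As a rough illustration of the "safe by default" idea in the bullets above, the following Python sketch applies protective defaults to an account flagged as belonging to a minor. The setting names are hypothetical and are not taken from any real platform.

```python
# Illustrative sketch only: protective defaults for a minor's account.
DEFAULTS_FOR_MINORS = {
    "profile_visibility": "private",              # posts not publicly accessible
    "who_can_message": "approved_contacts",       # restricted contact options
    "who_can_add_to_groups": "nobody_without_consent",
    "livestreaming_enabled": False,               # livestream function off by default
}

def apply_safety_defaults(account: dict) -> dict:
    """Overlay protective defaults on a minor's account without overriding explicit choices."""
    if account.get("is_minor"):
        for key, value in DEFAULTS_FOR_MINORS.items():
            account.setdefault(key, value)        # only fill settings the user has not set
    return account

print(apply_safety_defaults({"is_minor": True}))
```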

Data protection and privacy  

We all have the right to have our personal data protected – including on the internet. This data must be stored securely and protected from unauthorised access. It must not be falsified or passed on without consent. 

In practice, this means:  
  • There should be default settings to protect privacy. For example, profiles of minors should be set to "private" by default so that their posts are not publicly accessible. In addition, app permissions should be set to privacy-friendly defaults so that there is no automatic access to the microphone, location, camera, etc. 
  • Age verification must be designed in a way that protects data privacy. One option would be to integrate the European digital identity (eID). This would allow platforms to verify the accuracy of the age provided without requiring the submission of identity documents or data. Social networks should not collect any additional data for the purpose of age verification (a simplified verification sketch follows this list). 
  • Sensitive data must not be used for advertising or content recommendations.  
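To illustrate the data-minimising approach described above, here is a hedged Python sketch in which a platform checks a signed "over 16" attestation and stores nothing else. The message format, the Ed25519 signature scheme and the locally generated issuer key are assumptions made for the example; they are not the actual European eID specification.

```python
# Sketch of data-minimising age verification: verify a minimal signed age claim,
# learn nothing else about the user. Requires the "cryptography" package.
import json
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# In reality the issuer's public key would come from a trust list, not be generated here.
issuer_key = ed25519.Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

# The wallet discloses only the claim needed: the holder is over 16. No name, no birth date.
attestation = json.dumps({"claim": "age_over_16", "value": True}).encode()
signature = issuer_key.sign(attestation)

def is_over_16(attestation: bytes, signature: bytes) -> bool:
    """Accept the user only if the minimal age claim is signed by the trusted issuer."""
    try:
        issuer_public_key.verify(signature, attestation)
    except InvalidSignature:
        return False
    claim = json.loads(attestation)
    return claim.get("claim") == "age_over_16" and claim.get("value") is True

print(is_over_16(attestation, signature))  # True
```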

Protection against personalised advertising  

Countless pieces of data are used for personalised advertising – what we like, who we follow and which websites we visit. The traces we leave behind online reveal our personal interests. This information is used for targeted advertising.  

In practice, this means:  
  • Online platforms may not show personalised advertising to minors, i.e. advertising based on interests, search history or other usage data (illustrated in the sketch after this list). 
  • Social networks are obliged to disclose transparently how advertisements are placed. This includes information about advertising campaigns, target groups and the algorithms used. This should enable researchers, experts and authorities to identify risks such as misinformation, misleading advertising or prohibited content in advertisements.  
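The following Python sketch illustrates the rule in the first bullet above: when the viewer is, or may be, a minor, ad selection ignores profile and behavioural data and falls back to the page context. Function and field names are illustrative assumptions, not any platform's real ad-serving interface.

```python
# Minimal sketch: no interest-based ads for minors, contextual ads only.
def select_ad(user: dict, page_context: str, inventory: dict) -> str:
    if user.get("is_minor", True):   # err on the safe side if the age is unknown
        return inventory["contextual"].get(page_context, inventory["generic"])
    # Adults may still receive interest-based ads (subject to consent rules).
    return inventory["personalised"].get(user.get("top_interest"), inventory["generic"])

inventory = {
    "contextual": {"football_news": "sports_gear_ad"},
    "personalised": {"gaming": "game_console_ad"},
    "generic": "public_service_ad",
}
print(select_ad({"is_minor": True, "top_interest": "gaming"}, "football_news", inventory))
# -> 'sports_gear_ad' (chosen from the page context, not from the user's profile)
```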

Comprehensible information and no misleading design  

Platforms should be designed responsibly. They must not be addictive and must refrain from using misleading design elements (so-called dark patterns) that are intended to entice users to perform certain actions. Above all, the terms of use and privacy policies of websites and platforms must be worded in such a way that children and young people can understand them. This means that they must be written in clear, simple language without legal jargon or complicated wording.  

In practice, this means:
  • Design elements that increase usage time should be removed. 
  • Features that prolong the time spent on platforms or lure users to the platform should be deactivated, for example digital rewards for regular use or communication, autoplay or push notifications (a simple sketch of such defaults follows this list). 
  • Terms of use and privacy policies should be presented in an understandable form. 
  • Children must be informed in an easily-understandable way about their rights, obligations and the possible risks of the platform. 
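As a simple illustration of deactivating engagement-prolonging features by default, the sketch below switches off autoplay, reward streaks, push notifications and infinite scroll for minors. The flag names are hypothetical, not real platform settings.

```python
# Illustrative sketch: attention-grabbing features start disabled for minors.
ENGAGEMENT_FEATURES = ("autoplay", "streak_rewards", "push_notifications", "infinite_scroll")

def default_feature_flags(is_minor: bool) -> dict:
    """Minors get engagement-prolonging features disabled unless they actively opt in."""
    return {feature: (not is_minor) for feature in ENGAGEMENT_FEATURES}

print(default_feature_flags(is_minor=True))
# {'autoplay': False, 'streak_rewards': False, 'push_notifications': False, 'infinite_scroll': False}
```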

The protection of minors on the internet is primarily regulated in Article 28 of the DSA. To implement these requirements, the EU Commission proposes specific measures in its guidelines on the protection of minors.  

Do you want to learn more about how the DSA protects minors online by design? 
Then look no further than the DSAforYOUth campaign. In particular, the DSAforYOUth toolkit is the central hub for all information and easy-to-read materials, including the DSAforYOUth family-friendly booklet, now available in all EU languages and in Norwegian.

Is the Digital Services Act effective?  

The Digital Services Act (DSA) is an important step towards putting pressure on online platforms and obliging them to take action. If an online service does not comply with the requirements, the EU can impose heavy fines – up to 6% of the company's global annual turnover.  
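For illustration only: a company with a hypothetical global annual turnover of €50 billion could therefore face a fine of up to €3 billion (6% of €50 billion).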

Initial measures are gradually becoming visible, but implementation on many platforms is still at an early stage, especially when it comes to protecting children and young people. 

To ensure that improvements are also noticeable in everyday life, the rules must be consistently implemented, regularly reviewed and further developed as necessary. 

The implementation of the DSA is strictly monitored – both by the European Commission and by the Member States. Each country has competent authorities that check whether platforms are fulfilling their obligations. 

A social media campaign also took place on Instagram to inform the wider public about the DSA.  

Find more information about the work of the Austrian Safer Internet Centre, including its awareness raising, helpline, hotline, and youth participation services, or find similar information for other Safer Internet Centres throughout Europe.

This article was originally published on the website of the Austrian Safer Internet Centre and is replicated here with permission. Read the original article here.

