The Digital Services Act (DSA) obliges large social networks to provide greater security and transparency on the internet. The Safer Internet Centre Austria provides an overview of what this means in Austria and Europe for the protection of children and young people on Instagram, TikTok and other platforms.
What is the Digital Services Act?
The Digital Services Act (DSA) is a regulation of the European Union. The aim of the regulation is to create a safe and trustworthy online environment. The DSA is intended to ensure that very large online platforms such as Instagram, Snapchat, TikTok and YouTube protect the rights of users and prevent the dissemination of prohibited or inappropriate content. The DSA obliges social networks, for example, to be more transparent about recommendation algorithms, to quickly remove harmful content and to have uniform complaint procedures throughout Europe.
The DSA entered into force in November 2022, and its rules became fully applicable in February 2024. The effects for users should gradually become noticeable.
How does the DSA protect minors online?
Social networks must ensure that minors are not exposed to inappropriate content, that their data and privacy are protected, that they do not receive personalised advertising, that information is clear and not misleading, and that they are better protected from online dangers overall. What this means in practice is described below:
Protection from inappropriate content
Children and young people should always feel safe online and be protected from inappropriate content and contacts. Inappropriate content or contacts are, for example, those that could cause anger, sadness, worry or fear.
The Digital Services Act (DSA) aims to ensure that online content is appropriate for the age and interests of children and young people. Online platforms are therefore obliged to identify risks at an early stage and take appropriate measures. Platforms should also ensure that their services can only be used by users above the permitted minimum age.
Protection from online dangers
Children and young people in particular should be protected from dangers and risks online. These include, for example, harassment, cyberbullying, misinformation and people pretending to be someone else.
In addition, platforms must explain the dangers that exist online and take measures to prevent them.
Data protection and privacy
We all have the right to have our personal data protected – including on the internet. This data must be stored securely and protected from unauthorised access. It must not be falsified or passed on without consent.
Protection against personalised advertising
Countless pieces of data are used for personalised advertising – what we like, who we follow and which websites we visit. The traces we leave behind online reveal our personal interests. This information is used for targeted advertising.
Comprehensible information and no misleading design
Platforms should be designed responsibly. They must not be addictive and must refrain from using misleading design elements (so-called dark patterns) that are intended to entice users to perform certain actions. Above all, the terms of use and privacy policies of websites and platforms must be worded in such a way that children and young people can understand them. This means that they must be written in clear, simple language without legal jargon or complicated wording.
The protection of minors on the internet is primarily regulated in Article 28 of the DSA. To implement these requirements, the EU Commission proposes specific measures in its guidelines on the protection of minors.
Do you want to learn more about how the DSA protects minors online by design?
Then look no further than the DSAforYOUth campaign!
In particular, the DSAforYOUth toolkit is the central hub for all information and easy-to-read materials, including the DSAforYOUth family-friendly booklet, now available in all EU languages and in Norwegian.
Is the Digital Services Act effective?
The Digital Services Act (DSA) is an important step towards putting pressure on online platforms and obliging them to take action. If an online service does not comply with the requirements, the EU can impose heavy fines – up to 6% of the company's global annual turnover.
Initial measures are gradually becoming visible, but implementation is still in its early stages on many platforms – especially when it comes to protecting children and young people.
To ensure that improvements are also noticeable in everyday life, the rules must be consistently implemented, regularly reviewed and further developed as necessary.
The implementation of the DSA is strictly monitored – both by the European Commission and by the Member States. Each country has competent authorities that check whether platforms are fulfilling their obligations.
A social media campaign also took place on Instagram to inform the wider public about the DSA.
Find more information about the work of the Austrian Safer Internet Centre, including its awareness raising, helpline, hotline, and youth participation services, or find similar information for other Safer Internet Centres throughout Europe.
This article was originally published on the website of the Austrian Safer Internet Centre and is replicated here with permission. Read the original article here.