Throughout 2023, fast content formats and recommendation systems were the main theme of three seminars for 250 eighth-grade students in three Danish cities, initiated by the Center for Digital Youth Care and the Media Council for Children and Young People (the Danish Safer Internet Centre). The steady growth of algorithm-driven content aimed at young people, advances in artificial intelligence, and the new EU regulations on digital services and online platforms shaped the theme of the seminars. Activities and input from the young participants will be compiled into educational materials.
"Don't tamper with our feed"
The Media Council for Children and Young People and the Center for Digital Youth Care generally advocate for stronger regulation of platform business models, supporting demands for greater transparency and the option to deactivate personalised feeds. However, the three seminars immediately showed that conveying the importance of deactivating personal feeds would be a significant pedagogical task. During the seminars, a dominant opinion among the young people was that they did not want more adult intervention in fast content formats like TikTok, Shorts, and Reels.
"It's really cool that it knows me so well. Why would I opt out?"
as several young people put it.
Most of them consider it potentially detrimental to the content if it became more "safe and adult-controlled". At the same time, the young people take on a great deal of personal responsibility for the content they receive. When algorithms serve children content that crosses a line, the prevalent response is that "you can just click 'not interested'" or quickly scroll past. Several young people point out that it matters to them whether the people shown in their feed shared the content themselves or whether others did. To form an opinion on this, they often quickly share the content with friends for discussion. Only a few see merit in reporting content or in requiring platforms to offer less boundary-pushing material.
Everyone said no when we asked whether they would voluntarily opt out of the recommendation system. Algorithmic control is, on the one hand, diffuse and hard to relate to, and on the other, something many young people describe as a finely tuned mechanism that serves the content that fits them best. In principle, they understand how personalised content works, but it is difficult to grasp exactly what happens in the engine room, and what its commercial implications are.
New regulations can create clarity about the use of algorithms
With the EU's Digital Services Act (DSA), major online platforms face new legal requirements on transparency around their algorithms. The DSA's obligations for the largest online platforms took effect on 25 August 2023, aiming to create a safer and more accountable online environment and to protect users' rights, with a particular focus on children and young people. This should bring much greater clarity about how algorithms work and what they are used for. The new rules also require platforms to offer a "non-personalised feed", in which the content users see is not based on personal information, search history, or preferences.
Soon, an easily accessible pixie-book summary of the key points of the DSA, aimed at both children and adults, will be released. It will be available through the Media Council's and the Center for Digital Youth Care's websites.
Educational material with knowledge, dilemmas, and opinions
The upcoming educational material will address young people's attitudes and inform them about the importance of understanding the underlying mechanisms behind personalised recommendation algorithms. This will also include information about EU legislation regarding digital services and online platforms and its implications, especially for children and young people. Students will be presented with information, dilemmas, and opinions inviting classroom discussion.
The material targets middle school students and will be freely available (in Danish) on the Media Council's website and the BIK platform resource gallery by the beginning of 2024.
The Media Council and the Center for Digital Youth Care would like to thank all participating young people for their input on the educational material and crime prevention workers from the three Danish cities for collaborating on the youth seminars.
Find out more about Safer Internet Day in Denmark. Alternatively, find more information about the work of the Danish Safer Internet Centre, including their awareness raising, helpline, hotline and youth participation services – or find similar information for other Safer Internet Centres throughout Europe.