Recommendations for a cybersecure back-to-school 2025
Get ready for a cybersecure back-to-school!
The return to school for younger students is an opportunity to commit to good habits and to instil in them the proper use of technology both inside and outside the classroom. Families and educators return to the routine alongside them, and it is in their hands to teach children and provide them with a proper digital education.
Guidelines – Age Classification of Audiovisual Programmes
The "Guidelines – Age Classification of Audiovisual Programmes" by Medietilsynet (Norwegian Media Authority) outline the process for classifying audiovisual programmes to protect minors from harmful content. Mandated by the Audiovisual Programme Act, all programmes made available to the general public, with specific exemptions, require an age limit. The assessment considers a programme's assumed impact on various age groups, evaluating its expression (e.g., mood, characters) and content (e.g., violence, sexuality, difficult themes).
Audiovisual Media Services Directive
The rules to protect children from seeing illegal or harmful audiovisual content and inappropriate advertising on television also apply to video-sharing platforms, like YouTube. These cover user-generated videos and, for example, advertisements promoting alcohol, tobacco, and food and drinks high in fat, salt or sugar. There are also rules on product placement, television advertising and teleshopping in and around children's programmes. The obligations for video-sharing platforms (Article 28b of AVMSD) aim to protect all users even more from certain illegal content.
Digital Services Act (DSA) Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act)
The Digital Services Act aims to create a safer digital space where the fundamental rights of users are protected. Under Article 28, providers of online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors on their service. The Digital Services Act imposes on all providers of digital services obligations to protect minors from illegal content online. They must write their terms and conditions in a way that children can understand.
Technical child protection on the internet
Although promoting media literacy among young users is the top priority, the use of technical measures on the digital devices of younger children can be helpful. The range of technical child protection measures is vast, and they typically combine various functions to keep inappropriate content away from minors. This usually involves setting usage times, filtering content, and blocking certain applications.
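The combination of functions described above (usage-time limits and content filtering) can be sketched in a few lines. This is only a toy illustration of how such measures work in principle; the keyword list and time window below are hypothetical examples, not part of any real parental-control product:

```python
from datetime import time

# Hypothetical configuration for a toy parental-control sketch.
BLOCKED_KEYWORDS = {"gambling", "violence"}   # example content-filter list
ALLOWED_WINDOW = (time(7, 0), time(20, 0))    # example daily usage window

def within_usage_window(now: time) -> bool:
    """Allow device use only between the configured start and end times."""
    start, end = ALLOWED_WINDOW
    return start <= now <= end

def is_blocked(page_text: str) -> bool:
    """Block a page if it contains any keyword on the filter list."""
    words = page_text.lower().split()
    return any(word in BLOCKED_KEYWORDS for word in words)

print(within_usage_window(time(21, 30)))             # False: outside the window
print(is_blocked("latest news on online gambling"))  # True: keyword match
```

Real products combine such checks with application blocking and per-profile settings, but the underlying logic (compare against a configured policy, then allow or deny) is the same.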
“Can beauty ideals online harm my child?”
The issue of beauty ideals online and their impact on children and young people is complex. Research shows that exposure to beauty and body ideals on social media can lead to a poorer body image, to varying degrees. Research also shows that children (ages 10-18) who spend a lot of time on social media tend to be more unhappy with their bodies and suffer from eating disorders to a greater extent than others of their age. In this short parental guide you can read more about the risks and get seven tips on how to talk to your child and give support.