Audiovisual Media Services Directive

The rules that protect children from seeing illegal or harmful audiovisual content and inappropriate advertising on television also apply to video-sharing platforms, such as YouTube. They cover user-generated videos and, for example, advertisements promoting alcohol, tobacco, or food and drinks high in fat, salt or sugar. There are also rules on product placement, television advertising and teleshopping in and around children’s programmes. The obligations for video-sharing platforms (Article 28b of the AVMSD) aim to give all users stronger protection from certain illegal content (e.g. terrorism, child pornography). Platforms must protect children from audiovisual content which may ‘impair their physical, mental or moral development’: that is, harmful videos with violent, scary or other content not appropriate for children.

To protect children, video-sharing platforms shall, for example, offer easy ways for users to rate illegal or harmful content, easy ways for users to flag and report such content, as well as parental controls and age verification systems.

When taking measures to protect children, service providers may need to handle children’s personal data. They cannot use this data for commercial purposes such as direct marketing, profiling or advertising based on what children do or do not like or watch online.

(Source: EC Compendium of BIK-related legislation)

Status: Implemented | Implementation period start: 2018 | Implementation period finish: Ongoing
Record created: 01 March 2024
  • Go to document source
  • EU: directive
  • BIK+ strategy pillar 1 - protection
  • protection, video, potential-harmful-content, minors
