
Recognising child sexual abuse material (CSAM)

With September well underway, many of us are returning to more time spent on laptops and online. The INHOPE network of internet hotlines works hard to remove illegal content and to make sure that you and your children don't come across images and videos of child sexual abuse. To help in this fight, and to provide support in this back-to-school season, INHOPE wants to make sure that everyone is informed about what child sexual abuse material (CSAM) is and what kind of content should be reported. INHOPE also offers tips on how to prevent photos of children and young people from ending up in the hands of predators.
[Image: Mother taking a selfie with a young child]

What is CSAM?

Child sexual abuse material (CSAM) can be described as imagery or video of a person under the age of 18 engaged in or depicted as being engaged in explicit sexual activity.

Understanding the complexities of CSAM

Adult, legal pornographic content is often tagged explicitly with "teen", "barely 18" or similar terms, but on viewing it may be clear that the persons shown in the material are 18 years or older. In these cases the content is not CSAM and should not be reported to your national hotline.

If you are in any doubt about material you find, please report it to your national hotline, where it will be assessed by a professional hotline analyst.

In many countries, images of children who have been instructed to pose in sexualised ways, completely or partially undressed, and images which are focused on children's sexual organs are illegal and should be reported to your national hotline.

Because what is considered CSAM differs from country to country, INHOPE recommends finding out more about what should be reported in your country by visiting your national hotline's website.

If you are still in doubt, then it is always better to report it. Your report can then be checked by a professional who has the opportunity to end the cycle of abuse.

Think before you hashtag

Hashtags such as #cleankids, #splishsplash, #pottytraining and #naptime are being used by sexual predators to find pictures of children, according to a new study by the Child Rescue Coalition.

By indicating keywords or topics of interest, hashtagging an image enters it into a directory created by the social network and makes the image discoverable by other users. These hashtags may be used by parents when posting entirely innocent photos of their children, or by teenagers when posting selfies, but they could also be exploited by people with a sexual interest in children.

Innocent images are often copied, manipulated and misused. As a Dutch analyst explained, some of the CSAM they see regularly includes zoomed-in images of the genitals of children playing on the beach.

Images posted online stay online forever. INHOPE encourages everyone to think before they post. Ask yourself: would your child be happy with this image being online, available to the world, when they are 30 years old?

Check that the privacy settings on all your family members' social media accounts are appropriately configured, ideally at the strictest level. And be extremely cautious about applying hashtags to pictures of children, to minimise the chance of those images being found by people with a sexual interest in children.

Help us achieve our vision of an internet free of CSAM and keep your child safe this back-to-school season. Don't ignore it, report it.

