What is CSAM?
Child sexual abuse material (CSAM) is imagery or video of a person under the age of 18 engaged in, or depicted as being engaged in, explicit sexual activity.
Understanding the complexities of CSAM
Legal adult pornographic content is often explicitly tagged with "teen", "barely 18" or similar terms, but on viewing it is clear that the persons shown in the material are 18 years or older. In these cases the content is not CSAM and should not be reported to your national hotline.
If you are in any doubt about material you find, please report it to your national hotline, where it will be assessed by a professional hotline analyst.
In many countries, images of children who have been instructed to pose in sexualised ways, completely or partially undressed, and images which are focused on children's sexual organs are illegal and should be reported to your national hotline.
Because what is considered CSAM differs from country to country, INHOPE recommends visiting your national hotline's website to find out what should be reported in your country.
If you are still in doubt, then it is always better to report it. Your report can then be checked by a professional who has the opportunity to end the cycle of abuse.
Think before you hashtag
Hashtags such as #cleankids, #splishsplash, #pottytraining and #naptime are being used by sexual predators to find pictures of children, according to a new study by the Child Rescue Coalition.
A hashtag indicates a keyword or topic of interest; adding one to an image enters it into a directory created by the social network and makes the image discoverable by other users. Parents may use these hashtags when posting entirely innocent photos of their children, and teenagers when posting selfies, but they can also be exploited by people with a sexual interest in children.
Innocent images are often copied, manipulated and misused. As a Dutch analyst explained, some of the CSAM their hotline regularly sees includes zoomed-in images of the genitals of children playing on the beach.
Images posted online stay online forever. INHOPE encourages everyone to think before they post. Ask yourself: would your child be happy with this image being online, available to the world, when they are 30 years old?
Check that the privacy settings on all your family members' social media accounts are appropriately configured, ideally at the most restrictive level. And be extremely cautious in applying hashtags to pictures of children, to minimise the chance of those images being found by people with a sexual interest in children.
Help us achieve our vision of an internet free of CSAM and keep your child safe this back-to-school season. Don't ignore it, report it.