In 2025, EU co-funded hotlines received over 4.5 million reports of potentially illegal online content, highlighting both the scale of online harms reported by the public and the essential role hotlines play in identifying and addressing them.
Data collected through the BIK Hotline Observatory provides an overview of reporting patterns, emerging challenges, and operational developments across the European hotline network. Of the 4,546,694 reports received across the year, 2,551,833 related to child sexual abuse material (CSAM) or child sexual exploitation material (CSEM), demonstrating the continued prevalence of this form of illegal content online.
Trusted Flaggers under the Digital Services Act
A major development during 2025 has been the continued rollout of Trusted Flagger designations under the Digital Services Act. Trusted Flaggers are entities recognised by national Digital Services Coordinators and granted priority channels to notify platforms about illegal content.
Over the course of the year, the number of hotlines formally designated as Trusted Flaggers increased, reaching 13 organisations across the network. These designations strengthen structured reporting channels between hotlines and online platforms and enable faster notification and removal processes for illegal content.
As implementation progressed, Trusted Flagger activity increased significantly, with 26,096 notices submitted to digital service providers. Hotlines also reported shifts in reporting patterns following their designation, including a noticeable rise in international submissions. In many cases, reporters referenced the Digital Services Act when submitting reports and requesting action.
Several organisations also highlighted an increase in reports relating to messaging platforms such as Telegram and Discord, including cases involving sextortion and other forms of online sexual exploitation.
Reporting volumes and operational trends
Reporting volumes fluctuated significantly across the year, with exceptionally high numbers of reports in the early quarters, driven in part by large-scale reporting activity and changes in how offenders distribute material online.
One observed shift has been the increasing practice of uploading individual frames extracted from videos rather than composite images containing multiple frames. This has significantly increased the number of reportable items appearing on individual pages and contributed to the rise in reported URLs.
At the same time, improvements to automated analysis tools used by hotlines have increased the speed at which reports can be processed and analysed, enabling more efficient identification of illegal content.
Child sexual abuse material remains a major focus
In total, 2,551,833 CSAM or CSEM related reports were recorded across the year. These reports can include:
- self-generated CSAM or CSEM
- sexualised posing, sexualised child modelling, or other inappropriate child-related images
- computer-generated or virtual depictions such as manga, drawings, or animations where these are illegal under national legislation
- text depictions of CSAM.
Platforms and hosting environments
Websites continued to be the primary location for illegal content reported to hotlines. However, analysts also encountered material hosted across a wide range of online environments, including file-sharing services, forums, online gaming platforms, and messaging services.
Hotlines highlighted ongoing challenges in assessing reports relating to closed or private groups, particularly on encrypted messaging platforms. In these cases, analysts may not have direct access to the reported material and must rely on contextual information provided by the reporting party or collaborate with platforms and law enforcement authorities to assess the content.
Cooperation with law enforcement and industry
Throughout 2025, hotlines continued to work closely with law enforcement agencies, internet service providers, and online platforms to ensure that illegal material can be assessed and removed as quickly as possible.
Across the year, 343,252 reports were forwarded to law enforcement authorities and 129,666 reports were forwarded to internet service providers for further action. Hotlines also uploaded 3,403,749 reports into ICCAM, the secure platform that enables cross-border exchange of reports between hotlines worldwide.
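To put these flows in proportion, a short calculation using the totals reported above shows each figure as a share of the 4,546,694 reports received (note that these categories are not mutually exclusive, so the shares need not sum to 100%):

```python
# Figures from the 2025 BIK Hotline Observatory totals quoted above.
TOTAL_REPORTS = 4_546_694
FORWARDED_TO_LAW_ENFORCEMENT = 343_252
FORWARDED_TO_ISPS = 129_666
UPLOADED_TO_ICCAM = 3_403_749

def share(part: int, total: int = TOTAL_REPORTS) -> float:
    """Return `part` as a percentage of `total`, rounded to one decimal place."""
    return round(100 * part / total, 1)

print(f"Forwarded to law enforcement: {share(FORWARDED_TO_LAW_ENFORCEMENT)}%")
print(f"Forwarded to ISPs:            {share(FORWARDED_TO_ISPS)}%")
print(f"Uploaded to ICCAM:            {share(UPLOADED_TO_ICCAM)}%")
```

By this measure, roughly three quarters of all reports were exchanged through ICCAM, while a smaller fraction required direct forwarding to law enforcement or internet service providers.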
Supporting a safer digital environment
The findings from the 2025 Hotline Observatory highlight the essential role hotlines play within the European digital safety ecosystem. Through the identification, assessment, and reporting of illegal content, hotlines support faster removal of harmful material and strengthen cooperation between civil society, industry, and law enforcement.
As regulatory frameworks such as the Digital Services Act continue to take effect, the work of hotlines will remain central to ensuring a safer digital environment for children and young people across Europe.