
A One-Year 28% Increase in Reports of Online Child Sexual Exploitation - A Good Thing?

Updated: Jun 8, 2022


[Chart: National Center for Missing and Exploited Children CyberTipline reports]

Short answer: yes, but only in the sense that more platforms are getting better at proactively detecting, removing, and reporting the online exploitation of children, including child sexual abuse material (CSAM), child sex trafficking, and online enticement. When platforms don't report the online exploitation of children, it's not because it isn't happening; it means it simply isn't being detected. Likewise, a low number of reports relative to a platform's user base may signal that exploitation is only being discovered inadvertently and reported reactively. Reports need to keep rising until every platform is proactively detecting the online exploitation of children at scale. More reports mean more detection, and more detection can change lives. For now, a yearly rise in reports is an indicator that the anti-human trafficking community's collective efforts are working. Hopefully, the day will soon come when reports start to fall because proactive detection and removal of online child exploitation has been universally adopted.

Every piece of CSAM is documentation of a digital crime scene. Behind every CSAM file is a child victim in need of support, or a survivor whose trauma continues to spread online. Technology enables the sexual abuse of children, and Thorn has learned that if a platform has an upload button, it will be used to host CSAM, regardless of the platform's size or intended use. That is one reason there is a direct correlation between the growth in technology use and the growth in CSAM.

The National Center for Missing and Exploited Children (NCMEC) received 21.7 million reports about multiple forms of online child sexual exploitation via its CyberTipline in 2020, an increase of 28% from 2019. The number of reports increased for every category.

In the United States, only NCMEC and law enforcement are authorized to maintain known CSAM files, in the service of finding the children depicted, investigating cases, and prosecuting offenders. In some cases, a single report has led to the removal of children from harm. To eliminate CSAM from the internet, we have to identify more of this content.

U.S.-based Electronic Service Providers (ESPs) report instances of apparent child pornography they become aware of on their systems to NCMEC's CyberTipline. To date, over 1,400 companies are registered to make reports, and in 2020, 21.4 million of the 21.7 million total reports came from ESPs. A higher number of reports can be indicative of a variety of things, including a larger user base or more robust efforts by an ESP to identify and remove abusive content from its platforms. These reports are critical to removing children from harmful situations and stopping further abuse.

NCMEC is actively engaged in policy discussions with many countries to ensure that child sexual abuse detection tools are allowed to remain in use. Universal adoption of proactive detection, removal, and reporting of online child exploitation is how we'll build a world where every child can simply be a kid. That's how we'll ensure sexually abused children are identified and rescued in days or hours instead of months or years.


