Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of child sexual abuse material (CSAM). Underlying every sexually explicit image or video of a child is abuse, rape, molestation, and/or exploitation. The production of CSAM creates a permanent record of the child's victimization. Due to rapid technological change, online child sexual exploitation offenses are increasing in scale and complexity. Individuals who seek to sexually exploit children through CSAM can do so from anywhere in the world using digital devices and the internet. Modern smartphones are the ideal child exploitation tool for offenders: they can be used to photograph, record, or watch live child sexual abuse; store CSAM on the device; access CSAM stored remotely; connect with victims and other offenders; and distribute and receive CSAM through an endless variety of applications. The device itself and the applications often cloak this criminal activity with encryption.
The market for CSAM among individuals with a sexual interest in children drives the demand for new and more egregious images and videos. The push for new CSAM results in the continued abuse and exploitation of child victims, and the abuse of new children every day. When these images and videos are posted and disseminated online, the victimization continues in perpetuity. Children often suffer a lifetime of re-victimization knowing the documentation of their sexual abuse is on the internet, available for others to access forever.
Increasingly, perpetrators are grooming minors to engage in sexually explicit conduct online. This is distinct from, but related to, CSAM produced in person by offenders. Offenders engaged in either type of production have been known to take advantage of multiple vulnerabilities of a child, including a minor's fear of getting in trouble with their parents or guardians, school, or law enforcement. This can result in the minor being extorted or blackmailed into creating additional CSAM, or paying a ransom, to prevent images from being distributed to their peer networks.1 Offenders tell victims they will call the police and that the victims will get in trouble for the sexually explicit content they have already created and sent to the offender. Even families who become aware of the abuse may decline to report the crime out of concern that the child will get into trouble with law enforcement, preventing investigators from identifying and stopping the offender.
[Figure: National Center for Missing & Exploited Children, Child Sexual Abuse Material Case Requests from Law Enforcement in 2020]
CSAM is readily available through virtually every internet technology, including social networking platforms, file-sharing sites, gaming devices, and mobile apps. This availability has driven unprecedented growth in the volume of reports submitted to the CyberTipline operated by the National Center for Missing & Exploited Children (NCMEC). The CyberTipline provides a single interface where private citizens and companies, such as Electronic Service Providers (ESPs), can report suspected online child exploitation. From 2013 to 2021, the number of CyberTipline reports received by NCMEC skyrocketed from 500,000 to almost 30 million. On three occasions in that span, the volume of CyberTipline reports doubled or nearly doubled from one year to the next. In 2015, the number of CyberTipline reports (4.4 million) was four times the prior year's total.3 In 2021, the nearly 30 million CyberTipline reports received by NCMEC represented an increase of approximately 35% over the 2020 total (almost 22 million).4 Though NCMEC is only one reporting channel in one country, its CyberTipline numbers evidence the staggering global scale of CSAM online.
Data from the Canadian Centre for Child Protection (C3P) paints a similar picture. C3P operates Project Arachnid, an innovative tool to combat the growing proliferation of CSAM on the internet. Project Arachnid's platform crawls links on sites on the open web to look for publicly available CSAM.5 Once such imagery is detected, a notice requesting removal is sent to the provider hosting the content. Between Project Arachnid's launch in 2016 and October 1, 2021, over nine million notices were sent to providers about CSAM detected on their platforms.6 Project Arachnid's numbers, however, largely reflect CSAM stored or traded on the open web. On the Dark Web, where anonymity and encryption make it harder to trace CSAM perpetrators, a single active website dedicated to the sexual abuse of children had over 2.5 million registered users as of June 2021.