Introduction
The proliferation of online child sexual abuse and exploitation presents a serious threat to children globally. In 2023, the US National Center for Missing and Exploited Children (NCMEC) received over 36.2 million reports of suspected child sexual exploitation, a 72% increase over the more than 21 million reports it received in 2020.
One of the forms of sexual violence against children that has increased dramatically in recent years is the creation, distribution, and viewing of child sexual abuse material (CSAM), that is, images, videos, live-streams, and any other material depicting real or simulated child sexual abuse and/or exploitation. This type of sexual violence is particularly traumatic for victims, since it contributes to their revictimization every time the material is viewed or shared. Such material constitutes a permanent record of the abuse and can have long-lasting consequences for victims and survivors.
Recent research conducted by Protect Children reveals that CSAM is easily accessible on the surface web, particularly on pornography websites and social media platforms. This research has shown that offenders are using popular social media platforms and encrypted messaging applications to search for, view, and disseminate CSAM. Furthermore, perpetrators are misusing different technologies and platforms, such as Artificial Intelligence (AI) and Extended Reality (XR), not only to create and share CSAM, but also to sexually abuse and exploit children online in other ways, such as grooming and sexual extortion. Emerging technologies are thus facilitating online harms and enabling new and more severe forms of online child sexual abuse and exploitation.
This Statement analyses how technologies are misused to facilitate child sexual abuse and exploitation, answering Questions 1, 3, 4 and 7 of the call for input. Subsequently, in Annex 1, we provide actionable recommendations directed at relevant stakeholders, including the tech industry, to prevent the proliferation of CSAM on the Internet and effectively address online child sexual abuse and exploitation, thus answering Questions 2, 5, 7, 8 and 9 of the call for input.