77% of CSAM assessed in 2020 depicted children under 13 years of age.
93% of victims depicted in reports assessed during 2020 were girls and 5% were boys.
Netherlands - 21%
Austria - 21%
United Kingdom - 12%
Canada - 10%
Colombia - 9%
Germany - 5%
Poland, United States, Ireland - 3% each
Czech Republic, Finland - 2% each
The pie chart shows the variety of sites used to store CSAM.
Number of exchanged content URLs: 1,038,268
Number of illegal content URLs: 267,192
CSAM stands for Child Sexual Abuse Material.
Hosting provider: a company that hosts internet content.
ISP stands for Internet Service Provider: an organisation that provides connectivity to the internet. The term also covers ESPs (Electronic Service Providers).
ICCAM is INHOPE’s secure software solution for collecting, exchanging and categorising reports of child sexual abuse material. It is used by INHOPE hotlines in different jurisdictions (countries) and by INTERPOL. The name ICCAM is derived from the phrase ‘I see Child Abuse Material.’
A hotline enables the public to anonymously report online material they suspect may be illegal. A hotline analyst will investigate the report and if confirmed illegal, they act to have the content removed from the internet as rapidly as possible.
A ‘Report’ (in the context of a hotline) is a URL that has been reported to a hotline by a member of the public or industry and that contains potentially illegal images or videos. One report can contain any number of images and videos; a single report often contains over a thousand CSAM items.
Assessed images & videos: In order to determine the illegality of images and videos on a particular URL, an analyst has to review the content that is visible on the reported URL. Assessed images and videos refers to all the images and videos that have been found on the reported URL.
‘Content (CSAM) removed’ is the timestamp recorded in ICCAM when a hotline confirms that the instance of the image and/or video has been removed from the internet.
Forwarded ICCAM Reports: When a hotline receives a report from the public or industry, an analyst reviews the report to determine whether it is illegal. If deemed to be illegal CSAM, the analyst traces its hosting location and then instantly forwards the URL to the hotline in the hosting country. The forwarding takes place within ICCAM (INHOPE’s secure platform).
‘Illegal images & videos’ refers only to content that has been classified as illegal by an INHOPE member hotline.
LEA stands for Law Enforcement Agency
A Notice and Takedown order is a procedure for asking a Hosting Provider (HP) or search engine to immediately remove or disable access to illegal, irrelevant or outdated information hosted on their servers.
A review of 2020, looking at the INHOPE Network of Hotlines, the environment it operates in, and its global impact on CSAM.
The Terminology Guidelines, dubbed the ‘Luxembourg Guidelines’ after their adoption in Luxembourg earlier this year, offer guidance on how to navigate the complex lexicon of terms commonly used relating to sexual exploitation and sexual abuse of children. They aim to build consensus on key concepts in order to strengthen data collection and cooperation across agencies, sectors and countries.
The Technology Working Group was charged with examining the role of technology in combatting the proliferation of online child sexual exploitation and abuse imagery. A focus of this report is the need to dismantle the chief technical, legal and policy silos that frustrate real collaboration among law enforcement, industry, government and the non-government sector.
Child Sexual Abuse Material (CSAM) is the evidence of a crime that has taken place. Investigators and victim identification specialists use this evidence to work out information about th (...)
On September 2nd, Hotline.ie, the Irish national centre combatting illegal content online (est. 1999), launched a new online service and reporting portal aimed at helping young peopl (...)
Peer-to-peer networks, commonly referred to as P2P networks, are networks of computers in which resources can be shared without requiring a separate server computer. All computers in th (...)
In June 2021, the Australian Government’s Parliamentary Joint Committee on Law Enforcement agreed to inquire into, and report on, law enforcement capabilities in relation to child expl (...)
Within the framework of its Digital Citizenship Programme, Telefono Azzurro will shortly present the results of new research carried out in cooperation with Doxa Kids on children’s rig (...)
The Film and Publication Board (FPB), content regulator in South Africa, notched up another successful conviction of a Child Sexual Abuse Material suspect. Working with law enforcement, (...)
Law Enforcement has more time to focus on victim identification and rescue when they use AviaTor, a tool that prioritises NCMEC reports and streamlines the process for o (...)
The ESCAPE project, funded by End Violence Against Children (EVAC) represents INHOPE's vision to put an eco-system in place ensuring that every industry stakeholder and each member of th (...)
Launched in 2015 by the WePROTECT Global Alliance, the Model National Response provides countries with a theoretical framework for coordinating a national response to online child sexual (...)
In our recently released Annual Report, we stated that websites have replaced image hosting sites as the site type on which INHOPE member hotlines most commonly find Child Sexual Abuse M (...)