
Understanding the intentions of Child Sexual Abuse Material (CSAM) sharers on Facebook

Whenever Child Sexual Abuse Material (CSAM) is shared, the child depicted in it is revictimised. The reasons an individual shares CSAM don't change this. However, by recognising that people share CSAM for many different reasons, we stand a better chance of being able to stop them.

Facebook, a valued partner of INHOPE since 2015, has conducted research into the different reasons people share CSAM on their platform, in order to better tailor how they respond and to develop more targeted interventions for prevention. They also hope this research will allow them to provide additional context in the reports they make to NCMEC and law enforcement, enabling more effective triaging of cases and the quick identification of children who are currently being abused.

Draft Taxonomy of Intent

Their research involved consultations with INHOPE member hotline the National Center for Missing and Exploited Children (NCMEC) and with Professor Ethel Quayle. They also drew heavily on existing research in the field, such as Ken Lanning's 2010 work, and on what their own child safety investigative teams see on their apps.

These insights have been used to develop a draft taxonomy of intent, broken down into malicious and non-malicious users.

Malicious users are those whom Facebook believe intended to harm children with their behaviour. They include:

  • Preferential Offenders: who are motivated by an exclusive sexual interest in children (i.e. paedophiles/hebephiles).
  • Commercial Offenders: who facilitate child sexual abuse for financial gain.
  • Situational Offenders: who take advantage of opportunities to engage with CSAM and minors, and who may have a range of paraphilic desires.

Non-malicious users are those whose behaviour is problematic and potentially harmful, but whom Facebook believe did not intend to cause harm to children. They include:

  • Unintentional Offenders: who share CSAM out of humour, outrage, or ignorance.
  • Minor Non-Exploitative Users: who are minors themselves engaging in developmentally normative behaviour that is not inherently exploitative.
  • Situational “Risky” Offenders: who habitually consume and share adult sexual content, and who come into contact with and share CSAM as part of this behaviour, potentially without awareness of the age of subjects in the imagery.

In publishing this taxonomy, Facebook stress that “[w]hile this study represents our best understanding, these findings should not be considered a precise measure of the child safety ecosystem”.

Application

Facebook evaluated 150 accounts that they reported to NCMEC for uploading CSAM in July and August of 2020 and in January 2021, and estimated that more than 75% of these did not exhibit malicious intent (i.e. did not intend to harm a child), but appeared to share for other reasons, such as outrage or poor humour.

“Our work is now to develop technology that will apply the intent taxonomy to our data at scale. We believe understanding the intentions of sharers of CSAM will help us to effectively target messaging, interventions, and enforcement to drive down the sharing of CSAM overall.” - John Buckley, Malia Andrus, Chris Williams

Read about what another of INHOPE's partners, Microsoft, is doing to tackle Child Sexual Abuse Material here.

18.03.2021 - by INHOPE