Hotline & Network Updates
Privacy, prevention, and protection using technology solutions
Non-Consensual Intimate Image (NCII) abuse can be perpetrated by anyone and affect anyone, irrespective of age or gender. For young children and teenagers in particular, the impact can be devastating. Our focus is therefore to raise awareness of the solutions available: taking action is essential to protecting yourself and your peers online.
It is important to remember that the victim of NCII abuse is never to blame for the sharing of the sexually explicit content, and once it has happened the focus must be placed on what can be done to get that content removed.
Resources & Tools for NCII removal
Below is a list of resources and tools available to help victims get the content removed as rapidly as possible:
- The National Center for Missing and Exploited Children (NCMEC) provides "Take It Down", a free service that can help you remove or stop the online sharing of nude, partially nude, or sexually explicit images or videos of minors under the age of eighteen - click here to report in the United States.
- Childline and the Internet Watch Foundation's Report Remove tool allows young people to report a nude image or video shared online and have it removed. This tool provides a child-centred approach to image removal and can be completed easily online - click here to report in the United Kingdom.
- StopNCII.org, developed by the Revenge Porn Helpline (RPH), is a free tool designed to support victims of Non-Consensual Intimate Image (NCII) abuse. The tool generates a hash value from an intimate image; duplicate copies of the image all have the exact same hash value. By sharing the hash value with participating companies, the tool helps them detect matching images and prevent them from being posted or shared online (see the short sketch after this list for how hash matching works) - click here to report worldwide.
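To make the hash-matching idea concrete, below is a minimal Python sketch using a standard cryptographic hash (SHA-256). This is an illustration only: the file names are hypothetical, and StopNCII's production system uses its own hashing technology (which, unlike a simple cryptographic hash, is designed to also match visually similar copies), so the exact mechanics differ.

```python
import hashlib

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large image/video files need not fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Byte-identical copies of a file always produce the same digest, so a
# platform can compare a reported hash against uploads without ever
# receiving or viewing the image itself. (File names are hypothetical.)
if file_hash("reported_image.jpg") == file_hash("uploaded_copy.jpg"):
    print("Match found - the upload can be blocked or removed.")
```

The key privacy property of this design is that only the hash value is shared with participating companies; the intimate image itself never has to leave the victim's device.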
New tech, new legislation, and a new perspective!
Firstly, when we consider what the online safety space needs, we must prioritise the kind of technology that directly benefits users. For example, StopNCII is currently implemented by Facebook, Instagram, TikTok, and Bumble. RPH have successfully removed over 200,000 individual non-consensual intimate images from the internet - a removal rate of 90%, which demonstrates the value of giving victims of NCII abuse access to this type of technology. All online platforms that facilitate media sharing should consider how they address NCII abuse and understand the negative impact it can have on their users.
Secondly, let's address the discrepancies across EU Member States, which you can read about in our legislative review of EU hotlines, and how the legality of CSAM depends on content and context. In fact, most Member States do not have a legal definition of 'sexualised', and the legality of 'sexualised' images of children depends largely on the context in which the image was taken, the content itself, and in some cases the creator's intended use of those images. This lack of legal coverage may pose a challenge when trying to combat NCII abuse.
And finally, regardless of whether we call it self-generated Child Sexual Abuse Material (CSAM) or Non-Consensual Intimate Image (NCII) abuse, this type of child sexual abuse and exploitation material is a societal issue of extreme concern. Inaction in this space can no longer be blamed on a lack of understanding: research is now available showcasing the impact of NCII abuse on victims (ECPAT, 2021), the detrimental effect on their mental health, and sometimes even their financial health. Victims live in constant fear of the content being shared among their network, and face the risk of revictimisation each time the content is viewed. NCII abuse can cause extreme feelings of shame, anxiety, and helplessness, which in turn perpetuates the cycle of abuse.