
Patreon's notice & takedown procedure

Notice and Takedown procedures are an integral part of digital safety. The ways in which platforms treat reported content directly affect how often users flag illegal and inappropriate material. Ensuring a safe and positive reporting experience is critical in combating harmful material online.

Patreon is one of the many platforms that have partnered with INHOPE in a combined effort to eradicate CSAM. We talked to Liliana Ferreira, Patreon's Senior Content Moderation Specialist, to learn more about the way they handle inappropriate and abusive material on their website.

On Patreon, just like on most other social media and membership platforms, users can report inappropriate or illegal content for review. We asked Liliana about the internal process that follows a report.

What happens after a user makes a report?

"Once a creator is reported, we go through a thorough review process of the creator page and any socials connected or funded through Patreon. If the content that the creator is sharing is deemed inappropriate we will proceed with either working with the creator to get the content removed or, in case the content is identified as child sexual abuse material (CSAM), we will immediately remove the creator and report it to NCMEC." - Liliana Ferreira

How long does this process normally take?

"The Content Moderation team has a resolution and response time that differs from policy to policy based on the urgency of the matter. When it comes to Minor Safety, we prioritize those cases above others. We try to get through the cases as fast as possible, with a turnaround time of 12 hours. We have teammates working in both Europe and the USA, which means that both time zones are covered and those cases are handled fastest." - Liliana Ferreira

Most platforms work with a combination of AI and human trust and safety (T&S) agents to detect illegal content on their websites. What makes Patreon's approach special is that every piece of reported content is reviewed by a human moderator.

"What differentiates us is that all of these reports are reviewed by a human moderator who has the final decision on whether we should remove the content and/or the creator and then communicates directly with law enforcement if necessary." - Liliana Ferreira

While this helps ensure a high-quality takedown procedure, reviewing illegal content is a very taxing task for content moderators. To protect moderators, only dedicated, trained specialists are responsible for reviewing material flagged for Minor Safety or CSAM. Furthermore, Patreon offers all content moderators monthly wellness sessions.

Do you inform users when their content is reported?

"The users are always alerted if they are violating our Community Guidelines and told exactly which guideline is violated by what content. We always reach out to the user directly via the email registered to their Patreon page. All users who got suspended will see a banner at the top of their page when they attempt to log in." - Liliana Ferreira

One of the bigger challenges in keeping platforms safe is making sure that flagged users do not create backup accounts. We asked Liliana Ferreira how Patreon ensures that the same offenders do not return to its platform.

How does Patreon ensure that flagged users do not come back?

"Once a user is removed the only way to come back to our platform is through a removal appeal. As we do not allow users that violated our Minor Safety policies to come back to our platform, we have tools that allow us to identify new accounts being created by that same user. We also monitor creators that might represent a future risk by keeping track of new content uploads and the direction of the project." - Liliana Ferreira

While Patreon has a system in place to scan users for previous violations, it cannot prevent offenders from creating accounts on other platforms. This is why it is important to encourage tech companies to share their data and resources with each other. Patreon is one of the companies already participating, partnering with organisations like INHOPE and NCMEC. You can read about their take on the importance of cross-platform sharing here.

As the internet constantly evolves, so does illegal content. Laws and regulations often cannot keep up with the speed of the changing digital environment, which is why it is important that social platforms constantly update their policies based on the newest developments.

How do you deal with grey zones?

"When we are moderating content, we are always conscious that we are working with creators who depend on transparency and fairness for their livelihoods. We are constantly monitoring for issues and content that may lead to people being put in harm’s way and working with our policy and legal teams and creators to make sure our policies serve the community. The policies are always evolving in collaboration with creators, and we pay close attention to new trends. By keeping our policies updated we can act upon content that violates our guidelines. Once a report is received, we review the page not only for what was reported but for other possible violations as well." - Liliana Ferreira

User reports are a crucial part of holding platforms accountable for reviewing and removing inappropriate and illegal content. This is why it is important that platforms make their notice and takedown procedure as transparent as possible. Users need to be able to flag content quickly and easily, and they need to be informed about the status of their report.

Find out more about the process of getting CSAM removed here.

If you come across inappropriate content or CSAM on a social or membership platform, please do your part by reporting it as quickly as possible.

Patreon's notice & takedown procedure
03.06.2022
Photo by INHOPE
'

As we do not allow users that violated our Minor Safety policies to come back to our platform, we have tools that allow us to identify new accounts being created by that same user

'