INHOPE - Association of Internet Hotline Providers | Webinar Recap: Proactive safety for gaming communities - intelligence, investigations and everything in between
Article
Events & Campaigns

Webinar Recap: Proactive safety for gaming communities - intelligence, investigations and everything in between

Diba Arjmandi, Child Exploitation Intelligence Lead at Roblox, and Danielle Williams, Lead Subject Matter Expert for Child Safety at Resolver Trust & Safety, joined INHOPE to explore how intelligence and investigations play a growing role in protecting children on gaming platforms and beyond.

Gaming environments are fast-moving, interactive, and often live, making moderation and risk detection particularly complex. Yet, these same challenges have driven gaming platforms to lead the way in trust and safety, continuously innovating and adapting to safeguard their users.

Moderation & Intelligence

While moderation focuses on enforcing clear policy violations, intelligence work goes deeper by analysing context, tone, and intent to detect emerging risks that might not yet be covered by guidelines. On Roblox, where image sharing is restricted, the safety team watches for behavioural signals such as shifts in language, spikes in moderation activity, or unusual patterns to identify harm. This type of nuanced analysis makes human review indispensable, Diba explained.

"While automation offers significant support, human review remains essential, particularly in these very nuanced and borderline cases requiring contextual understanding and judgment." - Diba Arjmandi

Together, moderation and intelligence teams work in tandem. Intelligence often uncovers early warning signs that moderation alone might miss, helping platforms to respond more effectively.

Identifying Risks Beyond the Platform

Many risks do not originate on the platform itself but stem from wider culture, trends, and other apps. Emojis, slang, and in-game behaviours evolve rapidly, and what was once an innocuous comment or symbol can quickly develop into a possible signal of abuse.

"We can't assume we always know how kids are communicating. That's why intelligence gathering should happen in different stages." - Diba Arjmandi

This continuous adaptation helps platforms stay ahead of new forms of harm and identify potential abuse before it happens.

Finding the Unknown Unknown

Intelligence is no longer only used by law enforcement but is now central to how platforms protect their users. Danielle Williams, who has a background in policing, highlighted how intelligence functions differently in tech environments. The biggest difference is scale, Danielle shared. Even a small improvement in platform safety can benefit every child worldwide.

“If you’ve made a platform just a little bit safer, you’ve done that for every child in the world.” - Danielle Williams

Rather than reacting to harm after it happens, intelligence enables platforms to anticipate risks and build safer systems from the start. Diba shared that her team collaborates closely with engineers, policymakers, and legal experts to turn insights into practical actions. These may include refining policies, improving moderation tools, or flagging new risks for further review.

“Finding the unknown unknown is an ongoing process of vigilance and adaptation. It requires a combination of technological sophistication, behavioural analysis, and, of course, a commitment to continuous learning and improvement.” - Diba Arjmandi

Practical Advice for Trust and Safety Teams

  • Balance proactive and reactive approaches: use known reports to guide deeper investigations and look ahead.
  • Monitor signals beyond the platform: online harm does not happen in isolation, so pay attention to trends in pop culture, news and social media.
  • Work across teams: intelligence is most effective when shared with engineering, legal, moderation and policy groups.
  • Stay flexible: watch for changes in language, behaviour and search trends, especially where patterns shift rapidly.
  • Share knowledge beyond your platform: partner with NGOs, law enforcement and industry peers to strengthen the wider safety ecosystem.

Why Collaboration is Key

Both speakers agreed that no single team or company can tackle these challenges alone. Protecting children online requires sharing information not only across internal departments but also across the wider ecosystem. That includes NGOs, law enforcement, academia, and industry partners.

"There's so many inspiring people around us who bring such unique, amazing sets of skills. Everyone here actually wants to make a difference in someone else's life." - Diba Arjmandi


Missed this webinar? Watch the full recording in our Webinar Playlist. Save the date for our last webinar and check out the full schedule.

28.05.2025