The decision comes several months after the platform’s CEO was arrested in France over alleged shortcomings in content moderation.
Messaging platform Telegram has opted to join the UK’s Internet Watch Foundation (IWF) to proactively prevent child sexual abuse material (CSAM) on its app.
The IWF said it has granted the platform membership, giving Telegram access to the organisation’s datasets and technology to help tackle CSAM on its service.
Derek Ray-Hill, interim CEO at the IWF, said the move is a “transformational first step” for Telegram on what will be a much longer journey.
“Child sexual abuse imagery is a horror that blights our world wherever it exists. The children in these images and videos matter. I want to be able to say to every single victim that we will stop at nothing to prevent the images and videos of their suffering being spread online,” he said.
“Now, by joining the IWF, Telegram can begin deploying our world-leading tools to help make sure this material cannot be shared on the service. It is an important moment, and we will be working hard with Telegram to make sure this commitment continues and expands to the whole sector.”
As part of the agreement, Telegram will now use a range of IWF services, including IWF ‘hashes’ – unique digital fingerprints of millions of known child sexual abuse images and videos – to instantly spot when this criminal content is being shared in public parts of the service.
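Hash-matching of this kind can be sketched, in highly simplified form, as looking up a file’s digital fingerprint in a set of known hashes. This is only an illustration: the hash values, names and set below are hypothetical, and production tools such as the IWF’s rely on perceptual hashing, which matches visually similar images rather than exact file bytes.

```python
import hashlib

# Hypothetical stand-in for a dataset of known-image fingerprints.
# (This value is the SHA-256 digest of empty input, used purely for demo.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as the file's unique fingerprint."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Flag an upload whose fingerprint appears in the known-hash set."""
    return fingerprint(data) in KNOWN_HASHES

print(is_known(b""))       # True  (matches the seeded demo hash)
print(is_known(b"photo"))  # False (fingerprint not in the set)
```

The set lookup is what makes this approach fast enough to run on every upload: checking a fingerprint against millions of known hashes is a constant-time operation.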
The IWF will also report directly to Telegram when CSAM is detected and work with the platform to remove it swiftly.
Telegram will also deploy tools to block non-photographic depictions of child sexual abuse, including known AI child sexual abuse imagery, as well as links to webpages known to be harbouring CSAM.
Remi Vaughn, head of press and media relations at Telegram, said the platform currently relies on reports and proactive moderation, which includes AI, machine learning and hash-matching.
“The IWF’s datasets and tools will strengthen the mechanisms Telegram has in place to protect its public platform and further ensure that Telegram can continue to effectively delete child abuse materials before they can reach any users.”
Telegram has been criticised in the past for taking a ‘hands-off’ approach to content moderation, and has been accused of creating a space for illegal and extremist content to spread.
The IWF has previously confirmed thousands of reports of CSAM on Telegram since 2022 – including category A imagery, the most severe kind of child sexual abuse material. The group noted that when it reported this content to Telegram, the platform removed it.
However, the app’s approach to content moderation led to the arrest of its CEO, Pavel Durov, in France earlier this year. At the time, French authorities accused Durov of failing to take steps to stop criminal activity on the messaging app, including CSAM.
Prosecutor Laure Beccuau said the arrest was part of a broader investigation that was opened on 8 July against an unnamed person for charges of “complicity” in various criminal actions, including the possession and distribution of CSAM.
Following a release on bail, the Telegram CEO said the arrest was misguided but added that the platform would take more action against illegal content, which he claimed comes from a tiny proportion of its 950m users.