Mastodon is full of child abuse material, study claims

25 Jul 2023

Image: © cryptoFX/Stock.adobe.com

Researchers at the Stanford Internet Observatory claim decentralised platforms such as Mastodon have difficulty detecting and reporting CSAM.

The social media platform Mastodon is being used to spread massive amounts of child sexual abuse material (CSAM), according to a new study.

The analysis by the Stanford Internet Observatory (SIO) claims that 112 matches of known CSAM were found on Mastodon over a two-day period. The investigation also found nearly 2,000 posts using hashtags commonly associated with sharing this type of content.

This content was found by looking through the local public timelines of the top 25 accessible Mastodon instances, according to the researchers.

Decentralised social media platforms like Mastodon have been hailed by supporters as an alternative to massive, centralised sites like Twitter and Facebook.

These decentralised platforms – collectively known as the Fediverse – have certain advantages such as greater privacy and more control for users. But the Stanford researchers said these sites also present issues around content moderation.

Centralised platforms such as Facebook and TikTok employ staff to moderate content and prevent the spread of harmful and illegal material. Mastodon, by contrast, is made up of individual servers, each with its own code of conduct, terms of service, privacy options and moderation policies.

The Stanford researchers said these platforms typically rely on volunteer administrators to moderate content and enforce guidelines, which can make certain types of content harder to tackle.

“There are few technical measures available or dedicated experts to set rules and handle content moderation in the Fediverse for imagery of violence or self-harm, child abuse, hate speech, terrorist propaganda or misinformation,” the researchers said.

“Bad actors tend to go to the platform with the most lax moderation and enforcement policies. This means that decentralised networks, in which some instances have limited resources or choose not to act, may struggle with detecting or mitigating [CSAM].”

The study lists several recommendations to improve the reporting of CSAM on Fediverse sites like Mastodon, such as better moderation tools and collectively blocking certain hashtags and keywords that are used to spread this content.

Larger social media sites still have issues with CSAM, however. Last month, a report by the SIO listed Instagram as the “primary platform” for the spread of CSAM online. That report also claimed Twitter had an “apparent and now resolved regression” that briefly allowed CSAM to be posted on public profiles.

In March, Institute for Strategic Dialogue (ISD) senior analyst Ciarán O’Connor spoke about how the Fediverse is used to store and spread extremist content, and the difficulties in tackling it.


Leigh Mc Gowran is a journalist with Silicon Republic

editorial@siliconrepublic.com