EU questions YouTube, Snapchat and TikTok over algorithms

2 Oct 2024


Under the EU Digital Services Act, very large online platforms have an obligation to identify and assess systemic risks linked to their services.

The European Commission today (2 October) requested YouTube, Snapchat and TikTok to share more information on their content recommendation algorithms and the role these systems play in amplifying risks to the platforms’ users.

The platforms need to submit the requested information by 15 November.

Under the EU Digital Services Act (DSA), companies designated as ‘very large online platforms’ – including YouTube, TikTok, Facebook and Snapchat – have an obligation to identify, analyse and assess systemic risks linked to their services, reporting to the Commission for oversight.

The platforms are also obligated to put measures in place to mitigate these risks.

The Commission today asked YouTube and Snapchat to provide detailed information on the parameters their algorithms use to recommend content to users, as well as the role these systems play in amplifying risks related to users’ mental health, the protection of minors, electoral processes and civic discourse.

The Commission also requested information on how these platforms are mitigating the potential influence of their recommender systems on the spread of illegal content like hate speech and the promotion of illegal drugs.

Similarly, the Commission wants TikTok to provide information on the measures it has taken to prevent bad actors from manipulating its service, and on how it is mitigating risks that may be amplified by its recommender system.

Based on the responses provided by the platforms – which are due in less than two months – the European Commission could formally open non-compliance proceedings against the platforms or impose fines of up to 1pc of a company’s total annual income.

YouTube has a history of hosting extremist and harmful content, drawing criticism as a result. While the problem appeared to be curtailed after stricter policies were put in place, research from last year suggested that although YouTube may have addressed algorithm-driven content ‘rabbit holes’, it has been unable to eradicate extremist content and misinformation from its platform.

Earlier this year, the Commission opened formal proceedings against TikTok under the DSA to assess whether the platform had breached rules on the protection of minors and advertising transparency, as well as on the risk management of addictive design and harmful content arising from its recommendation system.


Suhasini Srinivasaragavan is a sci-tech reporter for Silicon Republic

editorial@siliconrepublic.com