EU investigates Meta for ‘stimulating addiction’ in kids

16 May 2024


The bloc believes Meta ‘may exploit the weaknesses and inexperience of minors’ and cause addictive behaviours that reinforce the ‘rabbit hole effect’ in kids.

The EU is investigating whether Meta platforms Facebook and Instagram have algorithms that stimulate addictive behaviours in children.

In a statement today (16 May), the European Commission said it has launched an investigation into Facebook and Instagram after concerns that the platforms may not create a safe and healthy environment for underage users, especially because of potentially lax age-verification measures.

“We have concerns that Facebook and Instagram may stimulate behavioural addiction and that the methods of age verification that Meta has put in place on their services is not adequate and will now carry on an in-depth investigation. We want to protect young people’s mental and physical health,” said EU commissioner Margrethe Vestager.

“Today we are taking another step to ensure safety for young online users. With the Digital Services Act (DSA) we established rules that can protect minors when they interact online.”

The commission said the latest investigation is based on a preliminary analysis of the risk assessment report sent by Meta in September 2023, as well as the company’s replies to the body’s formal requests for information.

It believes Facebook and Instagram “may exploit the weaknesses and inexperience of minors” and cause addictive behaviour that reinforces a ‘rabbit hole effect’. It also believes Meta’s age-verification systems “may not be reasonable, proportionate and effective”.

“We are not convinced that [Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram,” said EU commissioner Thierry Breton.

“We will now investigate in depth the potential addictive and ‘rabbit hole’ effects of the platforms, the effectiveness of their age verification tools and the level of privacy afforded to minors in the functioning of recommender systems. We are sparing no effort to protect our children.”

Just last month, the commission opened an investigation into TikTok Lite, a scaled-down version of the popular app, following concerns that its Task and Reward feature could cause serious damage to mental health, particularly in children.

The commission also targeted Meta last month when it opened an investigation into the company for potentially violating EU rules through “deceptive advertising and political content” on Facebook and Instagram.

Late last year, former Facebook staffer turned whistleblower Arturo Béjar claimed that the tech giant is aware of the harm teenagers face on its platforms but has failed to act. In January, Meta introduced a new set of policies aimed at safeguarding teen users on its platforms. However, in a response to those updates, Béjar said the platform had not done nearly enough to protect children.


Vish Gain is a journalist with Silicon Republic

editorial@siliconrepublic.com