Studies suggest Facebook feed may not change our political beliefs

28 Jul 2023


Nick Clegg, president of global affairs at Meta, said the latest ‘groundbreaking’ studies challenge some commonplace assertions about the impact social media has on people.

Four new studies have cast doubt on the assertion that social media platforms such as Facebook and Instagram substantially affect people’s political beliefs.

Published yesterday (27 July) in the leading journals Nature and Science, the studies, conducted by US research institutions with cooperation from Meta, looked at the impact of the algorithms used by Facebook and Instagram on political attitudes and behaviours during the 2020 US elections.

The findings suggest that while exposure to content from like-minded sources on social media is common, reducing its prevalence during the 2020 US presidential election “did not correspondingly reduce polarisation” in beliefs or attitudes, as one study notes.

Another study analysed the news exposure of 208m US Facebook users during the elections and concluded that there is strong ideological segregation, with a “substantial corner of the news ecosystem” consumed almost exclusively by conservatives.

The research found that more than 97pc of the links to news stories rated as false by fact checkers on the apps during the 2020 election drew more conservative readers than liberal readers. Moreover, sources favoured by conservative audiences were found to be more prevalent in Facebook’s news ecosystem than those favoured by liberal audiences.

“Algorithms are extremely influential in terms of what people see on the platform, and in terms of shaping their on-platform experience,” Joshua Tucker, co-director of the Center for Social Media and Politics at New York University, told The Washington Post.

“Despite the fact that we find this big impact in people’s on-platform experience, we find very little impact in changes to people’s attitudes about politics and even people’s self-reported participation around politics.”

The studies are the first in a set of 16 in the pipeline. The research was co-led by Tucker and Prof Talia Jomini Stroud, founder and director of the Center for Media Engagement at the University of Texas at Austin, who together selected 15 additional researchers to collaborate on the project.

Unsurprisingly, Meta hailed the studies as “groundbreaking” and said they add to a “growing body of research” that shows there is little evidence to suggest Facebook and Instagram cause harmful polarisation or have “meaningful effects” on political beliefs.

“Does social media make us more polarised as a society or merely reflect divisions that already exist?” asked Nick Clegg, Meta president of global affairs, in a blog announcing the studies.

“The papers published today include studies of the effects of algorithmic ranking and virality, the prevalence and effects of like-minded information exposure on Facebook, and ideological segregation in exposure to news,” Clegg wrote. “They challenge the now commonplace assertion that the ability to reshare content on social media drives polarisation.”

The studies were published just a day after Facebook surpassed 3bn monthly active users for the first time in Meta’s latest earnings quarter. The company’s family of apps, which includes Instagram, WhatsApp and Threads, had 3.88bn users in the latest quarter – up by 6pc.


Vish Gain was a journalist with Silicon Republic
