Removing online misinformation may do more harm than good, report claims

25 Jan 2022


Scientists at the UK’s Royal Society argue that removing online misinformation may drive people towards ‘harder-to-address corners of the internet’.

As social media companies such as Twitter and Facebook look to crack down on misinformation on their platforms, a leading group of UK scientists has made an unconventional suggestion: that removing online misinformation may do more harm than good.

Events such as the Covid-19 pandemic and the US Capitol riot last January thrust online misinformation into the centre of public debate, with many arguing that misleading or inaccurate information, deliberate or otherwise, should be removed from public platforms.

Last month, Meta removed six disinformation networks from its platforms, including some that propagated false claims and conspiracy theories around the origin of Covid-19.

However, scientists at the Royal Society in the UK have investigated the phenomenon of online misinformation and concluded that “censoring or removing inaccurate, misleading and false content is not a silver bullet and may undermine the scientific process and public trust”.

‘More harm than good’

In a report titled The Online Information Environment, the Royal Society scientists pointed out that there is a risk content removal may “cause more harm than good” by driving misinformation content – and people who may act upon it – towards “harder-to-address corners of the internet”.

While removal is an effective measure against illegal and harmful content such as hate speech, terrorist material and child sexual abuse imagery, the report said there is “little evidence” to support the effectiveness of removing scientific misinformation. Instead, it suggests addressing the amplification of misinformation.

“Clamping down on claims outside the consensus may seem desirable but it can hamper the scientific process and force genuinely malicious content underground,” said Frank Kelly, professor of the mathematics of systems at the University of Cambridge, who chaired the report.

The report featured a survey of people in the UK, which found that “the vast majority of respondents” believe Covid-19 vaccines are safe, that human activity is responsible for the climate crisis and that 5G technology is not harmful – topics that are often the subject of online misinformation.

The majority of respondents also said that the internet has improved their understanding of science, that they are likely to fact-check any suspicious claims they come across online and that they “feel confident to challenge” friends and family on scientific misinformation.

Scientists at the Royal Society also claimed that the existence of echo chambers – social ‘bubbles’ where people only come across content they already agree with – is “less widespread than may be commonly assumed”.

The role of technology

While removing online misinformation may not be a silver bullet, according to the report, technology can play an important role in addressing it – especially when it comes to rapidly detecting harmful content and its origin.

Provenance-enhancing technology, which is used to find information on the origin of online content and how it may have been altered, has been found to “show promise” and is expected to become increasingly important as the misinformation debate rages on.

The report also recommended that social media platforms should establish “ways to allow independent researchers access to data in a privacy compliant and secure manner”, for more transparent and independent assessments of the effectiveness of measures.

“Removing content and driving users away from platforms which engage with scientific authorities risks making this harder, not easier, to achieve,” the report said. “A more nuanced, sustainable and focused approach towards misinformation is needed.”


Vish Gain was a journalist with Silicon Republic

editorial@siliconrepublic.com