The move will also undo Meta’s restrictions on political content on its platforms.
Starting in the US, Meta is ending third-party fact-checking on its platforms, opting instead for a community notes model similar to X's.
The move, which will be phased in over the next few months, will remove guardrails provided by the platform, instead allowing users to write and rate community notes.
Meta platforms have long been riddled with misinformation and disinformation – even with fact-checking measures in place. However, the Facebook and Instagram owner, in its announcement yesterday (7 January), argued that “too much” content that “people would understand to be legitimate political speech and debate” is being fact-checked on its platforms.
Once the changes take effect, Meta will retire its fact-checking controls, stop demoting fact-checked content and stop displaying full-screen warnings that must be clicked through before a disputed post can be viewed. Instead, according to the announcement, the platform will use “much less obtrusive” labelling to indicate that additional information is available.
In addition to removing third-party fact-checking, Meta said it will undo earlier changes that reduced the amount of political content shown on its platforms.
The platform’s founder and CEO Mark Zuckerberg pointed to the US election as a major influence on the company’s decision and criticised “governments and legacy media” for allegedly pushing “to censor more and more”.
Meta claims that its fact-checking has “gone too far”. According to the platform, it removed “millions” of posts daily in December last year alone, and it argued that 10 to 20pc of these removals may have been mistakes – posts that didn’t actually violate its policies.
The platform introduced third-party fact-checking in 2016 as a reaction to increased misinformation and disinformation on social media, coinciding with Donald Trump’s first presidential win in the US. As part of the programme, Meta funds fact-checking partners worldwide, including TheJournal.ie in Ireland.
Moreover, following the 6 January 2021 Capitol Hill riots in Washington, Trump was banned from Meta’s platforms, a move that angered him for years, though he was reinstated in 2023.
Late last year, Zuckerberg dined with the president-elect at his Mar-a-Lago estate, and weeks later donated $1m to his inauguration fund.
When asked by reporters, Trump said that Meta’s abrupt shift in its fact-checking policies was “probably” motivated by his threats against Zuckerberg, which began after he was banned from Zuckerberg’s platforms.
Forrester’s principal analyst Kelsey Chickering said that the move might backfire for Meta. “Consumers already think social media platforms are riddled with fake news. If these policy changes result in platform experiences riddled with spam and hateful content, consumers might spend their time elsewhere.”
However, she said that unlike X, which lost advertisers after sweeping changes to content moderation were made by its new owner Elon Musk, Meta is a “much stronger paid media platform”.
“It offers unprecedented scale to advertisers … While it was fairly easy for many advertisers to say goodbye to X, the same won’t be true for Meta.”
Alongside the removal of content moderation guardrails, the social media giant last week announced the appointment of Joel Kaplan, a prominent Republican, as the company’s global affairs VP, replacing Nick Clegg, as well as the addition of Ultimate Fighting Championship (UFC) CEO Dana White and Exor CEO John Elkann to its board of directors.