The oversight board said the controversial ‘cross-check’ system has led to unequal treatment of users on Facebook and Instagram, while allowing potentially harmful content to stay online for longer.
Meta’s independent oversight board has urged the company to improve content moderation and “radically increase transparency” around its treatment of high-profile users.
It follows a review into the company’s two-tier moderation system, in which content from VIP accounts is flagged in a separate ‘cross-check’ programme.
This review followed a report by the Wall Street Journal in September 2021, which claimed these high-profile users were not subject to the same rules and enforcement actions as ordinary users on Facebook and Instagram.
Meta said the cross-check programme is intended to prevent it from accidentally removing content that does not violate its policies.
Normally, when content is flagged for breaching Meta’s policies, it is removed immediately by Facebook or Instagram’s content review system. When a VIP’s content is flagged for potential rule-breaking, however, it is not removed immediately but is instead kept up until a human can review it.
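In rough code terms, the flow the board describes amounts to a simple two-tier triage. The sketch below is a hypothetical illustration of that logic only, not Meta’s actual system; all names in it (FlaggedPost, cross_check_list, human_review_queue) are invented for the example.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class FlaggedPost:
    post_id: str
    author_id: str
    violates_policy: bool  # outcome of the automated check


# Illustrative stand-ins: a set of cross-checked VIP accounts and
# the human-review backlog the board criticised for delaying decisions.
cross_check_list = {"vip_account_1", "vip_account_2"}
human_review_queue: deque[FlaggedPost] = deque()


def handle_flagged_post(post: FlaggedPost) -> str:
    """Two-tier triage: ordinary users get immediate enforcement,
    cross-checked users wait in a queue for human review."""
    if post.author_id in cross_check_list:
        # VIP content stays online until a human reviews it,
        # however long the backlog grows.
        human_review_queue.append(post)
        return "kept up pending human review"
    # Ordinary users: the automated decision is enforced straight away.
    return "removed immediately" if post.violates_policy else "kept up"
```

As the sketch makes plain, any backlog in the review queue translates directly into extra time online for flagged VIP content, which is the delay the board highlighted.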
The oversight board concluded that this programme appears designed to “satisfy business concerns”, rather than Meta’s claim that it aims to advance “human rights commitments”.
“The board understands that Meta is a business, but by providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm,” the oversight board said.
“We also found that Meta has failed to track data on whether cross-check results in more accurate decisions, and we expressed concern about the lack of transparency around the programme.”
The board said cross-check has led to unequal treatment of users, as ordinary people on these social media sites are “much less likely to have their content reach reviewers” if a mistake has been made.
The board also raised concerns about the delay in removing VIP content, as the cross-check programme has “operated with a backlog which delays decisions”, leaving potentially harmful content online for longer.
To address these issues, the board has made 32 recommendations to Meta on how it can correct its most high-impact errors on Facebook and Instagram through a programme structured “substantially differently”.
These recommendations include increased transparency around the cross-check system, hiding potentially harmful content that is marked for review and prioritising “expression that is important for human rights”.
The independent oversight board was formed as an external review group to hold the company to account on content matters and policies. Its members include legal experts, digital rights experts, academics, a Nobel Peace Prize laureate, journalists and former politicians.