Facebook Cross-check criticised as unfair, under-resourced and opaque

Occurred: December 2022

Facebook's Cross-check system was taken to task by Meta's Oversight Board for unfairly subjecting VIP users to less onerous rules than everyone else, and for managing the programme in an under-resourced and opaque manner.

The Oversight Board found that the Cross-check system resulted in users being treated unequally, that it led to substantial delays in taking down rule-violating content (on average, decisions took more than 5 days, with some cases taking up to 7 months), and that it appeared to be structured more to satisfy the company's commercial objectives than to advance Meta's human rights commitments.

It also found that while only 9 percent of Facebook's daily active users are in the US and Canada, 42 percent of content reviewed under Cross-check came from those two countries, and that Facebook had failed to provide crucial details about Cross-check to the Oversight Board, including the criteria for adding accounts to the system.

The Oversight Board made 32 recommendations to improve Cross-check, including developing clearer criteria for account eligibility and making them public, allowing individuals to apply for the programme proactively, visually communicating an account's Cross-check status to users, increasing resources to ensure the timely review of flagged content, and prioritising posts important for human rights or of special public importance.

March 2023. Meta agreed to the Oversight Board's recommendations to publish regular transparency reports on Cross-check and to limit the distribution of content from high-profile individuals that likely violated platform rules until their posts had been adjudicated.
