Facebook Cross-check VIP whitelisting
Cross-check (or 'XCheck') is a secretive system run by Facebook that double checks the posts of over 5 million 'VIP' users, including Donald Trump, Elizabeth Warren, and Brazilian footballer Neymar.
According to Meta VP Nick Clegg, the cross-check system aims to prevent potential over-enforcement ('when we take action on content or accounts that don’t actually violate our policies') and to double-check cases where there could be a higher risk for a mistake or when the potential impact of a mistake is especially severe.
A September 2021 Wall Street Journal investigation found that people added to the cross-check list were permitted to post rule-breaking content, including harassment, incitement to violence, and misinformation and disinformation, and were effectively immune from Facebook enforcement actions.
In December 2022, Facebook’s Oversight Board criticised Facebook owner Meta for the programme’s undue deference to cross-check users, its understaffing, opacity, and unfairness. It found that 'Cross-check is currently neither designed nor implemented in a manner that meets Meta’s human rights responsibilities and company values.'
The board also argued the programme 'put Meta’s business interests over the program’s stated goals of protecting public discourse.' It went on to say that 'despite significant public concern about the program, Meta has not effectively addressed problematic components of its system.'
A confidential 2019 internal review of Facebook’s whitelisting practices uncovered by the WSJ investigation stated 'We are not actually doing what we say we do publicly', and went on to call cross-check 'a breach of trust'.
According to the Oversight Board, 'Meta has repeatedly told the board and the public that the same set of policies apply to all users. Such statements and the public-facing content policies are misleading.'
'Meta does not inform users that they are on cross-check lists and does not publicly share its procedures for creating and auditing these lists. It is unclear, for example, whether entities that continuously post violating content are kept on cross-check lists based on their profile. This lack of transparency impedes the Board and the public from understanding the full consequences of the programme', the board observed.
It recommended that Meta 'radically increase transparency around cross-check and how it operates. Meta should measure, audit and publish key metrics around its cross-check programme so that it can tell whether the programme is working effectively.'
In March 2023, Meta agreed to the Oversight Board's recommendation to publish regular transparency reports on cross-check and to limit the distribution of content from high-profile individuals that likely violated platform rules until their posts had been adjudicated.
Purpose: Moderate content
Technology: Content moderation system
Issue: Governance; Fairness
Transparency: Governance; Complaints/appeals; Marketing
Meta (2022, updated 2023). Oversight Board Selects a PAO on Meta's Cross-check Policies
Meta (2022). Reviewing high-impact content accurately via our cross-check system
Meta (2021). Requesting Oversight Board Guidance on Our Cross-Check System
Investigations, assessments, audits
Oversight Board (2022). Policy Advisory Opinion: Meta's Cross-check programme
Oversight Board (2022). Full Policy Advisory Opinion on Meta's Cross-check programme (pdf)
Oversight Board (2021). Oversight Board demands more transparency from Facebook
Oversight Board (2021). Oversight Board opens public comments for policy advisory opinion on cross-check
News, commentary, analysis
Published: September 2021
Last updated: March 2023