Facebook fails to label 42 percent of debunked political misinformation

Occurred: October 2020

Facebook failed to stem the tide of misinformation accompanying the 2020 US Presidential election, according to researchers.

Researchers at online advocacy group Avaaz discovered that Facebook failed to label 42 percent of debunked political misinformation, thereby helping the spread of false narratives during the 2020 election period.

The researchers calculated that Facebook could have prevented approximately 10.1 billion views on pages that disseminated misinformation had it implemented algorithm changes earlier. 

The top 100 pages identified as "repeat misinformers" shared an average of eight confirmed pieces of misinformation each, with many refusing to correct their posts after being flagged by fact-checkers.

Facebook only intensified its efforts to combat misinformation in the weeks leading up to the election. Before that, the platform allowed substantial engagement with misleading content, including posts that were later found to be false but were not adequately addressed by Facebook's AI-powered content moderation system.

Facebook disputed Avaaz's methodology, claiming it misrepresented the company's actions against misinformation. A spokesperson argued that the report failed to consider the context of individual pages and their content comprehensively. 

Nonetheless, the report was seen to highlight broader concerns about Facebook's role in shaping public discourse and its responsibility for mitigating the spread of false information that can influence democratic processes.

Facebook content management controversies

Facebook (now Meta Platforms) has been criticized for its management of content across posts, photos, and entire groups and profiles. This includes, but is not limited to, allowing violent content (including content related to war crimes), failing to limit the spread of fake news and COVID-19 misinformation on its platform, and allowing the incitement of violence against multiple groups.

Source: Wikipedia

System

Operator:
Developer: Facebook
Country: Global
Sector: Politics
Purpose: Moderate content
Technology: Content moderation system; Machine learning
Issue: Accuracy/reliability; Effectiveness/value