Facebook fails to manage Christchurch mosque shooting livestreaming
Occurred: March 2019
Facebook came under fire for failing to control the spread of videos of the livestreaming of the 2019 mass shootings in Christchurch, New Zealand.
Brenton Tarrant livestreamed the first shooting, at the city's Al Noor Mosque, on Facebook Live using a head-mounted GoPro camera, producing footage in the style of a first-person shooter video game. The link to the livestream was first posted on 8chan alongside links to Tarrant's manifesto. It was the first successfully livestreamed far-right terror attack.
Following the attack, the video quickly spread across various platforms, including Facebook, YouTube, Reddit and LiveLeak, prompting widespread criticism of the companies for their inability to effectively manage and remove such content.
Facebook reported that it removed approximately 1.5 million copies of the video within the first 24 hours after the attack, with many of these removals occurring automatically at the point of upload.
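Automatic removal at the point of upload typically relies on matching new uploads against fingerprints of known violating files. The sketch below is a deliberately naive illustration, not Facebook's actual system (which reportedly used audio and visual fingerprinting): it uses an exact SHA-256 blocklist, and all names in it are hypothetical. It shows why exact hashing alone is insufficient — a single changed byte, as produced by re-encoding or cropping the video, evades the match, which is why so many edited copies kept appearing.

```python
import hashlib

# Hypothetical sketch of upload-time blocklist matching.
# Exact digests catch only byte-identical copies; real systems use
# perceptual (audio/visual) fingerprints to catch edited variants.

def file_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw upload bytes."""
    return hashlib.sha256(data).hexdigest()

class UploadFilter:
    def __init__(self) -> None:
        self.blocked: set[str] = set()

    def block(self, data: bytes) -> None:
        """Register a known violating file's digest."""
        self.blocked.add(file_digest(data))

    def allow_upload(self, data: bytes) -> bool:
        """Reject uploads whose bytes exactly match a blocked file."""
        return file_digest(data) not in self.blocked

# Usage: an exact copy is blocked, but a trivially edited copy is not.
known_bad = b"\x00fake-video-bytes\x00"
f = UploadFilter()
f.block(known_bad)
print(f.allow_upload(known_bad))          # exact copy is blocked: False
print(f.allow_upload(known_bad + b"x"))   # one changed byte evades it: True
```

The gap between the two results is the core moderation problem the incident exposed: scale of re-uploads, not detection of the original file.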
The graphic nature of the livestream, which depicted the mass murder from a first-person perspective, shocked viewers and caused widespread anxiety and trauma.
Many people, including children, inadvertently witnessed the violence, leading to a collective sense of horror and vulnerability in communities that had previously felt safe, and to the deterioration of community cohesion.
Critics also expressed concerns that amplifying such material could incite further violence. Following the attack, there was a noted increase in hate crimes against Muslim communities in a number of countries. The fear of reprisals or copycat attacks also became a significant concern for many communities, particularly those targeted by the shooter’s ideology.
The incident underscored concerns about how social media platforms handle violent content and the effectiveness of their increasingly AI-powered content moderation systems. Critics argued that Facebook and other firms must improve their ability to detect and remove extremist material and take more responsibility for it on their services.
In response, New Zealand and other countries began considering stricter regulations for content providers, including potential fines and imprisonment for executives who fail to remove violent imagery promptly.
Livestreamed crime
Livestreamed crime is a phenomenon in which people publicly livestream (broadcast video and/or audio in real time) criminal acts on social media platforms such as Twitch or Facebook Live.
Source: Wikipedia 🔗
➕ May 2019. New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron co-hosted the Christchurch Call summit, urging technology companies to enhance their efforts against online extremism. The initiative garnered support from 53 countries and major technology firms, aiming to establish better standards for monitoring and controlling the spread of violent content online.
Facebook content moderation system
Operator: Facebook users
Developer: Facebook
Country: New Zealand
Sector: Religion; Politics
Purpose: Moderate content
Technology: Content moderation system; Machine learning
Issue: Accuracy/reliability; Mis/disinformation
https://www.nzherald.co.nz/business/news/article.cfm?c_id=3&objectid=12217454
https://www.wired.com/story/christchurch-shooter-youtube-radicalization-extremism/
https://www.nytimes.com/2019/03/15/technology/facebook-youtube-christchurch-shooting.html
https://time.com/5589478/facebook-livestream-rules-new-zealand-christchurch-attack/
Page info
Type: Incident
Published: September 2024