Study: Instagram enables global paedophile network 

Occurred: June 2023

Instagram's recommendation system has been actively facilitating the spread and sale of child sexual abuse material (CSAM), according to an investigation by the Wall Street Journal and researchers at the Stanford Internet Observatory and the University of Massachusetts Amherst.

Accounts discovered by the researchers were advertised using hashtags such as #pedowhore and #pedobait, and directed users to 'menus' of content from which they could buy videos and images, including material depicting self-harm and bestiality.

The researchers estimated that the seller network comprised between 500 and 1,000 accounts at any one time, and that sellers communicated through Instagram's direct messaging function.

Meta said it would set up a new task force to investigate and address the issues raised by the investigation. An April 2023 Guardian investigation had documented how Meta was struggling to prevent paedophiles and others from using its platforms to buy and sell children for sex.

Operator: Meta/Instagram
Developer: Meta/Instagram

Country: USA; Global

Sector: Media/entertainment/sports/arts

Purpose: Recommend content

Technology: Recommendation algorithm
Issue: Safety

Transparency: Governance; Black box

Page info
Type: Incident
Published: June 2023