Instagram Reels discovered to recommend child-sexualising videos
Occurred: November 2023
Instagram's Reels video service served 'explicit' and 'risqué footage' of children to followers of teen and pre-teen influencers, according to an experiment conducted by The Wall Street Journal.
By setting up test accounts that followed young gymnasts, cheerleaders, and influencers, WSJ journalists found that Reels 'served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos.'
The Journal also found that the footage was mixed in with ads for companies including Disney, Walmart, Pizza Hut, Bumble, and Match Group. Several of these companies said they had suspended their advertising campaigns in the wake of the publication's exposé.
Meta responded by telling its advertising clients that it was investigating and that it 'would pay for brand-safety auditing services to determine how often a company's ads appear beside content it considers unacceptable.'
Instagram content recommendation system
Operator: Wall Street Journal
Developer: Meta/Instagram
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation algorithm; Machine learning
Issue: Safety
Page info
Type: Incident
Published: December 2023