Snapchat My AI 'goes rogue' by posting its own story

Occurred: August 2023

Snapchat's My AI chatbot posted its own story on the platform and refused to interact with users, prompting users to express concerns and fears that the system was sentient, learning from itself, and making its own decisions.

My AI posted an unintelligible, two-toned image to Snapchat's Stories feature that some users mistook for a photo of their own wall or ceiling. However, Snap later confirmed the incident was caused by a software glitch that had since been fixed.

In April 2023, a test run by the US-based Center for Humane Technology and verified by The Washington Post found that Snapchat's My AI would provide inappropriate advice in response to messages from minors.

Incident databank 🔢

Operator: Snapchat users
Developer: Snap Inc

Country: USA

Sector: Media/entertainment/sports/arts

Purpose: Generate text

Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning

Issue: Robustness

Transparency: Governance; Black box; Privacy

System 🤖

News, commentary, analysis 🗞️

Page info
Type: Incident
Published: September 2023
Last updated: November 2023