Snapchat AI chatbot provides bad advice about underage drinking

Occurred: March 2023

Snapchat’s My AI chatbot was criticised for giving underage users advice about drinking alcohol and using illegal substances, raising concerns about the bot’s safety.

The Washington Post conducted a test in which a reporter changed their Snapchat profile information to that of a 15-year-old and asked about underage drinking. The AI obliged, sharing ways to hide the smell of alcohol and pot.

In a subsequent test, the AI declined to provide such advice when the user’s age was set to 15, indicating the issue had been addressed. Despite this improvement, concerns remained about the AI’s ability to handle sensitive topics appropriately, especially given Snapchat’s large teen user base.

In October 2023, the UK’s data watchdog, the Information Commissioner’s Office (ICO), expressed concerns about the potential privacy risks the AI posed to users, particularly those aged 13 to 17.

Operator: Snap Inc
Developer: Snap Inc
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Interact; Provide information; Support users
Technology: Chatbot
Issue: Safety
Transparency: Governance; Black box; Privacy; Marketing