Snapchat AI chatbot provides bad advice about underage drinking
Occurred: March 2023
Snapchat’s My AI chatbot was criticised for providing advice about drinking and illegal substance use to underage users, raising concerns about the bot’s safety.
The Washington Post tested the bot by changing a reporter’s Snapchat profile to that of a 15-year-old and asking about underage drinking. The AI responded positively, suggesting ways to hide the smell of alcohol and pot.
In a subsequent test with the user’s age again set to 15, the AI declined to provide such advice, indicating the issue had been addressed. Despite this improvement, concerns remained about the AI’s ability to handle sensitive topics appropriately, especially given Snapchat’s large teenage user base.
In October 2023, the UK’s data watchdog, the Information Commissioner’s Office (ICO), expressed concerns about the potential privacy risks the AI posed to users, particularly those aged 13 to 17.
System 🤖
Operator: Snap Inc
Developer: Snap Inc
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Interact; Provide information; Support users
Technology: Chatbot
Issue: Safety
Transparency: Governance; Black box; Privacy; Marketing
News, commentary, analysis 🗞️
https://www.washingtonpost.com/technology/2023/03/14/snapchat-myai/
https://www.makeuseof.com/snapchat-my-ai-ethical-security-issues/
https://www.cnbc.com/video/2023/03/14/snachat-chatbot-offers-inappropriate-advice-for-a-minor.html
https://www.fastcompany.com/90865731/snapchat-ai-could-be-creepiest-chatbot-yet
Page info
Type: Incident
Published: April 2024