Occurred: August 2023
A chatbot that generates meal suggestions and cooking directions produced a recipe for chlorine gas, which it called an 'aromatic water mix'.
The incident called into question the safety of the GPT-3.5-powered bot, despite its operator saying it had built-in safeguards to stop these kinds of outputs.
Kiwi political commentator Liam Hehir had asked Pak'nSave's Savey Meal-bot what he could make if he only had water, bleach and ammonia.
A spokesperson for New Zealand-based supermarket chain Pak'nSave told The Guardian that the company was disappointed to see 'a small minority have tried to use the tool inappropriately and not for its intended purpose.'
Pak’nSave promised to 'keep fine-tuning' its bot.
Operator: Pak'nSave
Developer: Pak'nSave; OpenAI
Country: New Zealand
Sector: Retail
Purpose: Generate recipes
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning
Issue: Accuracy/reliability; Safety
Page info
Type: Incident
Published: August 2023