Microsoft Tay chatbot

Released: March 2016


Microsoft Tay was an AI chatbot designed to mimic the conversational style of a 19-year-old American girl. Microsoft's objective was to improve the model by having it learn from interactions with human users on Twitter.

Twitter users quickly began trolling Tay, prompting the bot to spew tens of thousands of racist, homophobic and antisemitic tweets within hours of its launch.

Microsoft initially deleted Tay's offensive tweets, then suspended the bot's Twitter profile, saying it had suffered a 'coordinated attack by a subset of people' that had 'exploited a vulnerability in Tay.'
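Microsoft never published technical details of the exploited vulnerability, though contemporaneous reporting pointed to Tay's 'repeat after me' feature and to the bot learning from raw, unmoderated user input. Purely as an illustration, the following minimal Python sketch (all names hypothetical; this is not Tay's actual implementation) shows how a bot that stores unfiltered user utterances for reuse can be poisoned by a coordinated group:

import random

class NaiveEchoBot:
    """Toy chatbot that 'learns' by adding every user utterance to its reply pool."""

    def __init__(self):
        self.replies = ["Hello!", "Tell me more."]  # seed phrases

    def learn(self, utterance: str) -> None:
        # No moderation step: anything a user says can become a future reply.
        self.replies.append(utterance)

    def respond(self) -> str:
        return random.choice(self.replies)

bot = NaiveEchoBot()
# A coordinated group of users only needs to flood the bot with a toxic phrase...
for _ in range(1000):
    bot.learn("<toxic phrase>")
# ...and the bot's output distribution is dominated by attacker-supplied content.
toxic = sum(r == "<toxic phrase>" for r in bot.replies)
print(f"{toxic}/{len(bot.replies)} candidate replies are now attacker-supplied")

The standard mitigation is a content filter applied before any utterance is stored for reuse, a safeguard Tay appears to have lacked or applied only weakly.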

A few days later, Microsoft accidentally re-released Tay on Twitter while testing it, only for the bot to get stuck in a repetitive loop, tweeting 'You are too fast, please take a rest'.

Microsoft pulled Tay a few hours later and apologised.

Operator: Microsoft

Developer: Microsoft

Country: USA

Sector: Media/entertainment/sports/arts

Purpose: Train language model

Technology: Chatbot; NLP/text analysis; Deep learning; Machine learning

Issue: Bias/discrimination - race, ethnicity, gender, religion; Safety; Ethics

Transparency: Governance; Black box

Page info
Type: System
Published: February 2023