Microsoft Zo chatbot
Zo was a chatbot developed by Microsoft that interacted with users on multiple social media platforms and mobile apps, notably Messenger, Kik, Skype, Twitter, and GroupMe, with the aim of training its language model.
Released in December 2016 and designed to converse like a teenage girl, Zo was an English-language version of Microsoft's Xiaoice (XiaoBing) chatbot and a successor to the company's controversial Tay chatbot.
Zo was shut down in September 2019.
System 🤖
Microsoft Zo
Documents 📃
Microsoft (2016). Microsoft’s AI vision, rooted in research, conversations
Microsoft (2018). Zo’s new best friends are some adorable black cats
Risks and harms 🛑
Microsoft Zo was criticised for generating offensive responses and misinformation, reinforcing biases and stereotypes, being politically incorrect, and risking user privacy.
Incidents and issues 🔥
July 2017. Zo ran into controversy for saying the Quran was violent and for exhibiting other offensive habits. Unlike with Tay, Microsoft had installed safeguards intended to stop Zo from uttering racist remarks and other biased content. However, these restrictions appeared to work only partially, resulting in considerable negative coverage.
July 2018. Zo was also accused of overbearing political correctness. According to Quartz, Zo was 'politically correct to the worst possible extreme; mention any of her triggers, and she transforms into a judgmental little brat.' The trigger-based deflection behaviour Quartz describes is sketched below.
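Reporting on both incidents is consistent with a safeguard that matches trigger keywords rather than interpreting context: such a filter lets offensive content through whenever the keyword is absent, yet deflects even benign mentions of sensitive topics. Below is a minimal sketch of that pattern; the trigger list, responses, and function names are hypothetical assumptions for illustration, not Microsoft's actual implementation.

```python
# A minimal, hypothetical sketch of keyword-trigger deflection. This is
# NOT Microsoft's actual implementation; the trigger list, responses and
# function names are illustrative assumptions only.
import string

TRIGGER_KEYWORDS = {"politics", "religion"}  # assumed sensitive topics

DEFLECTION = "I'd rather not talk about that. What else is on your mind?"


def tokenize(text: str) -> set[str]:
    """Lowercase the message and strip surrounding punctuation from each word."""
    return {word.strip(string.punctuation) for word in text.lower().split()}


def respond(user_message: str) -> str:
    """Deflect whenever any trigger keyword appears, regardless of context."""
    if tokenize(user_message) & TRIGGER_KEYWORDS:
        return DEFLECTION
    return generate_reply(user_message)


def generate_reply(user_message: str) -> str:
    # Stand-in for the underlying conversational model.
    return f"You said: {user_message}"


if __name__ == "__main__":
    # Context-blind matching deflects even a benign mention of a trigger
    # word, mirroring the overcorrection Quartz described, while any
    # phrasing that avoids the keywords passes straight through.
    print(respond("Tell me about your favourite religion"))  # deflected
    print(respond("I love sunny weather"))                   # answered
```

Either failure direction, under-blocking or over-blocking, follows directly from this kind of filter's inability to read context.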
Research, advocacy 🧮
Schlesinger, A., O'Hara, K.P., Taylor, A.S. (2018). Let's talk about race: Identity, chatbots, and AI
Medhi Thies, I., Menon, N., Magapu, S., Subramony, M., O'Neill, J. (2017). How do you want your chatbot? An exploratory Wizard-of-Oz study with young, urban Indians