Microsoft Zo chatbot

Zo was a chatbot developed by Microsoft that interacted with users on multiple social media platforms and mobile apps, notably Messenger, Kik, Skype, Twitter, and GroupMe, with the aim of training its language model.

Released in December 2016 and programmed to behave like a teenage girl, Zo was an English-language version of Microsoft's Xiaoice (XiaoBing) chatbot and a successor to the company's controversial Tay chatbot.

Operator: Microsoft
Developer: Microsoft

Country: USA

Sector: Media/entertainment/sports/arts

Purpose: Train language model

Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning
Issue: Bias/discrimination - religion; Safety

Transparency: Governance

Risks and harms 🛑

Microsoft Zo was criticised for generating offensive responses and misinformation, reinforcing biases and stereotypes, being politically incorrect, and risking user privacy.

Zo ran into controversy after saying the Quran was violent and for exhibiting other offensive habits.

Unlike with Tay, Microsoft had installed safeguards intended to stop Zo from making racist and otherwise biased remarks. However, these restrictions appeared to work only partially, resulting in considerable negative coverage.

Zo was also accused of overbearing political correctness. According to Quartz, Zo was 'politically correct to the worst possible extreme; mention any of her triggers, and she transforms into a judgmental little brat.'

Zo was shut down in September 2019.

Research, advocacy 🧮

Page info
Type: System
Published: February 2023
Last updated: May 2024