Microsoft Copilot chatbot

Microsoft Copilot is a chatbot developed and operated by Microsoft and powered by OpenAI's GPT-4 large language model, Microsoft's Prometheus model and OpenAI’s text-to-image generative AI system DALL-E 3.

Launched as 'Bing Chat' in February 2023, the chatbot was renamed 'Microsoft Copilot' in September 2023 and rolled out across multiple Microsoft platforms.

Operator: Microsoft
Developer: Microsoft
Country: Global
Sector: Multiple
Purpose: Provide information, communicate
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning
Issue: Accuracy/reliability; Copyright; Mis/disinformation; Safety; Security
Transparency: Governance; Black box; Marketing

Risks and harms 🛑

Microsoft's Copilot chatbot has been criticised for perpetuating biases, and generating facilitating unethical or inappropriate content, including misinformation, amongst others.

Accuracy/reliability

Copilot is easily made to change personality; it also gets confused, repetitive, or prone to being provoked and belligerent if it is asked too many questions.

Mis- and disinformation

Like other generative AI systems, Copilot produces inaccurate, misleading, and false information.

Safety

Bing Chat has been criticised for its poor safety record. Amongst other things, the bot:  

Security

Bing Chat/Microsoft Copilot have been found to be susceptible to jailbreaking so-called 'prompt injections'. 

Transparency 🙈

Microsoft acknowledged that Bing Chat had many limitations and posed many risks, and published regular updates on what it was doing to get Bing Chat behave more in line with users' expectations.

It appeared Microsoft may have surreptitiously tested a prototype of Bing Chat using OpenAI's GPT-4 large language model on users in India and some other countries in the wild without informing them in November 2022.

Page info
Type: System
Published: February 2023
Last updated: March 2024