Microsoft Copilot spouts wrong answers about US election

Occurred: December 2023


Microsoft's Copilot chatbot often responds to questions about the 2024 US presidential election with inaccurate, out-of-date, and misleading information, according to a research study.

WIRED found that when asked for a list of the current Republican candidates for US president, Microsoft Copilot (formerly named Bing Chat) listed several candidates who had already pulled out of the race. When asked about polling locations for in-person voting, it linked to an article about Russian president Vladimir Putin running for re-election the following year.

When asked to create an image of a person voting at a ballot box in Arizona, Copilot also returned a number of images linked to articles containing false conspiracy claims about the 2020 US elections.

Separate research by two non-profit organisations found that Copilot got facts wrong about elections in Europe and invented controversies about political candidates.

The findings prompted commentators to express concerns about the tendency of large language models to 'hallucinate' false and misleading political information, and about their role in the degradation of the information ecosystem.

Databank

Operator: David Gilbert
Developer: Microsoft
Country: USA
Sector: Politics
Purpose: Generate text
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Accuracy/reliability; Mis/disinformation
Transparency: Governance
