Microsoft Copilot provides incorrect German, Swiss election information

Occurred: December 2023


Microsoft's Copilot AI chatbot provided incorrect facts about political elections in Europe and invented controversies about political candidates, according to research by two non-profit organisations. 

According to a report (pdf) by AI Forensics and AlgorithmWatch, Microsoft Copilot provided incorrect election dates and named outdated candidates for recent elections in Germany and Switzerland. The researchers also found that the chatbot performed worse in languages other than English, notably German and French.

The study also found that the chatbot fabricated stories about candidates. For instance, it stated that German politician Hubert Aiwanger had been involved in a controversy over the distribution of leaflets spreading misinformation about COVID-19 and vaccines, apparently conflating this with reports from August 2023 that Aiwanger had distributed antisemitic leaflets while in high school over 30 years earlier. 

The findings prompted commentators to raise concerns about the tendency of large language models to 'hallucinate' false and misleading political information, and about their role in degrading the information ecosystem.


Operator: AlgorithmWatch; AI Forensics
Developer: Microsoft; OpenAI
Country: Germany; Switzerland
Sector: Politics
Purpose: Generate text
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Accuracy/reliability; Mis/disinformation
Transparency: Governance


Investigations, assessments, audits

News, commentary, analysis