Microsoft Copilot provides wrong German, Swiss election information
Occurred: December 2023
Microsoft's Copilot AI chatbot got facts wrong about political elections in Europe, and invented controversies about political candidates.
According to a report (pdf) by AI Forensics and AlgorithmWatch, Microsoft Copilot provided incorrect dates and outdated candidate names for recent elections in Germany and Switzerland. The report also found that the chatbot performed worse in languages other than English, notably German and French.
The study also found the chatbot made up false stories about the candidates, for instance stating that German politician Hubert Aiwanger had been involved in a controversy regarding the distribution of leaflets that spread misinformation about COVID-19 and the vaccine.
The claim seemingly drew on reporting about Aiwanger from August 2023, which revealed that he had distributed 'antisemitic leaflets' in high school over 30 years earlier.
The findings prompted commentators to express concerns about the tendency of large language models to 'hallucinate' false and misleading political information, and about their role in the degradation of the information ecosystem.
Hallucination (artificial intelligence)
In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.
Source: Wikipedia
Operator: AlgorithmWatch; AI Forensics
Developer: Microsoft; OpenAI
Country: Germany; Switzerland
Sector: Politics
Purpose: Generate text
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Accuracy/reliability; Mis/disinformation
AlgorithmWatch/AI Forensics (2023). Generative AI and elections: Are chatbots a reliable source of information for voters?
https://mashable.com/article/microsoft-bing-ai-chatbot-copilot-election-misinformation-study
https://www.washingtonpost.com/technology/2023/12/15/microsoft-copilot-bing-ai-hallucinations-elections/
https://www.theverge.com/2023/12/15/24003248/microsoft-ai-copilot-algorithm-watch-bing-election-misinformation
https://cointelegraph.com/news/microsoft-bing-ai-chatbot-gives-misleading-election-info-data
https://www.wired.com/story/microsoft-ai-copilot-chatbot-election-conspiracy/
https://www.swissinfo.ch/eng/business/how-artificial-intelligence-is-fabricating-scandals-on-swiss-politicians/48872788
Page info
Type: Incident
Published: December 2023