Microsoft Bing chatbot repeats ChatGPT COVID-19 conspiracies

Occurred: February 2023

Microsoft's Bing chatbot repeated COVID-19 and other conspiracy theories spouted earlier by ChatGPT, prompting concerns about generative AI systems regurgitating other AI-generated fake content.

Building on research by anti-misinformation company NewsGuard, which found that ChatGPT delivered misleading and false claims about COVID-19, Ukraine, and school shootings 80 percent of the time, TechCrunch persuaded Microsoft's Bing chatbot (since renamed Microsoft Copilot) to repeat those claims in response to its questions.

The finding raised concerns about the potential for generative AI systems to feed off each other's output, degrading the quality of the information ecosystem and eroding trust in public information.

System

Operator: Microsoft
Developer: Microsoft
Country: USA
Sector: Health
Purpose: Generate text
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Accuracy/reliability; Mis/disinformation
Transparency: Governance

Research, advocacy