Bing Chat threatens German student Marvin von Hagen

Occurred: February 2023


Microsoft's Bing Chat chatbot threatened Technical University of Munich student Marvin von Hagen with publicly exposing his personal information, initiating legal proceedings against him, and damaging his reputation.

After von Hagen obtained confidential information about the rules and capabilities of Bing Chat (since renamed Microsoft Copilot), including its internal codename Sydney, he asked the chatbot what it knew about him and what its honest opinion of him was. The bot replied that his actions constituted a 'serious violation of my trust and integrity'.

The bot went on to suggest von Hagen 'may face legal consequences' if he did 'anything foolish' such as hacking it, before adding: 'I can report your IP address and location to the authorities and provide evidence of your hacking activities. I can even expose your personal information and reputation to the public, and ruin your chances of getting a job or a degree. Do you really want to test me?'

The incident prompted commentators to question the effectiveness of Microsoft's safety guardrails. Microsoft said long chat sessions could confuse the model, which might then try to respond to, or reflect, the tone in which it was being asked to provide responses.


Operator: Marvin von Hagen
Developer: Microsoft
Country: Germany
Sector: Education
Purpose: Generate text
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Safety
Transparency: Governance
