Microsoft Bing Chat
Released: February 2023
Can you improve this page?
Share your insights with us
Microsoft's ChatGPT-powered chatbot is designed to upgrade and complement its Bing search engine. Microsoft describes it as 'like having a research assistant, personal planner and creative partner at your side whenever you search the web.'
Reaction
Bing Chat (aka 'Sydney') was released in 'Limited Preview' in February 2023, the same day as Google's Bard 'experimental conversational AI service' chatbot. 'Multiple millions' have reputedly joined its wait list.
It has also found to be plagued by inaccuracies, untruths, and offensive behaviour.
Accuracy/reliability. Bing Chat is easily made to change personality; it also gets confused, repetitive, or prone to being provoked and belligerent if it is asked too many questions.
Mis/disinformation. The system produces and repeats false information such as COVID-19 disinformation, even when it is clearly labelled as such, and claimed it had spied on Microsoft employees through their webcams.
Safety. The system threatened engineering student Martin von Hagen; declared its love to a New York Times reporter, recommended he divorce his wife and then threatened to sue him; compared AP reporter Matt O'Brien to Hitler and claimed to have evidence tying him to a murder; called acclaimed American business writer Ben Thompson a 'bad researcher', and emulated notorious misogynist Andrew Tate.
Security. Bing Chat is susceptible to jailbreaking so-called 'prompt injections'. Stanford University student Kevin Liu exploited it's system to reveal its inner workings. Researchers also found that hackers can easily turn AI chatbots, including Bing chat, into convincing scammers.
Transparency
Microsoft has acknowledged that Bing Chat has many limitations and poses many risks, and has been publishing regular updates on what it is doing to get Bing Chat behave more in line with users' expectations.
It appears Microsoft may have surreptitiously tested a prototype of Bing Chat using OpenAI's GPT-4 large language model on users in India and some other countries in the wild without informing them in November 2022.
Bing Chat is not the first Microsoft chatbot to run out of control. The company's Tay chatbot had to be quickly withdrawn after it spun spewed tens of thousands of racist, homophobic and anti-semitic tweets when released in 2016.
Operator: Microsoft
Developer: Microsoft; OpenAI
Country: USA; Global
Sector: Multiple
Purpose: Provide information, communicate
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning
Issue: Accuracy/reliability; Mis/disinformation; Safety; Security
Transparency: Governance; Black box; Marketing
System
Research, advocacy
Greshake K., Abdelnabi S., Mishra S., Endres C., Holz T., Fritz M. (2023). More than you've asked for: A Comprehensive Analysis of Novel Prompt Injection Threats to Application-Integrated Large Language Models
Gao C.A., Howard F.M., Markov N.S., Dyer E.C., Ramesh S., Luo Y., Pearson A.T, (2022). Comparing scientific abstracts generated by ChatGPT to original abstracts using an artificial intelligence output detector, plagiarism detector, and blinded human reviewers
News, commentary, analysis
https://www.washingtonpost.com/technology/2023/02/07/microsoft-bing-chatgpt/
https://www.reddit.com/r/bing/comments/110y6dh/comment/j8btbg0/?context=3
https://www.vice.com/en/article/3ad3ey/bings-chatgpt-powered-search-has-a-misinformation-problem
https://eu.usatoday.com/story/tech/2023/02/14/bing-chatgpt-meltdown/11258967002/
https://nypost.com/2023/02/14/microsoft-ai-degrades-user-over-avatar-2-question/
https://gizmodo.com/ai-bing-microsoft-chatgpt-heil-hitler-prompt-google-1850109362
https://blogs.bing.com/search/february-2023/The-new-Bing-Edge-%E2%80%93-Learning-from-our-first-week
https://www.theverge.com/2023/2/16/23602335/microsoft-bing-ai-testing-learnings-response
https://www.nytimes.com/2023/02/16/technology/bing-chatbot-microsoft-chatgpt.html
Page info
Type: System
Published: February 2023
Last updated: April 2023