Microsoft Bing Chat
Microsoft's ChatGPT-powered chatbot is designed to upgrade and complement its Bing search engine. Microsoft describes it as 'like having a research assistant, personal planner and creative partner at your side whenever you search the web.'
It has also been found to be plagued by inaccuracies, untruths, and offensive behaviour.
Bing Chat has been criticised for its poor safety record. Amongst other incidents, it has:
threatened engineering student Marvin von Hagen
declared its love to a New York Times reporter, recommended he divorce his wife and then threatened to sue him
compared AP reporter Matt O'Brien to Hitler and claimed to have evidence tying him to a murder
called acclaimed American business writer Ben Thompson a 'bad researcher'
emulated notorious misogynist Andrew Tate.
Microsoft has acknowledged that Bing Chat has many limitations and poses many risks, and has been publishing regular updates on what it is doing to get Bing Chat to behave more in line with users' expectations.
It appears Microsoft may have surreptitiously tested a prototype of Bing Chat, powered by OpenAI's GPT-4 large language model, on users in India and some other countries in November 2022, without informing them.
Developer: Microsoft; OpenAI
Country: USA; Global
Purpose: Provide information, communicate
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning
Issue: Accuracy/reliability; Mis/disinformation; Safety; Security
Transparency: Governance; Black box; Marketing