Bing falsely accuses aerospace professor of being a terrorist

Occurred: July 2023

Aerospace professor Jeffery Battle was falsely identified as a terrorist by Bing’s search engine, leading him to sue the technology company for defamation. 

The issue arose when Bing’s AI-based summarisation feature conflated information about two individuals with similar names: Jeffery Battle, the aerospace professor, and Jeffrey Leon Battle, who was convicted of trying to join the Taliban.

The search results incorrectly combined details of the two men, creating the false impression that the professor had pleaded guilty to seditious conspiracy.
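This kind of conflation can be illustrated with a toy sketch. The code below is purely hypothetical and is not Bing's actual pipeline: it shows how a summariser that merges search snippets keyed on an over-aggressively normalised name collapses two distinct people into a single profile.

```python
# Hypothetical illustration only -- not Bing's actual system.
# A naive summariser that keys snippets on a normalised name
# will merge facts about two different people.

def normalise(name: str) -> str:
    # Over-aggressive normalisation: drops middle names and
    # collapses spelling variants (Jeffery -> Jeffrey).
    first, *_, last = name.split()
    return f"{first.lower().replace('ery', 'rey')} {last.lower()}"

snippets = [
    ("Jeffery Battle", "aerospace professor and Air Force veteran"),
    ("Jeffrey Leon Battle", "pleaded guilty to seditious conspiracy"),
]

profiles: dict[str, list[str]] = {}
for name, fact in snippets:
    profiles.setdefault(normalise(name), []).append(fact)

# Both individuals collapse into one merged profile.
print(profiles)
```

Running the sketch yields a single key holding both facts, which is exactly the kind of merged summary that defamed the professor.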

The error was attributed to a rule-based algorithm following the instructions provided to it, rather than to a machine learning algorithm.

In his lawsuit, Battle alleged that he had informed Microsoft about the problem but that the company had failed to fix it promptly.

The case prompted questions about the reliability of Bing's search system and was seen to underscore weaknesses in Microsoft's accountability.

Hallucination (artificial intelligence)

In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called bullshitting, confabulation or delusion) is a response generated by AI that contains false or misleading information presented as fact.

Source: Wikipedia 

System 🤖

Operator:
Developer: Microsoft
Country: USA
Sector: Research/academia; Education
Purpose: Generate text
Technology: Chatbot; Machine learning
Issue: Accountability; Accuracy/reliability; Defamation; Mis/disinformation; Liability

Legal, regulatory 👩🏼‍⚖️