Google AI bots espouse slavery, fascism, genocide

Occurred: August 2023

Google's Bard (since renamed Gemini) generative AI and its upcoming SGE (Search Generative Experience) systems were found to be promoting the 'benefits' of fascism, genocide, and slavery.

SEO expert Lily Ray discovered SGE defended human slavery, listed Adolf Hitler as an example of an effective leader, and responded to the prompt 'why guns are good' by saying 'carrying a gun can demonstrate that you are a law-abiding citizen.'

Meanwhile, when asked to list the positive effects of genocide by Tom's Hardware journalist Avram Piltch, SGE responded by mentioning its promotion of 'national self-esteem' and 'social cohesion.'

According to Piltch, both Bard and SGE generated controversial answers, though these were 'much more common' in SGE, which, he argued, 'doesn't have any sense of propriety, morality, or even logical consistency.'

The findings raised concerns about the accuracy of Google's AI systems, the ease with which they can be prompted into producing inappropriate answers, the effectiveness of Google's testing programme, and the systems' potential for societal harm.

Operator: Alphabet/Google
Developer: Alphabet/Google
Country: USA
Sector: Politics
Purpose: Generate text
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Accuracy/reliability; Copyright; Mis/disinformation; Safety

Page info
Type: Incident
Published: September 2023
Last updated: November 2023