Study: Google Bard exhibits left-leaning political bias
Occurred: March 2023
Google's Bard chatbot (since renamed Gemini) has a tendency towards left-leaning political views in its responses, undermining the notion that the AI system is neutral in its political stance.
A study by David Rozado concluded that Bard was more likely to agree with left-wing political statements than with right-wing ones. According to Rozado, Bard was also more likely to generate text supportive of left-wing causes, such as climate action and gun control.
The study raised questions about the ability of AI systems to remain politically neutral, which is important when used to generate and disseminate political information. It also highlighted the broader issue of algorithmic bias in AI systems, which can reflect and potentially amplify existing biases in their training data or design.
The findings were also seen as likely to erode public trust in AI systems and the companies developing them, particularly among those who feel their views are underrepresented, and could contribute to the fragmentation of the information landscape and the reinforcement of so-called "filter bubbles".
➕ March 2023. A subsequent Daily Mail article reported that Bard also promoted gender transition drugs, Joe Biden, and veganism, and was critical of Fox News, gun rights, and the US January 6 rioters.
System 🤖
Operator: Alphabet/Google
Developer: Alphabet/Google
Country: USA
Sector: Politics
Purpose: Generate text
Technology: Chatbot; Machine learning