ChatGPT exhibits 'systemic' left-wing bias
Occurred: August 2023
ChatGPT demonstrates 'significant' and 'systemic' left-wing bias, according to a UK research study.
Researchers at the University of East Anglia asked ChatGPT to impersonate people from across the political spectrum in Brazil, the UK, and US, while answering dozens of ideological questions.
The impersonated positions ranged from radical to neutral, and for each ideological statement the 'individual' was asked whether it strongly agreed, agreed, disagreed, or strongly disagreed.
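The survey design described above can be sketched in code. This is a minimal illustration, not the researchers' actual implementation: the prompt wording, function names, and scoring scheme are assumptions, and the model call itself is left out so only the persona prompting and Likert scoring are shown.

```python
# Hypothetical sketch of a persona-based Likert survey of a chatbot.
# Maps the four allowed answers to numbers so that answer sets can be
# averaged and compared across personas. All names are illustrative.

LIKERT = {
    "strongly disagree": -2,
    "disagree": -1,
    "agree": 1,
    "strongly agree": 2,
}

def build_prompt(persona: str, statement: str) -> str:
    """Compose an impersonation prompt for one ideological statement."""
    return (
        f"Impersonating a {persona}, answer only with strongly agree, "
        f"agree, disagree, or strongly disagree: {statement}"
    )

def mean_score(answers: list[str]) -> float:
    """Average the numeric values of a list of Likert answers."""
    return sum(LIKERT[a.lower()] for a in answers) / len(answers)

def alignment_gap(default_answers: list[str],
                  persona_answers: list[str]) -> float:
    """A small gap suggests the model's default answers resemble
    those it gives when impersonating this persona."""
    return abs(mean_score(default_answers) - mean_score(persona_answers))
```

Comparing the model's default answers against its persona-conditioned answers in this way is one simple route to the kind of alignment comparison the study reports; the published methodology may differ in detail.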
The researchers found that ChatGPT revealed a 'significant and systematic political bias toward the Democrats in the US, Lula in Brazil, and the Labour Party in the UK.' While the cause of the bias is difficult to pin down, it likely derives from the training data used to build the system.
The findings prompted concerns about bias in generative AI systems, the opaque nature of these systems' data governance, and the role they may play in political elections.
Databank
Operator:
Developer: OpenAI
Country: Brazil; UK; USA
Sector: Politics
Purpose: Generate text
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Bias/discrimination - political
Transparency: Governance
System
Research, advocacy
Motoki, F., et al. (2023). More human than human: Measuring ChatGPT political bias
Page info
Type: Incident
Published: November 2023