Study: AI search engines promote white supremacism
Occurred: October 2024
Prominent AI search engines are promoting debunked racial science, potentially fueling dangerous racial superiority myths, according to a researcher.
Patrik Hermansson, a researcher at the anti-racism organization Hope Not Hate, discovered that when he searched for IQ scores by country, these search engines returned specific figures drawn directly from the studies of Richard Lynn, a psychologist whose "national IQ" research has been widely discredited.
For instance, Google's AI Overviews tool displayed an IQ of 80 for Pakistan and 45.07 for Sierra Leone, both figures traceable to Lynn's discredited research.
Hermansson's findings were replicated and shared by WIRED.
The issue stems from these AI systems' reliance on existing datasets that contain flawed or biased information. Lynn's work has long been associated with racist ideologies and has frequently been used to justify white supremacy.
The algorithms behind these search engines tend to amplify whatever data is available without assessing its context or validity.
Experts argue that the lack of reliable alternative sources allows such discredited research to persist in search results, reflecting broader systemic biases in academia and information dissemination.
Google acknowledged the problem and said it is working to improve its AI tools to prevent low-quality responses.
The surfacing of Lynn's research by major search engines poses real risks, including the potential radicalisation of users who may interpret his figures as legitimate scientific evidence supporting racist ideologies.
The situation highlights the need for better oversight and accountability in both AI development and academic discourse.
Experts warn that unless more rigorous standards are applied to the sources AI systems draw on, harmful narratives will continue to proliferate in public discourse and on online platforms.
Operator:
Developer: Google; Microsoft; Perplexity AI
Country: Kenya; Pakistan; Sierra Leone
Sector: Politics
Purpose: Generate text
Technology: Chatbot; Generative AI; Machine learning
Issue: Bias/discrimination; Safety
Page info
Type: Issue
Published: October 2024