US mortgage credit score data economic, racial bias 

Occurred: August 2021


Laura Blattner of Stanford University and Scott Nelson of the University of Chicago have found that predictive tools used to assess mortgage applications are biased against low-income borrowers and ethnic minorities.

Using consumer data and artificial intelligence to test different credit-scoring models, the two found that biased outcomes stem less from the scoring algorithms themselves than from the limited accuracy of the underlying data: low-income and minority borrowers tend to have sparse credit histories, which makes their scores noisier and less predictive of creditworthiness.
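The mechanism described above — sparse credit histories producing noisier, less predictive scores even when the model is identical for everyone — can be illustrated with a toy simulation. Everything here (the noise model, the `thin_file_auc` helper, the numbers) is a hypothetical sketch, not the authors' method or data:

```python
import numpy as np

rng = np.random.default_rng(0)

def auc(scores, labels):
    """Rank-based AUC: probability a defaulter scores above a non-defaulter."""
    order = scores.argsort()
    ranks = np.empty_like(order, dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def thin_file_auc(history_len, n=50_000):
    """Predictive accuracy of a credit score whose measurement noise
    shrinks with the length of the borrower's credit history."""
    true_risk = rng.normal(size=n)                  # latent repayment propensity
    default = rng.normal(size=n) + true_risk > 1.5  # simulated default outcome
    noise = rng.normal(scale=1.0 / np.sqrt(history_len), size=n)
    score = true_risk + noise                       # observed credit score
    return auc(score, default)

# The same scoring rule ranks thick-file borrowers (long histories)
# more accurately than thin-file borrowers (short histories).
print(f"long history AUC:  {thin_file_auc(100):.3f}")
print(f"short history AUC: {thin_file_auc(2):.3f}")
```

In this sketch the disparity arises entirely from the data: both groups are scored by the same formula, yet the thin-file group's score is a worse predictor of default, mirroring the paper's conclusion that data sparsity, not algorithmic design, drives much of the bias.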

Operator: Unnamed
Developer: Unnamed
Country: USA
Sector: Banking/financial services
Purpose: Calculate credit score; Predict loan default
Technology: Credit score algorithms
Issue: Accuracy/reliability; Bias/discrimination - economic, racial
Transparency: Black box