Study: US mortgage loan assessment tools suffer from economic, racial bias

Occurred: August 2021

US mortgage loan assessment tools may suffer from economic and racial bias, according to Stanford Graduate School of Business researchers.

Laura Blattner and Scott Nelson used artificial intelligence to test alternative credit-scoring models, finding that the predictive tools were between 5 and 10 percent less accurate for lower-income families and minority borrowers than for higher-income and non-minority groups.
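The kind of comparison the study describes can be sketched as measuring a model's predictive accuracy separately for each borrower group. The sketch below is purely illustrative: the data, group labels, and the size of the gap are invented for demonstration and are not drawn from the Blattner and Nelson study.

```python
# Illustrative sketch: comparing a credit model's predictive accuracy
# across borrower groups. All values here are toy data, not study results.

def accuracy(predictions, outcomes):
    """Fraction of predictions that match actual repayment outcomes."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    return correct / len(predictions)

# Toy labels: 1 = predicted/actual default, 0 = repaid.
group_a_pred = [0, 0, 1, 0, 1, 0, 0, 1, 0, 0]
group_a_true = [0, 0, 1, 0, 1, 0, 0, 1, 0, 1]  # model is right 9/10 times

group_b_pred = [0, 1, 1, 0, 1, 0, 0, 1, 0, 0]
group_b_true = [0, 1, 1, 0, 1, 0, 1, 1, 0, 1]  # model is right 8/10 times

gap = accuracy(group_a_pred, group_a_true) - accuracy(group_b_pred, group_b_true)
print(f"accuracy gap: {gap:.0%}")  # a positive gap means the model is
                                   # less reliable for group B
```

A per-group accuracy gap like this can arise even when the scoring rule is identical for everyone, which is the study's central point: the disparity comes from the data, not the algorithm.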

The researchers pointed out that the issue was not that the credit score algorithms themselves are biased against disadvantaged borrowers. Instead, the underlying data was less accurate in predicting creditworthiness for these groups, often because these borrowers had limited credit histories.

A “thin” credit history will in itself lower a person’s score, because lenders prefer more data to less. But it also means that one or two small dings, such as a delinquent payment many years in the past, can cause outsized damage to a person’s score.

The study highlighted the need for further investigation and potential reform in the mortgage lending industry to ensure fairness and equality.

System

Country: USA
Sector: Banking/financial services
Purpose: Calculate credit score; Predict loan default
Technology: Credit score algorithms
Issue: Accuracy/reliability; Bias/discrimination - economic, racial
Transparency: Black box

Page info
Type: Incident
Published: August 2021
Last updated: June 2024