SafeRent tenant screening

SafeRent Solutions (formerly CoreLogic) is a US-based company that uses an algorithm-based scoring system to assess and predict the risk of tenants not paying rent, terminating their lease, or causing damage to the property.

The company says its proprietary SafeRent Score 'leverages data from multi-family rental debt, sub-prime credit, eviction history, credit report and more,' and claims it is 'more reliable than a standard credit score for evaluating your applicants.'
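
SafeRent does not publish its model, but the description above suggests a score built from weighted applicant risk signals. Purely as an illustration of that kind of scoring, the following Python sketch combines the named data categories; every field name, weight, and scale here is an assumption, not SafeRent's actual method.

# Hypothetical sketch of a tenant risk score built from the data
# categories SafeRent names publicly. All weights, fields, and the
# 0-1000 scale are illustrative assumptions, not SafeRent's model.
from dataclasses import dataclass

@dataclass
class Applicant:
    rental_debt_usd: float   # outstanding multi-family rental debt
    subprime_accounts: int   # number of sub-prime credit accounts
    evictions: int           # eviction filings on record
    credit_score: int        # standard credit bureau score

def tenant_risk_score(a: Applicant) -> float:
    """Return a 0-1000 score; higher means lower predicted risk."""
    score = 1000.0
    score -= min(a.rental_debt_usd / 100, 300)    # cap the debt penalty
    score -= 50 * a.subprime_accounts
    score -= 150 * a.evictions
    score -= max(0, 650 - a.credit_score) * 0.5   # below-prime credit
    return max(score, 0.0)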

CoreLogic and SafeRent have been dogged by legal complaints and lawsuits regarding their automated decision-making systems.

System databank

Operator: WinnResidential

Developer: SafeRent Solutions/CoreLogic Rental Property Solutions

Country: USA

Sector: Business/professional services

Purpose: Assess tenant credibility

Technology: Prediction algorithm

Issue: Bias/discrimination - race, ethnicity, disability, national origin

Transparency: Governance; Black box; Complaints/appeals

CrimSAFE algorithm

In August 2018, the Connecticut Fair Housing Center and the National Housing Law Project filed a lawsuit against CoreLogic for allegedly violating the Fair Housing Act by disproportionately disqualifying African-American and Latino applicants from securing housing based on the discriminatory use of criminal records as rental criteria.

The suit was filed on behalf of Carmen Arroyo, whose son Mikhail was injured in a July 2015 accident that left him unable to speak, walk, or care for himself, and who was not allowed to move in with his mother due to a CoreLogic 'CrimSAFE' background report stating that Mikhail had a 'disqualifying [criminal] record' in the form of a shoplifting charge that was later dropped.

SafeRent describes CrimSAFE as an 'automated tool [that] processes and interprets criminal records and notifies leasing staff when criminal records are found that do not meet the criteria you establish for your community.'
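
Taken at face value, that description is a criteria-matching filter: leasing staff define disqualifying record categories, and the tool flags any matching records. The sketch below is a hypothetical reading of that behaviour (all names and defaults are assumptions); note how, when criteria match charge categories rather than dispositions, a later-dropped charge such as Mikhail Arroyo's still triggers a flag.

# Hypothetical sketch of a CrimSAFE-style criteria check; the fields,
# defaults, and logic are assumptions based only on the quoted description.
from dataclasses import dataclass
from datetime import date

@dataclass
class CriminalRecord:
    category: str      # e.g. "theft", "assault"
    disposition: str   # e.g. "convicted", "dismissed", "pending"
    record_date: date

def flag_records(records, disqualifying_categories,
                 lookback_years=7, require_conviction=False):
    """Return records matching the community-set criteria."""
    cutoff_year = date.today().year - lookback_years
    hits = []
    for r in records:
        if r.record_date.year < cutoff_year:
            continue                      # outside the lookback window
        if r.category not in disqualifying_categories:
            continue                      # category not disqualifying
        if require_conviction and r.disposition != "convicted":
            continue                      # ignore non-convictions
        hits.append(r)
    return hits

# With charge-based criteria (require_conviction=False), a dismissed
# shoplifting charge still matches and the applicant is flagged.
records = [CriminalRecord("theft", "dismissed",
                          date(date.today().year - 1, 7, 1))]
print(flag_records(records, {"theft"}))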

Arroyo was unable to challenge the decision, as the landlord WinnResidential and CoreLogic refused to provide the Arroyos with a copy of the information relied on to make the screening decision, information to which they were entitled under US federal law.

The trial concluded in November 2022, and the case awaits the judge's decision.

SafeRent Score

In May 2022, a class-action lawsuit alleged that SafeRent discriminates against Black and Hispanic rental applicants because its algorithm relies on factors such as credit history and non-tenancy-related debts, which disproportionately disadvantage Black and Hispanic applicants, while failing to consider factors such as their use of Department of Housing and Urban Development (HUD) housing vouchers.
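
One standard way to quantify the disparate impact alleged here is the adverse impact ratio, which compares approval rates between groups against the 'four-fifths' rule of thumb from US employment-selection guidance. The figures below are invented for illustration and are not drawn from the complaint.

# Adverse impact ratio: approval rate of the protected group divided by
# that of the reference group; values below 0.8 are commonly read as
# evidence of disparate impact. Example numbers are hypothetical.
def approval_rate(outcomes):
    """Fraction of applicants approved (True) in a list of outcomes."""
    return sum(outcomes) / len(outcomes)

def adverse_impact_ratio(protected, reference):
    return approval_rate(protected) / approval_rate(reference)

# Hypothetical: 50% approval for voucher-holding applicants vs 80%
# for others -> ratio 0.625, below the 0.8 threshold.
print(adverse_impact_ratio([True] * 5 + [False] * 5,
                           [True] * 8 + [False] * 2))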

The complaint notes that SafeRent does not disclose all of the data it considers in its scoring, or how that data is weighted. This keeps the system's inner workings hidden and makes it impossible for housing providers to reach an independent decision on the merits of applicants; they can only accept SafeRent's calculations.

The suit was brought on behalf of Mary Louis, an indebted 54-year-old Black woman who had been denied an apartment because her SafeRent Score did not take into account her housing voucher, which would have covered nearly 70% of her rent.

In a January 2023 statement filed with the court, the US Department of Justice and Department of Housing and Urban Development (HUD) warned that 'Housing providers and tenant screening companies that use algorithms and data to screen tenants are not absolved from liability when their practices disproportionately deny people of color access to fair housing opportunities.'

Page info
Type: System
Published: January 2023