US-based credit reporting company TransUnion has agreed to a USD 11.5 million settlement for violating the US Fair Credit Reporting Act (FCRA) by wrongly reporting criminal and/or landlord-tenant eviction records to third parties, based on a machine learning-powered automated system designed to flag 'higher-risk' property tenants.
The system, together with others like it, is also increasing costs and barriers to housing, according to a 2022 report from the US Consumer Financial Protection Bureau.
Described as an 'enhanced analytics screening model', TransUnion Rental Screening Solutions' ResidentScore system was found to have falsely accused tenants of littering and other alleged misdemeanours on the basis of inaccurate, misleading, or outdated data, resulting in tenants being unfairly declined when applying for rental properties and losing their application fees.
In May 2023, 15 US state attorneys general urged regulators to ensure that 'applicants for housing have access to all the data that is being used to make determinations of their tenant "worthiness"'.
By contrast, the US Consumer Data Industry Association, a lobbying group for the screening and credit reporting industry, has been actively trying to persuade states not to introduce legislation increasing transparency in the development and use of AI, on the basis that 'the marketplace itself inherently regulates AI systems'.