Algorithm misses gambling addict red flags

Occurred: April 2021

A gambling addict who committed suicide in April 2021 after racking up large debts had been categorised as a 'low-risk' customer by a Betfair algorithm that had 'found nothing in his betting patterns that would trigger human intervention that might have restricted his gambling.'

Luke Ashton, from Leicester, UK, was offered a free bet by gambling company Betfair and died after placing over 100 bets a day and building up debts of GBP 18,000.

Ashton's lawyer said the company relied on a machine learning algorithm that analysed 277 elements of its customers' betting activity each day to detect problem gamblers, who would then be telephoned by its player protection team.
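
Betfair's 277-feature model is not public; the Python sketch below only illustrates the general pattern such a system follows: score a handful of daily activity signals and refer customers above a threshold to a human protection team. Every feature name, weight, and threshold here is hypothetical, not Betfair's.

# Minimal illustrative sketch of threshold-based problem-gambling triage.
# All feature names, weights, and thresholds are hypothetical assumptions;
# Betfair's actual model analysed 277 elements and is not public.
from dataclasses import dataclass

@dataclass
class DailyActivity:
    bets_placed: int      # bets in the last 24 hours
    net_loss_gbp: float   # money lost in the last 24 hours
    deposits: int         # deposits made in the last 24 hours
    night_sessions: int   # sessions between midnight and 6am

def risk_score(a: DailyActivity) -> float:
    """Weighted sum of capped activity signals, scaled to roughly 0-1."""
    return (
        0.4 * min(a.bets_placed / 100, 1.0)     # 100+ bets/day saturates this signal
        + 0.3 * min(a.net_loss_gbp / 500, 1.0)
        + 0.2 * min(a.deposits / 10, 1.0)
        + 0.1 * min(a.night_sessions / 3, 1.0)
    )

def triage(a: DailyActivity, threshold: float = 0.7) -> str:
    """Route customers scoring above the threshold to human review."""
    return "refer_to_player_protection" if risk_score(a) >= threshold else "low_risk"

# A pattern like Ashton's reported activity (100+ bets a day) scores high
# on the bet-frequency signal alone under these hypothetical weights.
print(triage(DailyActivity(bets_placed=120, net_loss_gbp=300.0, deposits=6, night_sessions=2)))

A system of this shape fails in the way the inquest described whenever the weights or threshold are tuned so that a spike in activity still lands below the referral line, meaning no human ever sees the case.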

He also said that Ashton had 'self-excluded' as high-risk on occasions in 2013, 2014 and 2016. Richard Clarke, managing director of customer relations for Betfair parent company Flutter UKI, told the court that Mr Ashton had been sent eight automated, generic 'awareness' emails by the company.

Coroner Ivan Cartwright concluded that Betfair had failed to meaningfully interact or intervene when Mr Ashton's gambling activity spiked.

Operator: Flutter UKI/Betfair
Developer:  
Country: UK
Sector: Gambling
Purpose: Detect customer risk; Track customer data
Technology: Machine learning
Issue: Accuracy/reliability; Safety
Transparency: Governance; Black box