Investigation finds Match Group dating app AI tools fail to ban rapists
Occurred: February 2025
Match Group's AI-powered systems fail to ban users accused of sexual assault even though the company knows who they are, and the company does not notify authorities, according to investigators.
An investigation by The Markup found that the systems Match Group, owner of the popular dating apps Tinder and Hinge, uses to keep sexual predators off its platforms are ineffective.
The investigation found that Match Group maintains records of users accused of sexual assault but does not take adequate measures to remove these individuals from its platforms or to inform law enforcement.
The investigation also found that Match Group has been aware of users accused of drugging, assaulting or raping dates since at least 2016.
In one case, Denver cardiologist Stephen Matthews was reported multiple times for rape on Hinge but remained active on the app and was even highlighted as a "Standout" profile.
In October 2024, Matthews was sentenced to 158 years in prison for drugging and sexually assaulting women he had met on dating apps.
The incident stemmed from Match Group's lack of effective systems for keeping bad actors off its apps, and from its poor transparency.
The company's loosely defined procedures force employees to rely on their own judgement when handling reports of sexual assault.
Additionally, Match Group has not published its promised Transparency Report for the United States, despite announcing its commitment to do so in 2020.
For those directly impacted, the failure to act on sexual assault reports leaves users at risk of encountering known predators on dating apps.
Victims often have to contact the company repeatedly before it takes action, and in some cases their reports are ignored or dismissed. For society, the incident highlights the need for greater accountability and regulation of dating apps to protect users from sexual predators.
It also raises broader questions about the responsibility of tech companies in preventing and responding to sexual violence facilitated through their platforms.
Sentinel
Operator:
Developer: Match Group
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Detect sex offenders
Technology: Machine learning
Issue: Accountability; Accuracy/reliability; Safety; Transparency
Page info
Type: Incident
Published: February 2025