PimEyes includes 'potentially explicit' kids photos in search results

Occurred: 


PimEyes was accused of making it disturbingly easy to find 'potentially explicit' photographs of children in its search engine results, raising fears about privacy and the tool's use by stalkers and predators.

An investigation by The Intercept using AI-generated photos of children found that PimEyes allowed anyone to search for images of kids scraped from across the internet, including from charity and educational websites, some of which provided personal details. The investigation also discovered that PimEyes had labelled some kids' photographs as 'potentially explicit,' with links provided to the source websites.

PimEyes says the service is only meant to be used for self-searches and is 'not intended for the surveillance of others.' Yet it allows subscribers to run up to 25 searches per day. PimEyes CEO Giorgi Gobronidze responded by saying that many of PimEyes's subscribers are women and girls searching for revenge porn images of themselves.

Databank

Operator: Mara Hvistendahl
Developer: PimEyes
Country: Global
Sector: Technology
Purpose: Identify individuals
Technology: Facial recognition
Issue: Governance; Privacy; Safety
Transparency: Governance