PimEyes used to identify anonymous porn stars
Occurred: September 2023
Page published: January 2024
A 'digital peeping Tom' used PimEyes to identify the real names of anonymous porn stars whose films he had watched. The episode effectively eliminated the 'professional firewall' relied on by adult performers, allowing anyone with a screenshot to link them to their real-world identities, social media accounts, and families.
According to an extract published in WIRED of journalist Kashmir Hill's book Your Face Belongs to Us, someone called 'David' 'was able to upload screenshots of women whose pornography he had watched and get photos of them from elsewhere on the web, a trail that sometimes led him to their legal names.'
'You find them on Facebook and see their personal pictures or whatever and it makes it more exciting,' David told Hill. 'It’s like the secret identity of Batman or Superman. You’re not supposed to know who this person is, they didn’t want you to know, and somehow you found out.'
The problem is driven by a lack of corporate accountability and the "wild west" nature of facial recognition technology:
Accessibility over ethics: Unlike Clearview AI, which is restricted to law enforcement, PimEyes is available to the general public. While the company’s Terms of Service prohibit searching for others, there is no technical mechanism to enforce this rule.
Transparency paradox: PimEyes claims it is a privacy tool designed to help people find where their own photos appear. However, its business model thrives on indexing as much data as possible, including sensitive sites, without seeking consent from the subjects of those photos.
"Whack-a-Mole" opt-outs: Accountability is limited because the "opt-out" process often requires users to submit their own biometric data to the company to be excluded, and even then, new images of the same person can reappear in subsequent crawls.
The incident raised questions about PimEyes' multi-purpose nature, the ease with which it can be used to identify and monitor third parties, and the quality and effectiveness of its governance. It also led to further calls for the system to be banned.
For adult performers: The "right to be forgotten" or to maintain separate identities has vanished. The risk of physical stalking and professional ruin has reached a "previously unimaginable" scale.
For society: This signifies the total erosion of public anonymity. It establishes a precedent where any stranger can snap a photo or video of a person in the street and immediately access their digital footprint.
Systemic risk: The commercialisation of such powerful AI suggests that unless strictly regulated, biometric data will continue to be used as a tool of control and harassment against marginalised communities and those in stigmatised professions.
Developer: PimEyes
Country: USA
Sector: Technology
Purpose: Identify individuals
Technology: Facial recognition
Issue: Accountability; Privacy/surveillance
AIAAIC Repository ID: AIAAIC1305