Everalbum covertly uses personal data to train facial recognition system

Occurred: May 2019

Cloud photo storage company Everalbum used its users' photographs to train its facial recognition system without informing them or letting them turn the system off. 

According to an NBC investigation, the photos people shared with the app were used to train Everalbum Inc's facial recognition system, which was then offered to private companies, law enforcement and military customers under different names, including “Ever AI” and later “Paravision.”

The company had initially used publicly available technology for basic facial recognition features but later developed its own algorithms using user images as training data.

Everalbum also stored user data indefinitely, despite telling users that their photos and videos would be deleted if their accounts were deactivated.

The finding prompted multiple privacy and civil rights organisations to accuse Everalbum and its brands of egregious privacy abuse and opaque, misleading marketing, and triggered a wave of complaints from its users.

Ever rebranded as Paravision days after NBC's report.

August 2020. Ever's cloud photo storage app closed, with the company blaming increased competition from Apple and Google.

January 2021. The furore prompted a complaint (pdf) by the US Federal Trade Commission (FTC).

May 2021. Everalbum settled (pdf) with the FTC, with Ever instructed to delete all user data harvested from its app and to delete 'any models or algorithms developed in whole or in part' using that data.

System 🤖

Operator: Everalbum/Ever AI/Paravision
Developer: Everalbum/Ever AI/Paravision
Country: USA
Sector: Consumer services
Purpose: Train facial recognition system
Technology: Facial recognition
Issue: Privacy; Surveillance
Transparency: Governance; Privacy; Marketing

Legal, regulatory 👩🏼‍⚖️

Page info
Type: Incident
Published: March 2022