VGG Face dataset used personal data without explicit consent from the individuals depicted
Occurred: June 2019
A widely used facial recognition dataset created by researchers at Oxford University's Visual Geometry Group (VGG) drew controversy for scraping images of people's faces from the internet without their explicit consent.
The use of such large-scale facial datasets for developing recognition technologies has sparked debates about privacy rights, surveillance and the potential misuse of facial recognition systems.
According to Adam Harvey of exposing.ai, at no point did the individuals whose personal details were collected give consent or receive information about how their data was being used, raising the question of whether images of public figures should be available for any organisation or person to use as they see fit.
The dataset has since been removed from Oxford University's website.
Operator: ChaLearn; Chinese Academy of Sciences; Delft University of Technology; Simula Research Laboratory; University of Applied Sciences & Arts Western Switzerland; University of California, Berkeley; Universitat Autònoma de Barcelona
Developer: University of Oxford
Country: UK
Sector: Research/academia
Purpose: Develop facial recognition systems
Technology: Database/dataset; Facial recognition
Issue: Copyright; Ethics/values; Privacy
Transparency: Privacy
Harvey, A., and LaPlace, J. (2019). Exposing.ai
Page info
Type: Issue
Published: July 2024