UnConstrained College Students (UCCS)
The UnConstrained College Students (UCCS) dataset is a database comprising 16,000 photographs of approximately 1,700 students going about their lives at the University of Colorado Colorado Springs, created for 'face detection and recognition research towards surveillance applications'.
The photographs were taken on 20 different days between February 2012 and September 2013 using a 'long-range high-resolution surveillance camera without their knowledge,' according to Professor Terry Boult, the University of Colorado computer scientist who led the project.
The project was initially funded by the US Office of Naval Research's Multidisciplinary University Research Initiative (MURI) program, and later by other US government entities.
Facial recognition system
A facial recognition system is a technology potentially capable of matching a human face from a digital image or a video frame against a database of faces.
Source: Wikipedia 🔗
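To make the definition above concrete, the sketch below illustrates, under simplified assumptions, the matching step such a system performs: a probe face is reduced to an embedding vector and compared against a gallery (database) of enrolled embeddings, with the best match accepted only if it passes a similarity threshold (the 'open-set' rejection idea referenced in the challenge paper listed under Documents). The gallery, identity labels, embedding dimension, and threshold are hypothetical placeholders, and random vectors stand in for the output of a real face-detection and feature-extraction model; this is not the method used by the UCCS dataset's authors.

```python
import numpy as np

# Minimal sketch of the matching step in a facial recognition system:
# a probe face embedding is compared against a gallery (database) of
# enrolled embeddings using cosine similarity. The embeddings here are
# random placeholders; a real system would produce them with a face
# detector and a trained feature-extraction model.

rng = np.random.default_rng(0)
EMBEDDING_DIM = 128  # illustrative embedding size (assumption)

# Hypothetical gallery: identity label -> stored face embedding
gallery = {
    "person_a": rng.normal(size=EMBEDDING_DIM),
    "person_b": rng.normal(size=EMBEDDING_DIM),
    "person_c": rng.normal(size=EMBEDDING_DIM),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, threshold: float = 0.5) -> str | None:
    """Return the best-matching gallery identity, or None if no score
    exceeds the acceptance threshold (open-set rejection)."""
    best_label, best_score = None, -1.0
    for label, enrolled in gallery.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

# Example: a probe that is a noisy copy of person_b's enrolled embedding
probe = gallery["person_b"] + 0.1 * rng.normal(size=EMBEDDING_DIM)
print(identify(probe))  # expected to print "person_b"
```

In an open-set setting such as surveillance, most probe faces belong to no one in the gallery, so the acceptance threshold, rather than the best match alone, decides the output.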
Dataset 🤖
Website 🔗
Released: 2013
Availability: Available
Purpose: Train facial detection and facial recognition systems
Type: Database/dataset
Technique: Computer vision; Facial recognition; Machine learning
Documents 📃
Günther M., Hu P., Herrmann C., Chan C.-H., Jiang M., Yang S., Dhamija A., Ramanan D., Beyerer J., Kittler J., Al Jazaery M., Nouyed M.I., Guo G., Stankiewicz C., Boult T. (2017). Unconstrained Face Detection and Open-Set Face Recognition Challenge
Transparency and accountability 🙈
At the time, University of Colorado students were not informed that they were under surveillance, nor were they told that images of them would be used to train military and intelligence agency facial recognition systems.
In addition, no information was provided as to how they could opt out or have their photographs removed from the dataset.
Risks and harms 🛑
The UnConstrained College Students (UCCS) dataset is seen to have significant transparency, consent, and privacy limitations.
Lack of consent. Over 1,700 students and pedestrians were photographed using a long-range high-resolution surveillance camera without their knowledge or consent, raising ethical concerns about privacy and informed participation.
Unclear data collection process. While some details are provided about the camera setup and timing of photos, the full extent of the data collection and curation process is not entirely transparent.
Limited demographic information. The dataset lacks comprehensive information about the subjects' demographics, making it difficult to assess potential biases or representativeness.
Restricted access. The dataset has been temporarily suspended, limiting the ability of researchers to independently verify or analyse its contents.
Insufficient documentation. There appears to be a lack of clear guidelines or restrictions on how the dataset can be used, potentially leading to misuse or unethical applications of the biometric data.
Funding sources. The dataset's creation was primarily funded by United States defense and intelligence agencies, which raises questions about the intended uses and potential biases in the data collection process.
Lack of opt-out mechanism. The authors fail to provide any option for students to opt out or be removed from the dataset, further compromising individual privacy rights.
Incidents and issues 🔥
Research, advocacy 🧮
Cheng Z., Zhu X., Gong S. (2018). Surveillance Face Recognition Challenge
Investigations, assessments, audits 👁️
Harvey, A., LaPlace, J. (2019). Exposing.ai
Murgia M., Financial Times (2019). Who’s using your face? The ugly truth about facial recognition
Page info
Type: Data
Published: February 2023
Last updated: October 2024