Proctorio found to use 'racist' algorithms to detect students' faces

Occurred: April 2021


A facial detection system run by online proctoring service Proctorio has been found to be 'racist'. College student Lucy Satheesan reverse-engineered the exam software and discovered that Proctorio uses a facial detection model that failed to recognise Black faces 57 percent of the time.

By assessing the code behind Proctorio's extension for the Chrome web browser, Satheesan discovered that the file names associated with the tool's 'proprietary' facial detection function were identical to those published by the open-source computer vision library OpenCV.
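The comparison Satheesan made can be sketched in a few lines: OpenCV ships its pre-trained Haar cascade detection models as XML files with well-known names (such as haarcascade_frontalface_default.xml), so matching an extension's bundled file names against that list is enough to suggest the 'proprietary' models are actually OpenCV's. This is an illustrative sketch only; the extension file names below are hypothetical, not Proctorio's actual bundle contents.

```python
# Sketch: flag bundled model files whose names match OpenCV's
# published pre-trained Haar cascade files. The OpenCV names are
# real files shipped with the library; the extension file list is
# a hypothetical example.
OPENCV_CASCADES = {
    "haarcascade_frontalface_default.xml",
    "haarcascade_frontalface_alt.xml",
    "haarcascade_eye.xml",
    "haarcascade_upperbody.xml",
}

def shared_model_files(extension_files, known_cascades=OPENCV_CASCADES):
    """Return, sorted, the file names an extension shares with OpenCV."""
    return sorted(set(extension_files) & known_cascades)

# Hypothetical extension bundle: one OpenCV cascade plus app code.
bundle = ["proctoring.js", "haarcascade_frontalface_default.xml"]
print(shared_model_files(bundle))
```

A name match alone is circumstantial, which is why the finding was treated as confirming, rather than proving, students' accuracy complaints.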

The discovery appeared to confirm complaints by students that Proctorio's face detection system is inaccurate and unreliable. The company uses the technology to check whether a student is looking away from their screen, leaves the room, or is joined by another person in the frame - any of which could indicate cheating.

The Electronic Privacy Information Center (EPIC) had earlier filed a complaint accusing Proctorio and four other online test proctoring services of unfair and deceptive trade practices. EPIC said it was preparing to file a lawsuit unless the companies changed their practices.

Operator: Miami University; University of British Columbia; University of Illinois
Developer: Proctorio
Country: USA
Sector: Education
Purpose: Detect faces
Technology: Facial detection; Computer vision; Machine learning
Issue: Bias/discrimination - race, ethnicity
Transparency: Governance; Black box; Marketing


Page info
Type: Incident
Published: January 2023
Last updated: September 2023