Proctorio 'racist' facial detection

Occurred: April 2021


College student Lucy Satheesan discovered that online proctoring service Proctorio's 'proprietary facial detection' performs identically to OpenCV, an open-source computer vision library.

Facial recognition and detection software built with OpenCV has previously been found to be biased; Satheesan tested OpenCV's models and found they failed to detect faces in images labeled as including Black faces 57 per cent of the time.

Proctorio uses facial detection to determine whether a student looks away from their screen, leaves the room, or is joined by another person in the frame - any of which could indicate cheating.

The Electronic Privacy Information Center (EPIC) had previously filed a complaint (pdf) against Proctorio and four other online test proctoring services, accusing them of unfair and deceptive trade practices. EPIC also said it was preparing to file a lawsuit unless the companies changed their practices.

Satheesan told Vice: 'They use biased algorithms, they add stress to a stressful process … during a stressful time, they dehumanize students.'

Operator: Miami University; University of British Columbia; University of Illinois
Developer: Proctorio
Country: USA
Sector: Education
Purpose: Detect faces
Technology: Facial detection; Machine learning
Issue: Bias/discrimination - race, ethnicity; Security
Transparency: Governance; Black box; Marketing