SARI ('Sistema Automatico di Riconoscimento Immagini') facial recognition
SARI ('Sistema Automatico di Riconoscimento Immagini', or 'Automated System for Image Recognition') is a facial recognition system that enabled Italian police forces and sports authorities to identify criminals and criminal suspects.
SARI Enterprise was based on a database of 16 million mugshots, nine million of which belonged to people identified once by the police, while the other seven million were of individuals who had been stopped repeatedly. The system was piloted for eight months from 2017 by Italy's national police and rolled out in July 2018.
SARI Real-time used cameras installed in a particular geographical area and was capable of scanning individuals' faces in real time. These images were then compared with an Italian government watch-list database of up to 10,000 faces. The database was available to law enforcement upon request.
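The general approach behind this kind of real-time watch-list matching can be illustrated with a minimal sketch: a face captured by a camera is converted into an embedding vector and compared against pre-computed embeddings of the enrolled faces. The embedding size, threshold, data and names below are hypothetical stand-ins, not details of SARI's actual models or matching logic, which are not public.

```python
# Hypothetical sketch of matching a probe face against a watch-list of embeddings.
# All values are illustrative; SARI's real pipeline is not documented publicly.
import numpy as np

def cosine_similarity(probe: np.ndarray, gallery: np.ndarray) -> np.ndarray:
    """Cosine similarity between one probe vector and a matrix of gallery vectors."""
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return gallery @ probe

rng = np.random.default_rng(0)

# Watch-list of up to ~10,000 enrolled faces, each stored as a 512-d embedding
# (random stand-ins here; a real system would compute these from mugshot images).
watchlist = rng.normal(size=(10_000, 512))
watchlist_ids = [f"subject_{i}" for i in range(len(watchlist))]

# Embedding of a face captured by a camera (also a random stand-in).
probe = rng.normal(size=512)

scores = cosine_similarity(probe, watchlist)
best = int(np.argmax(scores))

THRESHOLD = 0.6  # illustrative; real deployments tune this to trade off false matches vs. misses
if scores[best] >= THRESHOLD:
    print(f"Possible match: {watchlist_ids[best]} (score {scores[best]:.2f})")
else:
    print("No match above threshold")
```

The threshold choice is central to the accuracy concerns raised about such systems: set too low, it produces false matches against innocent people; set too high, it misses genuine matches.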
System info 🔢
Operator: Polizia di Stato; Udinese Calcio; S.S.C. Napoli
Developer: Reco 3.26
Country: Italy
Sector: Govt - police; Govt - immigration; Media/entertainment/sports/arts
Purpose: Strengthen law enforcement
Technology: Facial recognition
Issue: Accuracy/reliability; Dual/multi-use; Privacy
Transparency: Governance; Legal; Marketing; Privacy
Risks and harms 🛑
SARI facial recognition was accused of being inaccurate and of jeopardising human rights and personal freedoms.
Transparency and accountability 🙈
The SARI facial recognition system was seen to suffer from several transparency and accountability limitations:
Public information. The deployment of SARI at Italian borders was criticised for its lack of transparency, with limited public information about how the system operated, the data it collected, and how this data was used. This opacity made it difficult for the public to understand the scope and purpose of the surveillance, raising concerns about potential rights violations.
Accountability. The system's use in monitoring migrants and asylum seekers was described as dehumanising, with critics pointing out that there was little accountability for how the data was handled or how decisions were made based on the system's outputs. This lack of accountability could lead to abuses and discrimination, particularly against marginalised groups.