Rio de Janeiro Oi facial recognition

Released: 2019


The Rio de Janeiro state government carried out a facial recognition test project in 2019 with the aim of identifying criminals and preserving public order. 

Operated by telecoms company Oi, the project was divided into two phases: first, limited to Copacabana during the 2019 carnival, and second, in the Maracanã neighbourhood and around Santos Dumont Airport from June to October 2019.

Despite no missing persons being found, a 90% error rate for the facial recognition technology, and a number of mistaken arrests, the authorities declared the first phase a success, citing five search and seizure warrants served, three arrest warrants issued, and three vehicles recovered. 

Copacabana mistaken identity

Police arrested a woman who was buying gold and silver on the streetside after she had been 'recognised' by the facial recognition system as Maria Lêda Félix da Silva, a criminal who had murdered her husband and was on the run from the police. As she did not have an identity document with her at the time of her arrest, the woman was taken to a police station, and was later released.

Maracanã mistaken arrests

During phase two of the project, Rio police arrested eleven people at matches at the Maracanã stadium. Under pressure from O Panóptico, a project that monitors facial recognition use in Brazil, the authorities were forced to admit (pdf) that seven of these arrests were false positives.

Racial bias 

The Rio de Janeiro facial recognition project, and the technology's use across Brazil more broadly, have raised concerns about poor accuracy, racial and ethnic discrimination, and a disproportionate impact on marginalised communities. 

According to O Panóptico, 90.5% of the 184 people arrested using facial recognition in Brazil in 2019 were Black, and most were detained for low-level crimes such as petty theft and robbery. 

Transparency

From the start, the Rio facial recognition project was shrouded in mystery. Details of the contract with Oi - which, it turned out, had been fined in 2014 for selling customer data without their consent - were conspicuously thin, as were details of the databases used to train and feed the system. In addition, no information was provided as to how people were identified, how long their information was retained, or who was permitted to access the system.

Operator: Civil Police of Rio de Janeiro State
Developer: Oi; Huawei
Country: Brazil
Sector: Govt - police
Purpose: Identify criminals, preserve public order
Technology: Facial recognition; Automated license plate/number recognition (ALPR/ANPR)
Issue: Accuracy/reliability; Bias/discrimination - race, ethnicity; Privacy; Surveillance
Transparency: Governance; Black box