Ukraine decision to use Clearview AI facial recognition draws concerns

Occurred: March 2022-


Ukraine's decision to use Clearview AI's facial recognition technology to identify Russian soldiers, prisoners of war, and undercover saboteurs prompted concerns about what happens if the system makes mistakes.

Ukraine's vice prime minister Mykhailo Fedorov told Reuters that the country's defence ministry is using the controversial system. He also said it would "dispel the myth of a 'special operation' in which there are 'no conscripts' and 'no one dies'", according to a message Fedorov posted on Telegram and noted by Forbes cybersecurity writer Thomas Brewster.

Despite broad support for Ukraine, human and civil rights experts are concerned that people could be wrongly identified, arrested, or worse should Clearview's system make a mistake. In theory, mistakes are more likely given the system will have to handle war-scarred faces.

A 2019 US Department of Energy study concluded that facial decomposition considerably reduced the technology's effectiveness, though a 2021 conference paper reported more promising results.

Operator: Ministry of Defence of Ukraine
Developer: Clearview AI
Country: Ukraine
Sector: Govt - defence
Purpose: Identify Russian combatants
Technology: Facial recognition
Issue: Accuracy/reliability; Dual/multi-use
Transparency: Governance; Black box

Page info
Type: Issue
Published: March 2022
Last updated: February 2024