Hamburg G20 Summit protests facial analysis database legality

Occurred: July 2018

Hamburg police's use of facial recognition software to investigate crimes during the protests against the G20 summit in July 2017 encroached on the fundamental rights of bystanders and other uninvolved people, according to the city's Commissioner for Data Protection and Freedom of Information (DPA).

At the centre of the row was a database of facial images of over 100,000 people collected by police during the summit from static and mobile video surveillance cameras, as well as from private photographs and videos taken during the demonstrations. 

The images were stored indefinitely on hard drives at the Hamburg police department, where they could be compared against images of known criminals and suspects using Videmo 360 facial recognition software.

In July 2018, Hamburg's DPA told the police that there was 'insufficient legal justification for the biometric analysis of faces that could justify such intensive encroachments on fundamental rights of the large part of bystanders and other completely uninvolved persons.'

However, in October 2019 an administrative court in Hamburg ruled that the DPA's order to delete the database had been unlawful. The Hamburg police, which had refused to delete the database, continued to make the case for using automated facial recognition at major future events in the city.

Operator: City of Hamburg
Developer: Videmo

Country: Germany

Sector: Govt - police

Purpose: Identify criminals

Technology: CCTV; Facial recognition
Issue: Privacy; Surveillance

Transparency: Governance