Occurred: January 2021
A voice and facial recognition app claiming to score how trustworthy a user is prompted privacy and human rights advocates to express concern about the product's accuracy and propensity for bias.
Japanese software company DeepScore uses a 10-question survey to enable lenders, insurance companies, and financial institutions to decide in real time whether people are lying, based on their gestures and tone of voice, at a claimed accuracy rate of around 70 percent and a 30 percent false negative rate.
Parity AI founder Dr. Rumman Chowdhury told Motherboard the app is 'at a minimum likely to discriminate against people with tics, anxiety, or who are neuroatypical.'
Others are concerned about DeepScore's privacy implications. Ioannis Kouvakas, a legal officer for Privacy International, told Motherboard he did not believe the company would be able to legally operate in the European Union due to the bloc's General Data Protection Regulation (GDPR).
DeepScore does not have a privacy policy on its website. It appears that many of its customers are located in countries with threadbare or non-existent privacy laws.
Page info
Type: Incident
Published: January 2023
Last updated: September 2023