Google Derm Assist dermatology app accused of racial bias

Occurred: April 2021

Google’s Derm Assist, an AI-powered app designed to analyse skin conditions, faced criticism over potential racial bias.

Derm Assist automatically analyses images of skin conditions, asks follow-up questions, and suggests possible causes. The app is being tested in the US and has been approved for use as a medical tool in Europe.

Google said the app can recognise 288 skin conditions. However, some doctors expressed concerns about the system's accuracy, in part due to poor image quality, and about the potential for over-diagnosis of skin cancers.

It also appeared that the system was trained and tested on a dataset that underrepresented people with dark skin tones. The training dataset consisted of 64,837 images of 12,399 patients located in two US states.

However, only 3.5 percent of these images came from patients with Fitzpatrick skin types V and VI, which represent brown skin and dark brown or black skin, respectively. The majority of the dataset comprised people with fair skin, darker white skin, or light brown skin.

The findings resulted in accusations of racial bias and drew attention to perceived discrimination and inequality within Google's own workforce.

Concerns were also raised about the potential for Google to use sensitive personal data for other purposes. Google responded by saying it would only save images to help train the Derm Assist algorithm if users gave explicit permission to do so.

Operator: Alphabet/Google
Developer: Alphabet/Google
Country: USA
Sector: Health
Purpose: Identify dermatological issues
Technology: Computer vision; Deep learning
Issue: Accuracy/reliability; Bias/discrimination - race; Privacy
Transparency: Governance; Privacy

Page info
Type: Incident
Published: December 2021