YouTube Kids recommends adult content, advertising

Occurred: 2015-


Google's launch of YouTube Kids was marred by a legal complaint filed by the Campaign for a Commercial-Free Childhood (CCFC), a coalition of children's and consumer advocacy groups, about 'disturbing' and 'harmful' content. The findings led to accusations of poor algorithmic design, inadequate oversight, and systemic corporate irresponsibility.

The complaint alleged that YouTube's content recommendation algorithm quickly exposed children to offensive and explicit sexual language, graphic adult discussions, jokes about paedophilia and drug use, and the modelling of unsafe behaviours such as lighting matches. It also found that children were being exposed to alcohol product advertising.

YouTube responded to the CCFC's legal complaint by saying: 'We use a combination of machine learning, algorithms and community flagging to determine content in the YouTube Kids app. The YouTube team is made up of parents who care deeply about this, and are committed to making the app better every day.'

At its launch, YouTube product manager Shimrit Ben-Yair had described YouTube Kids as the 'first step toward reimagining YouTube for families.'

Operator: Alphabet/Google/YouTube
Developer: Alphabet/Google/YouTube

Country: USA

Sector: Media/entertainment/sports/arts

Purpose: Engage children

Technology: Content recommendation system; Advertising management system; Machine learning
Issue: Safety; Oversight/review
Transparency: Governance; Black box; Marketing