Fable AI reader summary tells user to read more white authors

Occurred: January 2025


An AI-generated reader summary by book app Fable sparked outrage by advising a Black user to "surface for the occasional white author" after reading mostly Black authors in 2024.

What happened

Designed for book discussions and tracking, the Fable app used AI to create personalised year-end reading summaries for users.

One summary described a user's reading habits as diving "deep into the heart of Black narratives" while suggesting they read more white authors.

Multiple users flagged the offensive language, and similar problematic summaries relating to race, gender, sexuality and disability subsequently came to light.

Why it happened

The controversy stemmed from Fable's use of AI to generate "playful" summaries based on users' reading data without adequate human oversight or content filtering.

The company's head of product, Chris Gallello, admitted that their safeguards and offensive language filters had failed.

He also acknowledged that the summaries had been generated by AI, something that had not been made clear to users.

What it means

The incident resulted in a user backlash, with many vowing to delete the app.

The company apologised, implemented changes to its AI features and ultimately decided to remove the AI-generated summaries entirely.

More broadly, the incident highlights the importance of proper training, adequate human oversight and transparency when deploying AI systems.

System 🤖

Operator:
Developer: Fable
Country: USA
Sector: Media/entertainment/sports/arts
Purpose: Generate reader summaries
Technology: Machine learning
Issue: Bias/discrimination; Safety; Transparency

Documents 📃