TikTok face filter bubbles accused of reinforcing bias, dividing users
Occurred: February 2020
Page published: June 2025
TikTok’s face filter and content recommendation algorithms have come under fire for reinforcing societal biases, creating filter bubbles, and dividing users along racial, aesthetic, and ideological lines.
TikTok’s use of AI-driven face filters and personalised content feeds has led to the formation of “filter bubbles”—digital environments where users are predominantly exposed to content that aligns with their preferences, beliefs, or appearance.
Investigations and research show these bubbles not only reinforce confirmation bias but also amplify existing prejudices, including colourism and racism, by favouring certain beauty standards and marginalising others.
Harmful impacts include intellectual isolation, the perpetuation of unrealistic beauty ideals, damage to self-esteem (especially among young women), and the exclusion or objectification of marginalised groups.
This digital segregation risks deepening divisions within society and can lead to increased polarisation and reduced exposure to diverse perspectives.
These issues stem from the design of TikTok’s algorithms, which optimise for engagement by learning and reinforcing user interactions.
As users interact with content and filters that reflect their own preferences or societal biases, the algorithm narrows future recommendations, creating echo chambers and reinforcing existing prejudices.
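The narrowing dynamic described above can be illustrated with a toy simulation. This is a hypothetical sketch, not TikTok's actual system: the `recommend` function, the category labels, and the counting rule are all invented for illustration, standing in for the learned ranking models a real platform would use.

```python
def recommend(history, catalogue, k=3):
    # Engagement-optimised ranking (hypothetical): favour whichever
    # category the user has engaged with most so far.
    top = max(set(history), key=history.count)
    return [item for item in catalogue if item[1] == top][:k]

# Toy catalogue of (item_id, category) pairs.
catalogue = list(enumerate(["a", "a", "a", "b", "b", "c"]))

history = ["a", "b", "c", "a"]  # the user starts with mixed interests
for _ in range(5):
    for _item_id, category in recommend(history, catalogue):
        history.append(category)  # the user engages with whatever is shown

# After a few rounds, recommendations collapse to a single category.
print(sorted(set(history[-10:])))  # → ['a']
```

Even a slight initial preference (two "a" interactions versus one each of "b" and "c") is amplified until the feed contains nothing else, which is the echo-chamber effect the reporting describes.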
Additionally, face-altering filters often embody and perpetuate dominant beauty standards, which are frequently racially biased or colourist, further entrenching social divides.
The algorithms’ lack of transparency and their focus on maximising user engagement over diversity contribute to these outcomes.
For individuals, especially those from marginalised backgrounds, this means increased exposure to content that may degrade or exclude them, potentially harming mental health and self-worth.
For society, the widespread use of such algorithms risks entrenching social divisions, reducing empathy, and limiting the diversity of viewpoints in public discourse.
The persistence of these filter bubbles challenges efforts to foster inclusion and critical thinking online, highlighting the need for greater transparency, algorithmic accountability, and initiatives to diversify the content and beauty standards promoted on digital platforms.
Recommender system
A recommender system (RecSys), or recommendation system (with system sometimes replaced by terms such as platform, engine, or algorithm), and sometimes simply called "the algorithm",[1] is a subclass of information filtering system that provides suggestions for items most pertinent to a particular user.
Source: Wikipedia
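To make the definition concrete, here is a minimal sketch of one common recommender technique, content-based filtering with cosine similarity. All names, items, and feature vectors below are invented for illustration; production systems like TikTok's use far richer learned representations.

```python
from math import sqrt

def cosine(u, v):
    # Cosine similarity between two feature vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def suggest(user_vector, items, k=2):
    # Rank items by similarity to the user's preference vector and
    # return the k most pertinent ones.
    ranked = sorted(items, key=lambda it: cosine(user_vector, it[1]), reverse=True)
    return [name for name, _vec in ranked[:k]]

# Hypothetical (dance, comedy, news) affinity vectors per item.
items = [
    ("dance_clip", (1.0, 0.1, 0.0)),
    ("comedy_skit", (0.1, 1.0, 0.0)),
    ("news_update", (0.0, 0.1, 1.0)),
]

user = (0.9, 0.2, 0.0)  # a user who mostly watches dance content
print(suggest(user, items))  # → ['dance_clip', 'comedy_skit']
```

Because suggestions are always those most similar to past behaviour, the same mechanism that makes recommendations "pertinent" is what produces the filter bubbles discussed above.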
For You
Developer: TikTok
Country: Global
Sector: Media/entertainment/sports/arts
Purpose: Recommend content
Technology: Recommendation system
Issue: Bias/discrimination; Safety
https://www.buzzfeednews.com/article/laurenstrapagiel/tiktok-algorithim-race-bias
https://www.vox.com/recode/2020/2/25/21152585/tiktok-recommendations-profile-look-alike
https://www.inputmag.com/culture/tiktok-lifts-the-cover-off-its-algorithm-data-practices
https://www.dailydot.com/irl/tiktok-fat-lgbtq-disabled-creators/
https://www.vox.com/recode/2020/6/23/21296056/tiktok-foryou-algorithm-explained-facebook-news-feed
https://slate.com/technology/2019/12/tiktok-disabled-users-videos-suppressed.html
https://www.telegraph.co.uk/technology/2020/02/28/tiktok-recommendations-may-based-gender-race/