Roblox AI age verification system accused of misidentifying minors as adults
Occurred: January 2026
Page published: February 2026
Roblox’s AI-based age verification system was widely criticised for misclassifying users’ ages, undermining the safety of children and sparking outcry from parents, developers and child safety advocates.
In early 2026, Roblox rolled out a mandatory age verification system for users who want to access chat. The system uses facial age estimation from a selfie or, for some users, government ID verification, and the AI then assigns players to age bands that control whom they can talk to.
Reports from media outlets, users, and parents describe numerous cases where adults are classified as teens and where children, including users around 10–15 years old, are labeled as 18+ or 21+ and given access to older age groups.
Players have shared videos showing that simple tricks, such as drawing on fake facial hair or wrinkles or using photos of celebrities, dolls, or 3D avatars, can fool the system into granting an adult classification to clearly underage users.
At the same time, some legitimate users are locked out of chatting with peers or with younger family members because the system wrongly places them in a different age bracket, disrupting normal social use of the platform.
The system was introduced after a slew of lawsuits and investigations accusing Roblox of enabling predators to contact children. Early evidence suggests the company has failed to fix these risks and may be creating new ones.
Roblox’s reliance on automated verification without sufficient human oversight or a robust feedback loop for errors created a "veil of scale," allowing safety failures to propagate across its massive user base.
Furthermore, limited corporate transparency, including opaque appeal processes, left parents and children with little recourse when the AI system made incorrect determinations.
For those impacted, Roblox's failure can result in significant emotional and psychological distress, sexual and financial exploitation, and severe privacy invasions.
For society, the incident highlights the safety shortcomings inherent in Roblox's age verification system and in systems like it.
For policymakers, it underscores the need for meaningful legal reform, notably by holding platforms such as Roblox to the same due process standards as traditional legal systems, including clear transparency about how algorithmic decisions are made.
Developer: Paravision; Persona
Country: Global
Sector: Media/entertainment/sports/arts
Purpose: Verify age
Technology: Computer vision; Machine learning
Issue: Accountability; Privacy; Safety; Transparency
AIAAIC Repository ID: AIAAIC2191