VRChat users’ avatars make sexual and violent threats against minors
Occurred: December 2021
VRChat suffered from significant issues with abuse, harassment, and inappropriate content targeting minors, according to research conducted by the Center for Countering Digital Hate (CCDH).
The study found that users, including minors, were exposed to abusive behaviour approximately once every seven minutes on the platform. This included regular exposure to graphic sexual content and sexual harassment, as well as grooming to repeat racist slurs and extremist talking points. The platform was also found to contain threats of violence and content mocking sensitive topics such as the 9/11 terror attacks.
The researchers identified 100 potential violations of Meta's (formerly Facebook's) VR policies in 11.5 hours of recordings.
The research highlighted serious concerns about child safety in VRChat and the broader metaverse. Weak moderation and lax enforcement of community guidelines created an environment in which predatory behaviour and inappropriate content could flourish, potentially putting children at risk. Facebook/Meta was reportedly unresponsive to all reports of abusive content submitted by the researchers.
The findings underscored the need for better safeguards, moderation, and parental controls in virtual reality social spaces, especially those accessible to minors. The study served as a warning to parents about the potential dangers their children may face in these virtual environments.
System 🤖
Operator: VRChat
Developer: VRChat
Country: USA; Global
Sector: Media/entertainment/sports/arts
Purpose: Manage system safety
Technology: Machine learning; Safety management system
Issue: Safety