Meta/Facebook 'aware of' Instagram impact on teen girls' mental health

Occurred: September 2021

Facebook knew for years that teenagers blamed Instagram for increased levels of anxiety and depression, yet made little effort to address the issue and played it down in public, according to leaked documents.

The social network's internal research, presentations and emails, obtained by the Wall Street Journal, revealed that Facebook (since renamed Meta) knew its photo-sharing app Instagram was having harmful effects on many of its young users, particularly teenage girls. Over 40 percent of Instagram's users are 22 years old or younger.

The documents spurred US Senators Richard Blumenthal and Marsha Blackburn to say they 'will use every resource at [their] disposal to investigate what Facebook knew and when they knew it.'

Facebook CEO Mark Zuckerberg responded by saying he did not believe 'the research is conclusive' on the extent to which social media contributes to declining mental health among children.

According to Forbes, the social network has refused numerous requests from members of Congress to share its research on children's mental health, arguing that the research is 'kept confidential to promote frank and open dialogue and brainstorming internally.'

Operator: Meta/Facebook  
Developer: Meta/Facebook
Country: USA
Sector: Technology
Purpose: Moderate content
Technology: Content moderation system
Issue: Ethics; Hypocrisy  
Transparency: Governance; Black box; Marketing; Legal

Page info
Type: Issue
Published: September 2021
Last updated: June 2024