Gaggle student behavioural monitoring
Gaggle develops and sells software that monitors students.
Founded in 1999, Gaggle uses a combination of AI and human reviewers to monitor student email accounts, documents, and social media accounts for suspicious and harmful content, with the stated aim of preventing violence and student suicides.
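The combination of automated flagging and human review described above can be illustrated with a minimal sketch. The patterns, categories, and names below are hypothetical examples for illustration only and do not reflect Gaggle's actual implementation, which has not been published.

```python
# Illustrative sketch of an AI-plus-human-review flagging pipeline.
# All categories, patterns, and identifiers here are hypothetical.
import re
from dataclasses import dataclass

# Hypothetical categories and example patterns a monitoring tool might flag.
FLAG_PATTERNS = {
    "self_harm": re.compile(r"\b(hurt myself|end it all)\b", re.IGNORECASE),
    "violence": re.compile(r"\b(bring a gun|hurt (him|her|them))\b", re.IGNORECASE),
}

@dataclass
class Flag:
    student_id: str
    source: str                      # e.g. "email", "document"
    category: str
    excerpt: str
    needs_human_review: bool = True  # machine flags are only candidates

def scan_text(student_id: str, source: str, text: str) -> list[Flag]:
    """Return candidate flags; a human reviewer decides whether to escalate."""
    flags = []
    for category, pattern in FLAG_PATTERNS.items():
        match = pattern.search(text)
        if match:
            flags.append(Flag(student_id, source, category, match.group(0)))
    return flags

# Example: content from a (hypothetical) school account feeds the scanner,
# and anything flagged lands in a queue for human reviewers.
review_queue = scan_text("student-42", "document", "I just want to end it all")
for flag in review_queue:
    print(f"[{flag.category}] {flag.source}: '{flag.excerpt}' -> human review")
```

Simple keyword matching of this kind is one reason context can be misinterpreted, a risk noted in the section below.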
System
Gaggle
Operator: Minneapolis High Schools; Williamson County School District; Multiple
Developer: Gaggle
Country: USA
Sector: Education
Purpose: Monitor student behaviour
Technology: NLP/text analysis; Computer vision
Issue: Surveillance; Privacy; Accuracy/reliability; Appropriateness/need; Effectiveness/value
Transparency: Governance; Black box
Risks and harms
Gaggle has been associated with multiple risks and harms, including the misinterpretation of context, privacy violations, and psychological damage.
Transparency and accountability
Reports suggest Gaggle's systems have not been independently reviewed or audited.
Incidents and issues
An investigation by non-profit news site The 74 revealed that Gaggle's monitoring extended to almost every aspect of US students' digital lives, including after school hours and over weekends and holidays. The findings prompted student complaints and led mental health and privacy advocates to raise their concerns publicly.
The effectiveness of Gaggle's suite of monitoring services was also questioned.
Investigations, assessments, audits
News, commentary, analysis
Page info
Type: System
Published: September 2021
Last updated: June 2024