Gaggle student behavioural monitoring
Gaggle develops and sells software that monitors students.
Founded in 1999, Gaggle uses a combination of AI and human reviewers to scan student email accounts, documents, and social media accounts for suspicious and harmful content, with the stated aim of preventing violence and student suicides.
System
System info
Operator: Minneapolis High Schools; Williamson County School District; Multiple
Developer: Gaggle
Country: USA
Sector: Education
Purpose: Monitor student behaviour
Technology: NLP/text analysis; Computer vision
Issue: Surveillance; Privacy; Accuracy/reliability; Appropriateness/need; Effectiveness/value
Transparency: Governance; Black box
Risks and harms
Gaggle has been associated with multiple risks and harms, including the misinterpretation of context, privacy violations, and psychological damage.
Transparency and accountability
Reports suggest Gaggle's systems have not been independently reviewed or audited.
Incidents and issues
An investigation by non-profit news site The 74 found that Gaggle's monitoring extended to almost every aspect of US students' lives, including after school hours and over weekends and holidays. The findings resulted in student complaints and prompted mental health and privacy advocates to raise their concerns publicly.
The effectiveness of Gaggle's suite of monitoring services was also questioned.