Slack forces users to opt out of AI model training

Occurred: May 2024

Slack came under fire for using customers' messages and files to train its AI models by default, without explicit user consent.

Workplace messaging company Slack was found to have begun training its AI models on its customers' data and information without telling them or seeking their consent.

In an undated statement on privacy principles quietly posted to its website overnight, Slack stated that users wishing to opt out of its AI training programme must do so by emailing the company, raising concerns about the use and misuse of potentially sensitive corporate and personal information.

In response, Slack said that user data would not be shared with third-party providers for training purposes and that customer data never leaves the platform. The company also updated its privacy principles in an attempt to better explain the relationship between customer data and generative AI in Slack.

Developer: Salesforce/Slack
Country: Global
Sector: Multiple
Purpose: Predict search results; Recommend channels
Technology: Machine learning
Issue: Confidentiality; Privacy; Security
Transparency: Governance; Marketing