Slack has been siphoning user data to train AI models without asking permission

zohaibahd

Staff
Facepalm: For organizations, the specter of internal data being used to train AI models raises serious concerns around security and compliance. Yet Slack has apparently been slurping up messages, files, and other data behind the scenes to train its AI features. Worse still, users were automatically opted into this arrangement without their knowledge or consent.

The revelation, which blew up online this week after a user called it out on X/Twitter, has plenty of people peeved that Slack didn't make this clearer from the jump. Corey Quinn, an executive at Duckbill Group, kicked up the fuss with an angry post asking "I'm sorry Slack, you're doing f**king WHAT with user DMs, messages, files, etc?"

Quinn was referring to an excerpt from Slack's Privacy Principles that reads, "To develop AI/ML models, our systems analyze Customer Data (e.g. messages, content, and files) submitted to Slack as well as Other Information (including usage information) as defined in our Privacy Policy and in your customer agreement."

Slack was quick to respond under the same post, confirming that it is indeed using customer content to train certain AI tools in the app. But it drew a line: that data isn't going toward its premium AI offering, which it bills as completely isolated from user information.

Still, most were caught off guard to learn that Slack's main AI features rely on an open tap into everyone's private conversations and files. Several users argued there should have been a prominent heads-up, letting people opt out before any data collection commenced.

The opt-out process itself is also a hassle. Individuals can't opt out on their own; an admin for the whole organization must request it by sending an email with a very specific subject line, which you can find in the post above.

Some heavy hitters weighed in, piling on the criticism. Meredith Whittaker, president of the privacy-focused Signal messaging app, threw some shade, saying "we don't collect your data in the first place, so we don't have anything to 'mine' for 'AI'." Ouch.

The backlash highlights rising tensions around AI and privacy as companies rush to one-up each other in developing smarter software.

Inconsistencies in Slack's policies aren't helping, either. One section says the company can't access underlying content when developing AI models. Another page marketing Slack's premium generative AI tools reads, "Work without worry. Your data is your data. We don't use it to train Slack AI. Everything runs on Slack's secure infrastructure, meeting the same compliance standards as Slack itself."

However, the user data-mining admission within the "privacy principles" seems to contradict these statements.

Over on Threads, a Slack engineer tried to clear things up, saying the privacy rules were "originally written about the search/recommendation work we've been doing for years prior to Slack AI," and admitting that they need an update.

Still, the bigger issue is the opt-in-by-default approach. While common in tech, it runs counter to the data privacy principle of giving people explicit choice over how their information gets used.

 
It needs to start somewhere - doesn't matter if it's here with Slack - these companies that use copyrighted or any content without the creator's permission should be fined into the ground (literally, so the company has to dissolve). Bring down a couple of these companies and this sh!t will stop real quick.
 
By no means am I part of Slack, but I had to log in to get things straight here and stop misinformation... This type of article causes paranoia among people and induces bias.

Slack's Privacy Policy hasn't changed since last year (2023). This post is from THIS year, May 16, 2024. Link here: https://slack.com/trust/privacy/privacy-policy#collect

Also, whether it's the writer of this article or Quinn, the Privacy Principles link does NOT reflect the statement in this article of 'To develop AI/ML models, our systems analyze Customer Data (e.g. messages, content, and files)...'. This is OVER-simplified.
Link here: https://slack.com/trust/data-management/privacy-principles

Last but not least, I bet Slack users were required to agree to this: https://slack.com/terms-of-service/user