One of the biggest controversies surrounding AI models has been the training phase, where tech giants are accused of scraping data online without consent.
In keeping with that theme, it is troubling to learn that Slack has confirmed it trains its machine learning models on users' personal data. This happens without explicit permission: everyone is opted in by default.
The company uses machine learning to power several in-app features, including search results, channel and emoji recommendations, and autocomplete. The concern is that users' messages and files are being fed into these systems without consent being obtained first.
As daunting as it may sound, the data keeps being collected whether users like it or not. Customers are never asked whether they want to opt in; it simply happens automatically.
As one can imagine, people are not taking the news well, especially those who feel they should have been given a heads-up from the start.
I'm sorry Slack, you're doing fucking WHAT with user DMs, messages, files, etc? I'm positive I'm not reading this correctly. pic.twitter.com/6ORZNS2RxC
— Corey Quinn (@QuinnyPig) May 16, 2024
Slack admits that users were never made aware of the practice, yet it is not changing its training policy, and that is getting on many users' nerves. Data collection of this kind has been debated before, and many agree it is not the right approach because it strips users of privacy.
One top executive from the Duckbill Group argued that Slack's latest AI training policy is concerning for obvious reasons: you simply cannot train a model on private data that users never released or consented to sharing.
DMs, files, messages, and more: the sheer scope of what is being used is what makes it so infuriating.
Critics also directed a host of posts at Slack demanding greater clarification, since the matter looked shady from the start. That is when the company finally broke its silence on what was really taking place.
Slack says this has been part of its policy from day one: the Salesforce-owned firm uses customers' content to train the tools inside the app, and its default settings mean users are not given an upfront chance to opt out. At the same time, it confirmed that customer data is not used to train the paid version of its generative AI offering.
It further clarified that workspace administrators or companies can submit a request to have their organization's messages and other content excluded from training datasets.
On X, criticism kept pouring in from users and experts who felt the practice was simply wrong, and possibly unlawful, to begin with. The growing backlash suggests this privacy mess could hurt the company in the long run as users head for the exit.
Meanwhile, the company's representatives are standing firm on why individual users cannot opt out of this policy, even though many see that as their right.
Slack went on to add that it does not build or train AI models in a way that lets them learn, memorize, or reproduce customer data, and that customers do have the option to opt out of data training for certain models.
Adding to the confusion is the company's AI page, which tells users they can work without worry because their data belongs solely to them and is not used to train AI.
But while Slack maintains that its paid AI tools are not trained this way, that does not mean its other models are exempt. Those models are still being fed users' data, so the blanket claim that user data belongs to the user does not sit quite right.
When you think about it, Slack is not the only one denying users the chance to opt out. Squarespace did something similar in 2023, rolling out features involving models trained on data scraped from the web for AI-powered crawling tools.
Image: Unsplash