Mikołaj Hołysz

@GossiTheDog @Quinnypig Quoting from the document:

"Data will not leak across workspaces. For any model that will be used broadly across all of our customers, we do not build or train these models in such a way that they could learn, memorise, or be able to reproduce some part of Customer Data."

Please stop spreading misinformation

Euph0r14

@miki @GossiTheDog @Quinnypig

I would like to see what they mean by that.
Either they train with customer data or they don’t.

Currently it reads like “we are training with your data, but don’t worry, due to *magic* it won’t leak your information”

Darrin West

@Quinnypig @Euph0r14 @miki @GossiTheDog It might be possible to have a separate sandbox per customer. But they won’t have a separate one per Slack channel (too expensive). So the assumption breaks that private channels (like HR) or DMs will remain unavailable to other employees. Because LLMs leak their training data.

kfet

@obviousdwest @Quinnypig @Euph0r14 @miki @GossiTheDog This is just rage bait.

Not all AI is LLMs, and apparently Slack don't train LLMs on customer data.

toadjaune

@miki @GossiTheDog @Quinnypig They do gloss over a lot of details about how they achieve that, especially considering that it's a very hard problem, afaik pretty much an open research question.

I'm personally not going to risk any data I'm responsible for on such a light promise.

Euph0r14

@toadjaune @miki @GossiTheDog @Quinnypig it is an open research question.

The only thing I know which achieves actual privacy guarantees is differential privacy, and that makes LLMs commercially unusable.
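(For readers unfamiliar with differential privacy: the guarantee comes from adding noise calibrated to how much any one record can change a query's output. A minimal illustrative sketch of the classic Laplace mechanism is below; the function name and parameters are illustrative, not from any Slack system, and real DP training of models uses far more involved machinery such as DP-SGD.)

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release a noisy version of a numeric query result.

    Adding Laplace noise with scale = sensitivity / epsilon gives
    epsilon-differential privacy for a query whose output changes
    by at most `sensitivity` when one individual's record changes.
    Smaller epsilon = stronger privacy but noisier answers, which
    is the utility cost the post above alludes to.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) noise via inverse transform sampling.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

# Example: publish a count of 100 users with epsilon = 0.5.
noisy_count = laplace_mechanism(100.0, sensitivity=1.0, epsilon=0.5)
```

The noise scale grows as epsilon shrinks, so strong guarantees visibly degrade accuracy; this tension is why DP-trained LLMs tend to lose enough quality to be commercially impractical.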

Florian Streibelt (mutax)

@miki @GossiTheDog @Quinnypig Even if it stays within one workspace, DMs and private channels are a huge issue, don't you think? But of course, nobody would ever discuss sensitive personal matters via DMs, right?
Going with an opt-out is telling. If all of this stuff is so great, people would happily opt in, so why sneak it in through the back door?

Mikołaj Hołysz

@mutax @GossiTheDog @Quinnypig People never opt in to anything. You could tell most people that they can get a million dollars, no strings attached, and they'd just click whatever button strikes their fancy to close that popup as quickly as possible without even reading what it says. If it's an opt-in in the settings? Forget it.

Kevin Beaumont

@miki @Quinnypig where was my misinformation, out of interest? They’re training AI on enterprise customer data.

Aaron Rainbolt

@GossiTheDog @miki @Quinnypig I think because it sounded like they were going to train AI on sensitive, confidential data and then provide public access to that AI, which could leak secrets and whatnot between companies, whereas it sounds like it's specific to each company. Which... is still a nightmare: now secrets from management could potentially leak down to other employees.

Gabe Edwards

@GossiTheDog see threads.net/@aaronjmaurer/post. Slack is not training LLMs on messages or any other customer data. There are other kinds of ML models doing things like recommendations as described in slack.engineering/recommend-ap

Joe

@miki @GossiTheDog @Quinnypig Did you read the opt out language? It refers to the use of customer data to train global models. If any customer data at all are used to train global models, it is in there. They may make some effort to prevent leakage, but if your data are used to train a global model, clever prompting can cause it to be revealed. So I would suggest that any Slack users opt out, urgently.
