Scott Jenson

@molly0xfff No question this could be a violation of privacy. But that violation occurs only if the data is removed from the device. There are amazing potential scenarios with this data, with the very important caveat that it doesn't escape the machine. However I completely understand that people won't trust that this caveat holds. It's just unfortunate that we can't have a discussion of how much potential is indeed possible.

But knee-jerk reactions will never allow this discussion to happen.

John Socks

@scottjenson @molly0xfff A likely scenario would be that it happens both ways. Perhaps first someone "innovates" sucking up too much personal data, but certainly later there will be an effort to do it all in a contained way. Both may be "successful."

The cautionary lesson of the IoT age might be that most people will give up that privacy readily.

Entitas

@scottjenson @molly0xfff Thank god there's no way for data on my machine to leave my machine without me knowing about it. There are no means people can access my machine without my consent. This service is completely safe and absolutely nothing can go wrong with an unfiltered log of my passwords and financial data being stored on my machine. Think of the potential.

Scott Jenson

@TheEntity @molly0xfff 😀 I wanted to make that point as well but felt it wouldn't fit

Kent Pitman

@TheEntity @scottjenson @molly0xfff

The protection is that it's bad with numbers. So people will ask it what your finances are, but it will confabulate different numbers.

OK, I'm just kidding, and it's pretty darned dark humor, but it raises another key issue: We sometimes analyze the risk today by saying "well, there's no way to make use of that now" but then someone makes an unrelated change that means there is a way, and then we don't go back and re-review the things we've let through.

So if we did stupidly rely on how bad these LLMs are with numbers and decide it was OK to see our financials, and then someone fixed its math, we'd have a danger we'd already let through. And it might not be obvious that the thing creating the danger was "We fixed math."

Entitas

@kentpitman I read this three times and have no idea what you're saying or how it's a reply to me

Kent Pitman

@TheEntity

Probably because I thought your response was sarcastic, not literal. :)

Most people are in the opposite position, where there are tons of ways their data can leave their machine and they never know.

Given current tech, even if an LLM were armed with my financial data, the present state of the art might not be able to actually regurgitate it. It seems terrible with math. But that can't hold.

I hope that clarifies it. If not, well, just ignore me. :)

Entitas

@kentpitman Here's a crazy idea: What if it wasn't an LLM getting your financial data off the Windows 11 Spyware App, but just some guy, who can read it and understand it and know how to use it to steal your shit.

Kent Pitman

@TheEntity

Oh definitely. So much more to be said on nuance but another time.

Salt Fish

@scottjenson @molly0xfff

Name a product where people had to be forced to use it to realize how amazing it was. The built in assumption is that product designers are these amazing geniuses and the end users are dummies. In fact AI products are terrible and people don't want to use them because they don't work as advertised.

Simon Lucy

@scottjenson @molly0xfff

'Remove'? Reading is sufficient to copy. How it's exfiltrated is left as an exercise for the reader. Rsync would be my favourite, but hiding it in noise is fine.

Ian Myers

@simon_lucy @scottjenson

I once had to point out to a corporate lawyer and a senior exec that if data hosted in Europe could be read on a screen in India then the data had been exported.

Simon Lucy

@TweekySenior @scottjenson

It still has to be explained.
And now a moment of silence whilst we contemplate MS Recall being sent over VDI and MS Recall on the client device capturing that every 5 seconds...

The Spoonless Kitchen

@scottjenson @molly0xfff IMO if you're trusting your computer to remember what you did, you've already lost The Game. See?

borstradamus

@scottjenson @molly0xfff what would those "amazing scenarios" look like?

Mr. E. Grey Seale

@borstradamus @scottjenson @molly0xfff
Github Runners. These specific machines are being pitched as AI compute clusters. Not end user devices.
Corporate security is quickly moving to zero-trust, and that leaves the node in the employee's possession the least trusted. Staff will not be submitting binaries to repos. Compute will be done in the cloud, at arm's length from anything on your desktop.
