Bèr Kessels 🐝 🚐 🍄 🌱

@i0null I've spent way too long pondering this quote lately.

I think the problem sits in the premise. Is it true that "a computer can never be held accountable"?
And in that, what is "a computer"? Is it just the physical metal and silicon, the entire product, the entity renting it out, the entity using it, or owning it?
And is it truly "never"? As in: is this something dictated by physics or the Order of Things, or can this change?

Like I said. Way too long.

12 comments
Hacker Memes

@berkes I sympathise with the epistemological uncertainty.

Kenneth

@i0null Smells like set theory to me. Or a really complicated flow chart. @berkes

Natanael ⚠️

@berkes @i0null If the computer misbehaves you can't punish it. And most of the time it misbehaves because somebody designed a component (or many) wrong.

Therefore you shouldn't rely on the computer to make decisions that can't be reversed or appealed by humans, because there needs to be a human higher in the hierarchy than the computer who can fix things.

Bèr Kessels 🐝 🚐 🍄 🌱

@Natanael_L @i0null seriously: why can you not "punish the computer"?

We can punish a car manufacturer when they forget to put in brakes. We can punish an app developer when their app is harming us. We can punish a cloud provider when they leak data. And so on.

I feel it's far more involved than you state. What if I don't use "a computer" but instead a lambda on AWS? Wouldn't it make sense that the software developer is responsible when it makes a wrong decision?

Natanael ⚠️

@berkes @i0null you have the whole chain from collecting requirements to implementation to the operator of the system; the responsibility lies with whoever contributed to the misbehavior (or opted not to fix it).

Whoever made the decision to put into production software or hardware which they should have known wasn't ready.

Bèr Kessels 🐝 🚐 🍄 🌱

@Natanael_L @i0null In that case "the computer" being a system, can be held responsible, no?

Natanael ⚠️

@berkes @i0null what will you do to hold it responsible? Yell at it? Power it down? Reprogram it?

And who will do so? And if you look closer, isn't the human who enacts this the actual person responsible for the machine's operation, since they can override it?

mystixa 🎢🎧

@berkes @Natanael_L @i0null One can't 'punish a car manufacturer'... one can add to the expense of manufacturing, but that is just another of the many costs businesses deal with in every transaction.

llywrch

@Natanael_L @berkes @i0null Only, humans higher in the hierarchy refuse to oversee the decisions algorithms make.

See almost any ban appeal in social media. Or complaint about bank handling charges.

Bèr Kessels 🐝 🚐 🍄 🌱

@llywrch @Natanael_L @i0null Certainly. But a counterexample is the software developers and managers who went to jail in the VW scandal. So there is accountability for those writing software. Not much, and not always. But not "never" either.

Haelus Novak

@berkes @i0null what of when/if AI is indistinguishable from humanity, even with so-called emotional response and distress? Is accountability linked to pain and consequence in that way? Maybe true for fax machines lol.

Jargoggles

@berkes @i0null
I like to think of it in more practical terms - a computer *shields* people from responsibility; it makes things fuzzier, harder to trace the decision back to whoever should be held accountable.

We've already seen this in action for a while now: "the algorithm messed up," "the algorithm did something unexpected," "the algorithm shouldn't have done that," the algorithm this, the algorithm that. It's a thought-terminating cliche and, unfortunately, it generally works on people.
