Christine Lemmer-Webber

Years ago I ran into Gerry Sussman and he said "I'm not interested in that. I want software that's accountable." dustycloud.org/blog/sussman-on

Cheekily he said: "If an AI driven car drives off the side of the road, I want to know why it did that. I could take the software developer to court, but I would much rather take the AI to court."

I've thought a lot about that since. He may have been being cheeky, but he's right: accountability is important to all relationships. We should be designing our systems on the assumption that accountability is critical. If you can't hold it accountable, don't act like it's this brilliant thing. Brilliant things are accountable for their actions.

Christine Lemmer-Webber

Accountability means many things. The most obvious form is legal, taking someone to court, and the joke about taking the car to court might obscure the real message here. (Especially since taking a car to court is currently meaningless, which is part of why it's cheeky. Someone might misinterpret that as "allow companies to not be held responsible", and that's the opposite of what's being advocated here.)

The point is that in order for society to function, we need feedback loops, and accountability is one of them. Children are taught the story of "the boy who cried wolf" early on. If you lie continuously, there's a consequence: people stop trusting you.

I haven't personally sued anyone, but I hold lots of people accountable in my life. If someone misbehaves and shows they can't correct their behavior, I don't need to keep them part of my social life.

But before we even get to that state, typically I might interrogate them: "why did you *do* that?" This might open a dialogue by which we can try to cooperatively rectify a problem.

Part of the problem is that even that is not possible with black-box neural networks. Leilani Gilpin's dissertation research began pointing toward how symbolic reasoning and neural networks might be integrated to add that kind of ability.

But at the very least, the *inability* to interrogate such a system is *not* an excuse, and certainly not an excuse for using such a system, introducing erroneous or dangerous information, and simply having no idea why. At that point, the accountability becomes *yours*... continued spreading of misinformation makes you, effectively, the one crying wolf. It's not the fault of anyone around you if they trust what you say less.
