Siderea, Sibylla Bostoniensis

Honestly, the most alarming thing about AI isn't so much AI itself as how utterly hell-bent humans are on using it for things it does a bad job at. H. sapiens is bound and determined to use this chisel as a screwdriver.

Case in point: the recent news story about a lawyer (or pair of lawyers - finger-pointing is underway) who submitted a filing in federal court which had actually been written by ChatGPT.

What is the one thing one can assume everyone has heard about LLMs? That they make completely bogus shit up, including inventing nonexistent citations.

It would be hard to overstate how unacceptable to a court it is for a lawyer to submit a legal argument which cites nonexistent case law. That's the kind of shit that can get a lawyer disbarred. It's a, uh, *career-limiting* move.

But apparently some lawyer actually did it: he took the output of a computer program famous for fabricating false citations and piped it directly into a court.

1/?

Siderea, Sibylla Bostoniensis

This is working out about as well as one would expect.

Now, in this case, the fool is hoist by his own petard; but, alas, the general tendency to use AI for things it is bad at is already racking up examples of it hurting third parties. The really obvious example is using AI in policing, where racist policing practice provides the training data for predictive models, which then faithfully capture that racism.

I have the strong suspicion that there's actually a whole bunch of undetected examples in the wild, already fucking things up for everyone.

For instance...

2/?

Siderea, Sibylla Bostoniensis

Most corporations these days use some sort of enterprise-class web-based software for handling job applications, and most or all of them offer some sort of applicant filtering, based on keywords or some such.

I'm pretty confident that at least some major vendors' products don't work right. That's based on two things.

First, at least in IT, the problem of highly qualified and desirable applicants being mysteriously filtered out by the software is so well known that it's standard advice for job seekers to use social contacts within the org to make an end run around the filters.

Second, I've debugged software.

3/?

Siderea, Sibylla Bostoniensis

I've worked on other sorts of enterprise-class software. I have a reasonable sense of what kinds of faults are harder and easier to debug, or even to detect in the first place.

And I am telling you, if such an application somehow got into a state where any resume with the word "Java" on it got filed to /dev/null, there is a zero percent chance anybody would ever figure it out.

"All these applicants are crap! The ad clearly states 5+ years Java experience, but not one applicant even has *any* Java experience! Can nobody read?! Damn it, the FAANGs must have hired all the Java developers."

The reason this kind of bug will never be found is that there's such a steep power differential between, first, the developers of the software and their customers, and then again between the operator-users of the software (the hiring company) and the end-users (the job applicants).

4/?

Siderea, Sibylla Bostoniensis

Nobody in a position to do anything is in a position to observe the signs something is wrong, and nobody in a position to observe the signs something is wrong is in a position to do something about it. The power differential guarantees that the people with the ability to do something will not listen to the people who have a problem to report. Indeed, they may have made it literally impossible for the people with the observations to contact them in any way.

I mean, if you thought you found a bug in a job application platform while applying for a job, how - where - would you even go about filing a bug report?

And all this is true even before introducing inscrutable, undebuggable AI to the platform. There is no way in which adding AI will make this better.

There's lots of places in our society in which computers lie like walls between two populations divided by a power imbalance. Those computer systems' faults are already largely undetectable for entirely social reasons.

5/?

Siderea, Sibylla Bostoniensis

(I've already written at length about how this phenomenon is fucking up medical records and consequently medical science. Series starts here: siderea.dreamwidth.org/1540620 - particularly see part 2, "Power's Deformation of Data".)

Bugs and other faults in such software are like mold growing in crevices that sunlight can't reach and from which stagnant water never wholly evaporates.

So we can either assume that in these systems, in which it's hard to impossible to detect faults, there are none to detect, or we have to assume there are probably undetected faults. Probably many, because they will accumulate.

Into these irremediable reservoirs of pestilential technological suckitude, AI is going to be - and presumably already has been - injected.

So that will be just ducky.

And that's why I say I suspect there are already systems screwing things up for everyone.

7/?

Siderea, Sibylla Bostoniensis

PS, I would be remiss if I didn't call out the Horizon Scandal, currently unfolding in the UK, as an example of how power relations × software breed bugs that then become social oppression and injustice.

For those of you who hadn't heard: it's recently come to light that the UK postal system [pro|per]secuted some 700 of its own postmasters for theft and fraud which, it has turned out, were entirely artifacts of bugs in the accounting software the post office had rolled out.

"More than 700 former Post Office staff were wrongly prosecuted for theft and false accounting in what has been described as "the most widespread miscarriage of justice in UK history"." (SkyNews)

"Some ended up in jail, others became bankrupt trying to repay money they did not owe, and a few even took their own lives before their names could be cleared." (SkyNews)

8/8

sarky

@siderea I think it really speaks to the truth of what you're saying that this scandal was first reported in parts of the UK press in 2009, but the post office was able to keep lying and covering it up for 20 years before it was really taken seriously

Captain Janegay 🫖

@siderea Horizon is a great example of the power imbalance situation you described earlier, too.

The customer support people for Horizon knew about the bugs for years but had no power to get the developers to investigate.

The postmasters (who are effectively franchisees) knew that the accounts Horizon produced were incorrect, but the system design meant that if they pushed for an investigation they would have to cease trading for weeks, which their business wouldn't survive.
