серафими многоꙮчитїи

1864: "On two occasions I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."

2024: "We trained our AI on the cultural output of a white supremacist society; here's our plan to get unbiased results using simple prompt engineering"

14 comments
OldGeek

@derwinmcgeary No, the data is based on class and the billionaires' manipulation of the idea of race. When we try to enumerate and classify everything with some finality, we end up knowing nothing.

Qybat

@derwinmcgeary I suppose in theory you could balance out the bias with careful manipulation of the AI, but it's going to take something a lot more sophisticated than adding a couple of words to the prompt.

Violet

@derwinmcgeary
Even the mere fact that "prompt engineering" is a thing now.....

I mean, when we said the 2nd most important skill as a programmer or admin is to be good at googling, that's not what we meant.

Nic

@derwinmcgeary problem is that the training input is subconsciously biassed and prejudiced. Most “diversity training” is about getting people to become aware of their subconscious prejudices and then (step two) to act to resist them. I don’t think this is possible with an AI trained on subconsciously prejudiced data. But I may be wrong...

серафими многоꙮчитїи

@nicbest this was really just a dunk on Google thinking they could just add the word "diverse" into queries as a fix for all of that biased input, but on a deeper level, the biased output of centuries of both systematic and unconscious discrimination isn't just the training data for some gizmos: we are living in it, and there is no trivial fix.

Having said that, someone did mention having a second bias-spotting AI in the training process as something that people do, which might get you somewhere, although I can't help imagining a room-sized compute cluster rolling its eyes at being sent to mandatory diversity training.

Nic

@derwinmcgeary As Marvin, in The Hitchhiker's Guide to the Galaxy, might have said: “Here am I, brain the size of a planet, and they make me....”

Tally P.

@derwinmcgeary When it comes to computer programming, I'd often heard the adage that computers are only as "smart" as their creators. And now with this AI business, I wonder if we're falling deep into the rabbit hole of the Dunning-Kruger effect shaping technology, too.

We're screwed.

Rowland Mosbergen

@derwinmcgeary I'm in hospital now with my son.

In the last 48 hours I've overheard:

- the Panadol that was written down as given was not given
- the cannula wasn't working, so the bolus of morphine that they charted wasn't actually given

How is AI going to handle the bad training data?

Denny Kozlov

@derwinmcgeary Back when I was first learning to code in the 1970s, we bandied about an acronym for this phenomenon: GIGO (Garbage In --> Garbage Out) 🤓
