Catherine Berry

@Gargron @darylgibson

A college professor of mine back in 1983 said "'AI' is what we call software we don't know how to write yet." I think this neatly captures the problem we have talking about current "AI". In 2000, nobody knew how to write software that would drive cars, write poetry, play grandmaster-level chess, or summarize text, so those were considered to be examples of what AI might accomplish. Now we know how to write systems that do those things, so they are no longer AI.

Magnus Ahltorp

@isomeme @Gargron @darylgibson I agree with most of what you say, but in 2000 we did know how to write software that could play grandmaster-level chess and summarise text. And we still don't know how to write software that drives cars or writes poetry.

Kydia Music

@ahltorp @isomeme @Gargron @darylgibson well, not *good* poetry, anyway. 😉

I weep for humanity that so many people have been impressed with the level of “art” these LLMs and generative art (pixel plagiarism) machines spit out. This is what happens when we fail to properly teach the humanities in school.

Catherine Berry

@KydiaMusic @ahltorp @Gargron @darylgibson

AIs aren't producing great art (yet), but they're easily outperforming the average human. I've seen a few AI-generated works that were quite compelling. As one of my favorite proverbs puts it, the amazing thing about a dancing bear is not how *well* it dances, but that it dances at all.

Kydia Music

@isomeme @ahltorp @Gargron @darylgibson true, but great art is partly defined by the fact that the average person *can’t* do it. Innovation and originality are often other factors that elevate art to greatness. And of course, meaning, motivation, and inspiration, which require sentience—and the ability to move others to feel something, which requires empathy for both the creator and the receiver.
There are lots of things the average person can’t do as well as a machine, like math calculations.
