Eugen Rochko

It’s hard not to say “AI” when everybody else does too, but technically calling it AI is buying into the marketing. There is no intelligence there, and it’s not going to become sentient. It’s just statistics, and the danger these systems pose is primarily through the false sense of skill or fitness for purpose that people ascribe to them.

350 comments
Marc Khoury

@Gargron Do you have an example of what being intelligent *would* mean?

I have a hard time understanding the "it's not intelligence because we can tell how it happens" argument. Or are you making a different one?

The applications of this tech can replace human work which currently requires intelligence. That seems like artificial intelligence to me.

What is your argument other than the usual receding definition of artificial intelligence as "computation that we can't explain"?

Richard W. Woodley NO THREADS 🇨🇦🌹🚴‍♂️📷 🗺️

@Gargron
AI is based on harvesting a lot of data; if it is correct data, it is knowledge, but it is still not intelligence. Quiz shows have made the general public confuse knowledge with intelligence.

Kevin Jardine

@Gargron LLMs are more sophisticated versions of the old ELIZA chatbot. At least ELIZA's creator Joseph Weizenbaum did not attempt to overhype what it was.

en.m.wikipedia.org/wiki/ELIZA
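For context on the comparison: ELIZA worked by matching keywords against hand-written rules and echoing the user's own words back through response templates, with no model of meaning at all. A rough sketch of that idea in Python (the rules below are invented for illustration, not Weizenbaum's original DOCTOR script):

    import re

    # A handful of toy keyword rules in the spirit of ELIZA (illustrative only).
    RULES = [
        (r"\bi need (.+)", "Why do you need {0}?"),
        (r"\bi am (.+)", "How long have you been {0}?"),
        (r"\bmy (mother|father)\b", "Tell me more about your {0}."),
    ]

    def respond(utterance: str) -> str:
        for pattern, template in RULES:
            match = re.search(pattern, utterance, re.IGNORECASE)
            if match:
                return template.format(*match.groups())
        return "Please go on."  # fallback when no keyword matches

    # Example: prints "How long have you been worried about work?"
    print(respond("I am worried about work"))

The real ELIZA also reflected pronouns ("my" to "your"), but the principle is the same: surface pattern matching, not understanding.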

DELETED

@kevinjardine @Gargron

That's very neat, Mr. Jardine.

Well, desperate times call for desperate marketing. Just look at what happened to the delivery industry: restaurant delivery existed for years, but because of the app boom, it's almost like restaurant delivery never happened before. The same goes for many apps that overlap existing services, just with an app and heavy data collection.

It's almost like, dare I say, this is another outlet for data collection. Which it most definitely is.

CoolBlenderKitten

@Gargron
Not the only danger: it's also turning human work and creativity into a seemingly unnecessary commodity.
The effect is already easy to see and to guess at: it will make things worse for all of us, whether you lose your job, need support in any form, like to enjoy art, or....

javier_paredes

@Gargron
Large Language Models are to intelligence what elevators are to teleportation.

Mark Crowley

@Gargron It's not the best term for sure, but there has clearly been increasingly complex reasoning of different kinds emerging over the past 10 years or so.

Marcus

@Gargron I've been telling people this forever and they don't want to hear it. They're so enamored with the science fiction of "AI" that they don't want to believe that these companies will lie for money.

King of Red Lions

@Gargron
Are you saying the "revolutionary artificial intelligence technology" in this toothbrush is just marketing? That there isn't an Nvidia GPU inside analyzing my teeth in realtime? oralb.com/en-us/products/elect

Charles J Gervasi

@Gargron I think AI is a real discipline, but now people are calling any program they want to hype "AI".

Sean Eric Fagan

@Gargron I try to avoid it when talking about LLMs.

Lord Caramac the Clueless, KSC

@Gargron Well, those ML models can do a few things which are part of the biological phenomenon known as intelligence--pattern recognition and pattern synthesis. ML models are basically maps of the billion-dimensional vector space of the probability of the occurrence of patterns. ML is basically an analogy of rote memorisation, just repeating information until it burns itself into your memory. People tend to think of me as much more intelligent than I really am just because I can recite a huge volume of information from memory. ML models by themselves are not intelligent at all, but they can mimic human intelligence by perfectly reproducing the patterns they have seen in the data produced by humans with human intelligence.

However, if we ever build anything resembling real, actual AI, even if it never becomes more intelligent than a mouse, ML models will be a large part of it. But in order to become really intelligent, a system needs to be an autonomous agent in its environment, being subject to environmental pressures, having a memory and a model of self within the environment, and being able to constantly learn from new data during each interaction with the environment, but also from internal simulations (dreams, fantasies).

And this is also why nobody is really seriously working on any humanlike AI at this moment: in order to let it become intelligent, you need to give the system autonomy, turn it into an autonomous agent, and let it learn completely automatically and unsupervised. You know what happens to unsupervised self-learning chatbots? 4chan finds them and teaches them how to Nazi.

All experiments which really have a chance of creating actual intelligence IMHO need some kind of robotic body, although a really good game world with a good physics engine might work as well. You cannot have intelligence in a box; all you get is a box that can analyse data for you or make synthetic data from noise. Intelligence needs a body, intelligence needs a world in which to experience itself interacting with other intelligences. All the image models and language models are important for it to be able to see and hear, but that's just the preprocessing before the real intelligence begins.
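To make the "map of pattern probabilities" point concrete, here is a toy bigram model in Python: it merely counts which word follows which in its training text and then samples from those counts. Real language models are enormously larger and use learned representations rather than raw counts, but the "statistics of previously seen patterns" character is the same (the corpus below is made up purely for illustration):

    import random
    from collections import Counter, defaultdict

    # Tiny made-up corpus; real models train on vastly more text.
    corpus = "the cat sat on the mat and the dog sat on the rug".split()

    # The "probability map": for each word, count which words follow it.
    follows = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev][nxt] += 1

    def generate(start: str, length: int = 8) -> str:
        word, out = start, [start]
        for _ in range(length):
            options = follows.get(word)
            if not options:
                break  # dead end: no observed successor for this word
            words, counts = zip(*options.items())
            word = random.choices(words, weights=counts)[0]  # sample by observed frequency
            out.append(word)
        return " ".join(out)

    # Example: might print "the cat sat on the rug"
    print(generate("the"))

Everything the generator "knows" comes from frequencies in the data it was shown; swap the corpus and the behaviour changes accordingly.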

Ampelios

@Gargron Yes. Anyone who has ever complained about autocorrect should know better than to think "AI" is anything but more of the same.

RyeNCode

@Gargron I agree,
I've been using the words Statistical Model, but in my head it's much closer to "Sparkling Weighted Random Number Generator"

Canadian Crone

@Gargron AI has no intelligence. It’s a processing system.

Daan

@Gargron I think you’re pulling a giant “No true Scotsman” fallacy here. I agree that ChatGPT is not going to become sentient (anytime soon). But I think it’s reductive and an oversimplification to claim that there is no intelligence there, not least because we don’t even know what intelligence is exactly.

What your Toot is also doing - as are the myriad of ignorant replies - is severely underestimating the usefulness of ChatGPT and related technologies.

Daan

@Gargron It seems to me that people claiming that ChatGPT is “just parroting” or equating it to “data center level auto complete” have not used it in anger and are downplaying the productivity such tools can bring.

Interestingly, many of the replies in this thread show a severe lack of human intelligence 😂

jordan

@Gargron I've come across "virtual intelligence" and think I like that terminology more. The program is used to generate something which appears to have qualities of "intelligence" but is only a simulation of such. In a similar vein, I think "machine learning" is also a little too anthropomorphic.

Matthew Merkovich :clippy:

@Gargron Which is why I only refer to it as "large language models" or less often "machine learning," when it applies to VFX work in film and TV.

Jiří Fiala Total Landscaping

@Gargron I'm absolutely not an AI fanboi but we may have to reframe the conversation. Who says humans aren't statistics machines as well? It's not like earth shattering thoughts just pop into our minds without massive amounts of training, observation and mimicry.

BitcoDavid #FuckTrump

@Gargron Remember that "beeyonaire" who owns the Republican judge? Harlan Crow? Well, being a lifelong Dickens fan, I realized Dickens himself couldn't come up with a better name. So I fired up my AIs and asked them to write a "Dickens" paragraph describing the character. They failed. So now I say, artificial--perhaps, but intelligence--not so much.

DELETED

@Gargron

** S E A S O N ' S G R E E T I N G S **
~ Admiral Founder Rochko! ~

I think "auto-data" is better coining, or in general, like ChatGPT, a " generated data interplay (data-play)" might be the proper minting of the input exchange.

Good points. Furry Trunk Salute!

Pine

@Gargron Oh, the irony. Aren't you just parroting Emily Bender? A chatbot could have said this. How can we know that what you say is more than just your cognitive statistics?

1. There is intelligence in current LLM based AI. A different sort, but still intelligence. Language competence without comprehension.

2. Most of what people say is pretty much at the level of parroting.

3. What you say is half true, half misleading.

Several people on this thread have mentioned this sort of idea.

Son of a Sailor

@Gargron I wish these bots would stop being personified. Statistical databases don't have personalities.

marty

@Gargron
i say LMA (large model algorithm) and if people get confused by it they will ask, opening the door to me explaining how "ai" really is just a large model algorithm.
if they don't understand it, refuse to acknowledge that i am talking about the same thing, and keep talking about "ai", then i know enough about their technical competence on the topic.

cultdev

@Gargron probably the single keenest and clearest articulation of this point that i’ve seen yet. no wonder you’re the mastodon guy

Michael Honey

@Gargron sure, that’s true of 2023-era LLMs. But what about their descendants? We’re very early in this journey

Cris Neubauer

@Gargron I'd argue that it is artificial intelligence, because intelligence doesn't necessarily have to include sentience or sapience. Sure, they may go hand in hand, but intelligence without sentience or sapience can only reproduce or restructure information; it cannot create or understand it.

DELETED

@Gargron I had a similar opinion for a while, and still prefer ML, LLM, etc. But as long as Artificial General Intelligence (AGI) and Artificial Sentience are considered separate things … I’m becoming somewhat ok with the term AI as a catch-all for some of this new tech.

David W. Jones

@Gargron
I always read "AI" in the news as "Artificial Idiocy." Although the terms "Augmented Idiocy" or "Amplified Idiocy" just came to mind.
@etherdiver

Andrew C.A. Elliott

@Gargron

I've taken to calling them 'text/code completers' and what they do as 'text/code completion'.

Botahamec

@Gargron There's an old saying that if you don't know how something works, it's AI; once you know how it works, it's an algorithm. For what it's worth, the algorithms taught in an Intro to AI class are usually much less complicated than an LLM.

moondog548

@Gargron thank you dear god thank you.

It's like we need to be spending just as much time debunking the "ai" label as we do explaining all the problems with what they're using it for. 😫

Luisf

@Gargron

Plagiarism engines.

Sparkling autocorrects.

"Machine learning"

Statistical models. Fuck 'em.

Michael Edwards

@Gargron

On these issues I miss the wise words of the Ada Lovelace Foundation. Did they get lost in the Great Migrations?

kfet

@Gargron Oh my, what a deafening echo chamber! AI is not AGI. LLMs are not AGI, and nobody paying attention claims or believes they are. AI is very real, it is pretty much everywhere, it is incredibly useful, and at the same time dangerous, today.

Cavyherd

@Gargron

Maybe co-opt the acronym. Instead of calling it "artificial intelligence," insist on calling it something like "applied inference."

arjankroonen

@Gargron But to be fair, it is doing a better job at pretending to be intelligent than tons of humans voting nowadays… so I’m not sure I really care about the broad interpretation of “I” being used.

Paulo

@Gargron always thought a better term would be "extreme fitting models"

mosher

@Gargron I recommend we start calling AI "Behavioral Statistics" as they model statistically likely behavior. Plus the abbreviation is easily understood, even by someone who doesn't know what it stands for.

Rafi C

@Gargron I just went to a major US marketing conference in September; every single vendor was touting the 'AI-fication' of their products. It's ridiculous because we are the customers; we know they've just rebranded the ML stuff they were already doing!

Christian Pietsch 🍑

Exactly, @Gargron! That's why I came up with Weizenbaum's Law:

“Whenever something is called Artificial Intelligence, someone is trying to fool you.”

digitalcourage.social/@chpiets

#AI #ArtificialIntelligence #WeizenbaumsLaw #dontCallItAI

Reiner Jung

@Gargron Essentially it is often used to bullshit, so AB for "artificial bullshitting" would be much more fitting, or "automatic bullshitting". I think the latter is the best option.

Robert M. Kelly

@Gargron Disagree. Calling it "artificial intelligence" is absolutely justified. It's only when the qualifier "artificial" is dropped, unknowingly or uncaringly, that interpretive problems arise.

Troed Sångberg

@Gargron You are also "just statistics" - according to all the science we have on the subject.

Felix Denbratt

@Gargron well put!
What we have today is limited to Machine Learning

Hoi_Pollois

@Gargron Okay. What do you recommend we call it?

Tomislav Hengl

@Gargron How our brain works is also statistics in a way, but yes, biological beings are still orders of magnitude more complex than some 2023 supercomputer. FYI, we wrote a little blog post that reviews the key conceptions and misconceptions: medium.com/mlearning-ai/ai-tec

abracadabra holmes

@Gargron word. It's not AI. It's software written to steal your *everything*
