Matt Hodges

@evan @Gargron I think it's historically incorrect to say that "technically calling it AI is buying into the marketing". Yes, marketing is capitalizing on it! But the nomenclature matches my CS education from the late 2000s and it matches 70 years of how "AI" is used in research and literature. The recent obsession with asserting "theory of mind" or "intentions" or "originality" or "real intelligence" seems, well, recent.

Evan Prodromou

@MattHodges @Gargron I think there are a lot of things GPT4 is bad at. It's not very good at simple arithmetic. It is bad at geographical information -- which places are near others, or part of others. It also does a bad job at string manipulation -- words that start with a particular letter, or words that are anagrams of other words. I don't think you have to resort to mysticism to say why it is not yet human-equivalent. But that doesn't mean it's not intelligent.

Matt Hodges

@evan

Yes, and...!

> It's not very good at simple arithmetic.

This is a recurrent example that is starting to illustrate the difference between bare LLMs and the products built on top of them. E.g., ChatGPT is a product built on top of a system. That system has a lot of components. One of those components is an LLM. Another is a Python interpreter. LLMs can write Python quite well, and Python can do math quite well.

Seems like a pretty intelligent system to me!

[Image: A screenshot from ChatGPT showing that it uses Python to do math.]
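A minimal sketch of that delegation pattern, assuming a hypothetical `ask_llm` helper standing in for whatever chat-completion call the product makes (this is not ChatGPT's actual implementation, just the shape of the idea): the model writes Python, and a separate interpreter component runs it.

```python
# Sketch of the "LLM + interpreter" pattern described above.
# The model is not asked to compute the answer itself; it is asked to
# write Python, and the interpreter component does the arithmetic.

def ask_llm(prompt: str) -> str:
    """Hypothetical LLM call; imagine this hitting a chat-completion API."""
    # For a question like "What is 1234 * 5678?", a capable model would
    # return code along these lines rather than doing the math itself.
    return "result = 1234 * 5678"

def solve_with_tool(question: str) -> str:
    code = ask_llm(f"Write Python that computes the answer to: {question}")
    namespace: dict = {}
    exec(code, namespace)  # the interpreter, not the LLM, does the math
    return str(namespace["result"])

print(solve_with_tool("What is 1234 * 5678?"))  # -> 7006652
```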