Arina Artemis :nonbinary_flag:

@quantumg @devopscats What I mean is, they don't understand that there's a boat and a goat. The words aren't translated into concepts the way they would be in a human mind.

Trent Waddington

@arina @devopscats we don't have any idea how a human mind constructs concepts, let alone any particular human mind. LLMs do indeed translate words into "concepts", and we know this because their internals can be interrogated. If we give one a few different sentences about goats, there will be similar vectors in the different computations. What's more, we can intervene: change the goat vectors to look more like cat vectors, and the output will be cat-related. Sentences about goats eating socks will become sentences about cats eating mice, even though we provided it nothing about mice, because the more likely relationship is encoded in the model.
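
A minimal sketch of the kind of probing and intervention described above, assuming GPT-2 via Hugging Face transformers. The layer index, prompts, and steering scale are arbitrary illustrative choices, not anything specified in the thread; it uses a layer's mean hidden state as a crude "concept vector", checks that two unrelated goat sentences yield more similar vectors than a cat sentence, then adds a cat-minus-goat direction to the residual stream during generation:

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

LAYER = 6  # which block's residual stream to probe and steer; an arbitrary choice

def concept_vector(text):
    """Mean hidden state at LAYER: a crude stand-in for the sentence's 'concept'."""
    ids = tok(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**ids, output_hidden_states=True)
    return out.hidden_states[LAYER].mean(dim=1).squeeze(0)

# "Similar vectors in the different computations": two unrelated goat sentences
# should sit closer to each other than to a cat sentence.
g1 = concept_vector("The goat ate a sock.")
g2 = concept_vector("A farmer fed his goat at dawn.")
c1 = concept_vector("The cat chased a mouse.")
cos = torch.nn.functional.cosine_similarity
print("goat~goat:", cos(g1, g2, dim=0).item())
print("goat~cat :", cos(g1, c1, dim=0).item())

# Intervention: push the residual stream along a cat-minus-goat direction
# while generating, and watch the continuation drift cat-ward.
steer = c1 - g1

def hook(module, inputs, output):
    # GPT-2 blocks return a tuple; output[0] is the residual-stream tensor.
    hidden = output[0] + 4.0 * steer  # 4.0 is a free scale parameter
    return (hidden,) + output[1:]

h = model.transformer.h[LAYER].register_forward_hook(hook)
ids = tok("The goat ate a sock and then", return_tensors="pt")
out = model.generate(**ids, max_new_tokens=20, do_sample=False)
h.remove()
print(tok.decode(out[0]))

Mean-pooled hidden states are a blunt instrument; published interventions of this kind locate concept directions more carefully (e.g. with contrastive prompt pairs or learned probes), but the mechanism is the same: edit the internal vectors and the output follows.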
