Joseph Szymborski :qcca:

@evan @Gargron Ya, I think that's the heart of the question :)

What I'm trying to communicate is that when I ask an LLM "what is on the inside of an orange", the programme isn't consulting some representation of the concept of "orange (fruit)". Rather, it's looking at all the likely words that would follow your prompt.

If you get a hallucination from that prompt, we think it made an error, but really the LLM is doing its job: producing plausible words. My personal bar for intelligence is higher than that.
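To make the "likely words that would follow your prompt" point concrete, here is a minimal sketch (assuming the Hugging Face transformers library and GPT-2, chosen purely for illustration) that shows what the model actually produces: a probability distribution over next tokens, not a lookup of some "orange (fruit)" concept.

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "What is on the inside of an orange?"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # Logits over the whole vocabulary at every position.
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# Probability distribution for the very next token after the prompt.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, 5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id):>12}  {prob.item():.3f}")

Generation simply repeats this step, sampling (or taking the most likely) token and appending it, which is why a fluent but false continuation is still the model "doing its job" under this view.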
