Robotistry

@abreaction @failedLyndonLaRouchite @inthehands That is a much harder, more expensive problem than "improvements in AI" - the flaws in AI often boil down to "failure to correctly ground the concept in the real world".

The model hallucinates because, without grounding that understands the concepts of "food" and "color" as subjective experiences, "blue" and "blueberry" are almost the same.

Robots *require* grounding to connect their actions to their task.

DELETED replied to Robotistry

@robotistry @failedLyndonLaRouchite @inthehands

If the robots are doing a specific task, they can be grounded pretty easily with source data.

This is where ML excels: show it something done right 10,000 times and it can figure out roughly what "right" is, at least enough to do it right 99.99% of the time.
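A rough sketch of what that looks like in code: behavioural cloning, i.e. ordinary supervised learning on recorded demonstrations. The toy task, the linear policy, and every name below are invented for illustration; none of it comes from the thread.

# Behavioural cloning in miniature: "ground" a policy in 10,000
# demonstrations of a task done right, then imitate them.
import numpy as np

rng = np.random.default_rng(0)

# 10,000 demonstrations: a 4-dim sensor reading (state) paired with the
# action an expert took there (a fixed rule stands in for the expert).
expert_rule = np.array([1.0, -0.5, 0.25, 2.0])
states = rng.normal(size=(10_000, 4))
expert_actions = (states @ expert_rule > 0).astype(float)

# "Figure out roughly what right is": fit a linear policy to the
# demonstrations by least squares; cloning is just supervised learning.
X = np.hstack([states, np.ones((len(states), 1))])   # add a bias column
weights, *_ = np.linalg.lstsq(X, expert_actions, rcond=None)

def policy(state):
    # Imitate the expert: threshold the learned linear score.
    return float(np.append(state, 1.0) @ weights > 0.5)

# How often does the cloned policy agree with the expert on fresh states?
test_states = rng.normal(size=(1_000, 4))
test_actions = (test_states @ expert_rule > 0).astype(float)
predicted = np.array([policy(s) for s in test_states])
print(f"agreement with expert: {np.mean(predicted == test_actions):.2%}")

The 99.99% figure in the post is about exactly this regime: when the task is narrow and the demonstrations cover it, a simple imitator gets very close to the demonstrated behaviour; the hard part is everything outside that coverage.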
