@marcan Seriously, though. How in the slightest does that differ from what we, humanimals, do?
All we do is also take behaviours and pieces of information (a.k.a. memes) as sensory input, memorize them, train on them, transforming raw input into experience (it's called learning), then combine the inputs, compare them, transform them, recurse on it many, many times, and finally produce some output, which we then call "reasoning". Or "art", if nobody seems to buy into it. Or "culture", as an umbrella term.
Have you seen the "Everything is a remix" series? This is true to an uncomfortable degree for some.
I'm fine with it. Whatever. There's no pot of gold at the end of the rainbow, because a rainbow isn't a bow but actually a circle, and we're just looking at it the wrong way. Everything that exists works somehow. We do too.
@drq Just look at the failure modes to understand how it's different. There's no higher reasoning in current AIs. No common sense, no ability to solve novel problems even when the solution is obvious.
Maybe we just need deeper networks, who knows. But we're definitely not there yet, not anywhere close.