blau

@skobkin @Revertron @ZySoua Do you mean like in Blade Runner? Limited or not? I mean, in addition to a learning ability that can surpass the human brain, also unlimited creativity, differentiation between individuals (personal character), free will, with morals or without them, perhaps at random...

Alexey Skobkin

@blau @Revertron @ZySoua
I mean AGI (strong AI).
The same cognitive functions as a human has, or better.

blau

@skobkin @Revertron @ZySoua That is, cognitive empathy yes (theory of mind), but what about emotional empathy (emotions, feelings, sympathy, aversion...)?

Alexey Skobkin

@blau @Revertron @ZySoua
As I said here:
lor.sh/@skobkin/10944654813527

Consider it to be present. Otherwise the machine *probably* wouldn't be interested in that at all.

blau

@skobkin @Revertron @ZySoua That sounds great to me: a creative and sentient being. But I do worry whether its morality will be its own choice or just random luck in how it is made. I mean, if there are no Asimov's laws, then there is no advantage in choosing an android over a human. It could be equally psychopathic in either case.

Alexey Skobkin

@blau @Revertron @ZySoua
I see.

> if there are no Asimov's laws, then there is no advantage in choosing an android over a human

Asimov's laws were designed to be flawed from the start. So I'm not sure why they're so important.

> It could be equally psychopathic in either case

Yes, like a human. The question is not about preferring an android over a human, but about the equality of these choices.

blau

@skobkin @Revertron @ZySoua A psychopathic android could be far more intelligent than a human psychopath, and far harder to detect. A single specimen could be as destructive as hundreds of humans without a conscience. Asimov's laws do not create defective individuals. A psychopath is highly effective and sees others as defective because of their moral limits, but it is he who is defective: he lacks humanity. It is the dilemma between instrumental reason aimed at particular benefits and reason oriented toward common values.

blau

@skobkin @Revertron @ZySoua Emotional empathy (emotions, feelings) they can't have, I guess, only cognitive empathy?
Edited after translating. I'm sorry, I posted my writing without thinking.

Alexey Skobkin

@blau @Revertron @ZySoua
Suppose that the subject in question has emotional empathy too. Consider that it's not chemical-based but, for example, emulated. It doesn't really matter.

P.S. Please use English if you can or at least properly set the language of your comment for translation to work.
