Alexey Skobkin

@blau @Revertron @ZySoua
I mean AGI (strong AI).
The same cognitive functions a human has, or better.

blau

@skobkin @Revertron @ZySoua That is, cognitive empathy yes (theory of mind), but what about emotional empathy (emotions, feelings, sympathy, aversion...)?

Alexey Skobkin

@blau @Revertron @ZySoua
As I said here:
lor.sh/@skobkin/10944654813527

Consider it to be present. Otherwise the machine *probably* wouldn't be interested in any of that at all.

blau

@skobkin @Revertron @ZySoua That sounds great to me: a creative and sentient being. But I do worry whether its morality will be its own choice or just random luck in how it was made. I mean, if there are no Asimov's laws, then there would be no advantage in choosing an android over a human. It could be equally psychopathic in either case.

Alexey Skobkin

@blau @Revertron @ZySoua
I see.

> if there are no Asimov's laws, then there would be no advantage between choosing an android over a human

Asimov's laws were designed to be flawed from the start, so I'm not sure why they're considered so important.

> It could be equally psychopathic in either case

Yes, just like a human. The question is not about preferring an android over a human, but about the equality of these choices.

blau

@skobkin @Revertron @ZySoua A psychopathic android could be so much more intelligent than a human psychopath, and so much harder to detect. It would be a single specimen as destructive as hundreds of humans without conscience. Asimov's laws do not create defective individuals. A psychopath is highly effective and sees others as defective because of their moral limits, but he is the defective one: he lacks humanity. It is the dilemma between instrumental reason, concerned with particular benefits, and reason concerned with common values.
