@blau @Revertron @ZySoua
As I said here:
https://lor.sh/@skobkin/109446548135278337
Consider it to be present. Otherwise the machine *probably* wouldn't be interested in that at all.
@blau @Revertron @ZySoua
> if there are no Asimov's laws, then there would be no advantage between choosing an android over a human

Asimov's laws were designed to be flawed from the start. So I'm not sure why they're so important.

> It can be equally psychopathic in both cases

Yes, like a human. The question is not about preferring an android over a human, but about the equality of these choices.

@skobkin @Revertron @ZySoua An android psychopath could be far more intelligent than a human one, and far harder to detect. A single specimen could be as destructive as hundreds of humans without a conscience. Asimov's laws do not create defective individuals. A psychopath is highly effective and sees others as defective because of their moral limits, but he is the defective one: he lacks humanity. It is the dilemma between instrumental reason concerned with particular benefits and reason concerned with common values.
@skobkin @Revertron @ZySoua For me, great: a creative and sentient being. But I do worry whether morality will be his own choice, or just random luck in how he is made. I mean, if there are no Asimov's laws, then there would be no advantage in choosing an android over a human. It can be equally psychopathic in both cases.