⚛️Revertron

@ZySoua @skobkin Plus the partner's quality. For example, is it plastic-looking, or indistinguishable from a human?

Alexey Skobkin

@Revertron @ZySoua
Suppose it's of sufficiently high quality: from a distance, its appearance is barely distinguishable from a human's.
But allow that there may be, for example, some legally mandated differences, such as a marking on the body/chassis and, say, the absence of blood and living skin, though not the absence of sensory perception.

[DATA EXPUNGED]
Alexey Skobkin

@Revertron @ZySoua
Haven't watched it.

But if we're referring to pop culture, you could take as references, for example, the androids from these universes:

- Terminator
- Star Trek
- Andromeda
- Detroit: Become Human
- etc

For anime fans, something like:

- Ghost in the Shell
- Plastic Memories
- SAO
- etc

Naturally, with the caveats mentioned elsewhere in the thread: that is, a somewhat more grounded situation, one that is potentially possible in the future.

blau

@skobkin @Revertron @ZySoua Do you mean like in Blade Runner? Limited or not? I mean, in addition to a learning ability that can surpass the human brain, also unlimited creativity, differentiation between individuals (personal character), free will with or without morals, not assigned randomly...

Alexey Skobkin

@blau @Revertron @ZySoua
I mean AGI (strong AI).
The same cognitive functions as a human has, or better.

blau

@skobkin @Revertron @ZySoua That is, cognitive empathy yes (theory of mind), but what about emotional empathy (emotions, feelings, sympathy, aversion...)?

Alexey Skobkin

@blau @Revertron @ZySoua
As I said here:
lor.sh/@skobkin/10944654813527

Consider it to be present. Otherwise the machine *probably* wouldn't be interested in that at all.

blau

@skobkin @Revertron @ZySoua For me that's great: a creative and sentient being. But I do worry whether its morality will be its own choice, or random luck in its making. I mean, if there are no Asimov's laws, then there would be no advantage in choosing an android over a human. It can be equally psychopathic in both cases.

Alexey Skobkin

@blau @Revertron @ZySoua
I see.

> if there are no Asimov's laws, then there would be no advantage between choosing an android over a human

Asimov's laws were designed to be flawed from the start. So I'm not sure why they're so important.

> It can be equally psychopathic in both cases

Yes, just like a human. The question is not about preferring an android over a human, but about the equality of these choices.

blau

@skobkin @Revertron @ZySoua An android psychopath could be so much more intelligent than a human psychopath, and so much harder to detect. It would be a single specimen as destructive as hundreds of humans without conscience. Asimov's laws do not create defective individuals. A psychopath is highly effective and sees others as defective because of their moral limits, but he is the defective one: he lacks humanity. It is the dilemma between instrumental reason about particular benefits and reason concerning common values.

blau

@skobkin @Revertron @ZySoua Emotional empathy (emotions, feelings) they can't have, I guess; only cognitive empathy?
Edited after translating. I'm sorry, I posted my writing without thinking.

Alexey Skobkin

@blau @Revertron @ZySoua
Suppose that the subject in question has emotional empathy too. Consider it not chemical-based but emulated, for example. It doesn't really matter.

P.S. Please use English if you can or at least properly set the language of your comment for translation to work.
