This part about relationships is what is actually going on. If you look into the history of psychology, this exact sort of "rationality", the idea that people were rational, machine-like, was what dominated the field before the Holocaust.
It's like watching history repeat itself. The reason the rich need to blur the line between human and machine is that it erodes a frontier where humans can stand in solidarity with each other. If you are but a complex machine, no better than ChatGPT, then into the gas chamber you go!
Bullshitters will say "the machines have feelings, they're as meaningful as you or me". The real impact of this is a shift in perception toward the idea that humans do not have (real) feelings either.
From the article:
> They sat at a small table covered with a black cloth, Bender in a purple sweater, Manning in a salmon button-down shirt, passing a microphone back and forth, taking turns responding to questions and to each other by saying “I like going first!” and “I’m going to disagree with that!” On and on they went, feuding. First, over how kids learn language. Bender argued that they learn in relationship with caregivers; Manning said learning is “self-supervised” like an LLM. Next, they fought about what’s important in communication itself. Here, Bender started by invoking Wittgenstein and defining language as inherently relational: “a pair of interlocutors at least who were working together with joint attention to come to some agreement or near agreement on what was communicated.” Manning did not entirely buy it. Yes, he allowed, humans do express emotions with their faces and communicate through things like head tilts, but the added information is “marginal.”
https://nymag.com/intelligencer/article/ai-artificial-intelligence-chatbots-emily-m-bender.html