@david_chisnall I think the unfortunate challenge is that some people do not want tools; they want servants or slaves. LLMs hold that prospect for them.
What happens when a lowly servant misinterprets the intent of a command? They are blamed for it, regardless of any ambiguity in the command itself.
@tim @david_chisnall
“some people do not want tools; they want servants or slaves. LLMs hold that prospect for them.” - YEP.
… and that’s why businesses are “ecstatically (re)placing” Human Intelligence (HI, with its adaptability) with unethically sourced and environment-destroying Artificial Intelligence (AI).
Related - https://mastodon.social/@dahukanna/113741679088044261