@SETSystems @JoeUchill I agree. Maybe "funny" is the wrong word.
@SETSystems damn. That abstract is hopeful. Once kids are shown how LLMs are kind of bad with information, they tend to trust them less, quickly.

@lightninhopkins @SETSystems a lie is intentional. It's human. LLMs don't lie. It's just a program. If there is an intentional lie, a person prompted the LLM to do it.

@SETSystems Currently the people creating some of these models are aware that the system has a capacity to "misrepresent itself". All of them know that it does. Maybe not the sales folks. My concerns are less esoteric, more immediate, as LLMs trained on the internet are shoved into Google.

@SETSystems It's kinda funny when you are told to put glue on pizza or cook with gasoline. "Haha, funny LLM". It gets less funny fast when you have a depressed person asking about options. I went off on a tangent there.
@SETSystems Then again, better to try to get people to mistrust it now. I showed it to my kid. Critical thinking.