datarama

@jernej__s @thomasfuchs It is! Ironically, while customer support was the originally envisioned "killer app" for chatbots, LLMs are actually *worse* at it than old-school chatbots were. Old-school chatbots don't hallucinate (and potentially mislead the customer) and they're not vulnerable to prompt-injection trickery (so you can't, e.g., get them to promise to sell you a car for 10 dollars).

datarama

@jernej__s @thomasfuchs ...but you also couldn't get an old-school customer support chatbot to write you an algorithm that implements Floyd-Steinberg dithering in Python, so there's that.
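[For reference, a minimal sketch of the kind of algorithm being alluded to: Floyd-Steinberg error-diffusion dithering for a grayscale image, assuming NumPy. This is an illustrative sketch, not a quote from any chatbot output.]

```python
import numpy as np

def floyd_steinberg_dither(img: np.ndarray) -> np.ndarray:
    """Dither a grayscale image (values 0-255) to pure black and white,
    diffusing each pixel's quantization error to its neighbours."""
    out = img.astype(np.float64).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = 255.0 if old >= 128 else 0.0
            out[y, x] = new
            err = old - new
            # Classic Floyd-Steinberg weights: 7/16, 3/16, 5/16, 1/16.
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x - 1 >= 0:
                    out[y + 1, x - 1] += err * 3 / 16
                out[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1, x + 1] += err * 1 / 16
    return out.astype(np.uint8)
```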

Dragon-sided D

@datarama @jernej__s @thomasfuchs Most corporates that offer AI support bots are deploying RAG (retrieval-augmented generation) capabilities.

That basically solves the hallucination issue.
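[As a rough illustration of the RAG pattern being referenced, not anything from the thread: the bot first retrieves relevant help-center passages and then asks the model to answer only from them, rather than from its parametric memory. A minimal sketch, where `search_kb` and `llm_complete` are hypothetical stand-ins for a company's retrieval index and LLM client.]

```python
def answer_support_question(question: str, search_kb, llm_complete) -> str:
    """Retrieval-augmented generation sketch: ground the answer in
    retrieved passages to reduce (not eliminate) hallucination.
    `search_kb` and `llm_complete` are hypothetical callables."""
    passages = search_kb(question, top_k=3)  # e.g. vector or keyword search
    context = "\n\n".join(passages)
    prompt = (
        "Answer the customer's question using only the context below. "
        "If the context does not contain the answer, say you don't know.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return llm_complete(prompt)
```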
