@bontchev
This demonstrates the problem with all this "AI" stuff really well: it does things no one ever put into it. Most of the time people like that, because it lets chatbots do things nobody deliberately taught them, but it's not rare for it to produce unexpected behaviour like this, and what the "developers" end up doing is endlessly working around these corner cases.