@zandbelt @rodhilton ... I like perplexity.ai in addition to Google. Yes, it's an LLM and can't be trusted, you have to check results, but it provides sources to make that easier.
@adaddinsane @zandbelt @rodhilton ...the assumption, I think (which might be correct), is that these tools will get better. Of course mistakes will remain, but that's also possible with traditional search.

@whole1day @zandbelt @rodhilton well, Perplexity annotates results with footnotes, which are very often scientific papers.
@ErikJonker @zandbelt @rodhilton
If you have enough knowledge to check the result, fine.
But (a) people are lazy (there is already proof of people who knew better but didn't check); and (b) what about the people who don't know any better?
And when these systems start eating their own garbage, it will just escalate into utter insanity.