@andrewgretton If you consider how knowledge and truth are actually arrived at, it's quite clear that LLMs are fundamentally incapable of achieving that kind of intelligence. No matter how advanced they get.
The best LLMs are simply better at defuzzing the source materials they've fuzzed. Which leaves the obvious observation that they're just serving up, in a roundabout way, the content they've stolen. Content we could read directly...
@donnodubus for sure, and the reference source materials are more authoritative. However, consider the utility of an LLM that's "right" most of the time. If it can tell me - more quickly and mostly accurately - how to roast hazelnuts, the average internet user will prefer that experience to Googling/DDGing/etc., which today is a toxic experience thanks to webspam and the like.
It's utility over ethics; maybe Napster all over again? And we know what "won" there for years, until iTunes and Spotify won. For a while.