@ZachWeinersmith A lot of good answers already in the comments here.
Re: "there's objective data," LLMs don't know anything that isn't ingested into the training corpus, and have no sense of objectivity. All the problems with search exist with LLMs, except you have the additional problems of poisoned data that would be easy to spot as a human, "hallucination" (aka low confidence results being presented as high confidence) and data set biases that don't match the customer's expectations.
@ZachWeinersmith Websites lie about price all the time, but swap price for something harder to verify: say you ask it to find you the spices with the lowest cadmium levels. _Everyone_ is going to lie about this!