cd ~

@adaddinsane @paninid Fact and fiction used to be very different categories. The lines are getting blurry these days, but not in this case.
The search result itself is fine and exactly what you would expect. I can determine the context by looking at the source. If I can't: My problem.
Condensing this fictional concept via AI, without context, so that it sounds factual is a clear error.

4 comments
adaddinsane (Steve Turnbull)

@cd_home @paninid

If it did that, yes. But it doesn't.

This is a standard Google result showing a matching source for the search terms supplied.

It doesn't "imply" anything, it's just the best matching response because the search terms are totally out-there.

I *hate* AI with a vengeance, but claiming a standard result is "AI" is not helpful to anyone.

cd ~

@adaddinsane @paninid OK, I see. I rarely use Google, so I missed the nuance here.
Don't you think the issue remains? What if I searched for election fraud? What kind of articles would it condense into something that looks like Google's answer to the question?

adaddinsane (Steve Turnbull)

@cd_home @paninid

Completely different issue, we always have to check our own sources using a search as a starting point.

In my opinion, this really does require a background understanding of the issue in the first place.

The usual way people think is that "everything on the internet is a lie unless it agrees with my preconceived ideas."

And there's not a lot we can do about that - except encourage better critical thinking.

cd ~

@adaddinsane @paninid Sure, it's a different issue, and I agree with all your remarks. I still think that pulling that content out of the page and putting it directly on top makes it look like "Google's answer" and will keep some people from searching further.
And then we've gone full circle, because the willingness to do so probably increases with the AI-induced notion that our results now give us "answers" rather than food for thought.
