This is a real result in Google.
The term #googling is going to take on new connotations, and they don’t care.
@migmit @IntentionallyBLANK @paninid @abrokenjester @jbiserkov Mostly, it's that I just don't work on that side. I'm on the grown-up side. Other people have seen and interacted with the kid, so I'm reasonably certain he's neither a bot nor three monkeys in a trenchcoat. Not that I have anything against monkeys or trenchcoats.
@IntentionallyBLANK Woah, nice! Care to share more details about it? What are the dynamics, sources, prompts, etc.?
@tomjennings @paninid If you search for "austria hungary space", a result showing a fictional description of Austria-Hungary in space is arguably just as accurate as independent factual results about Austria in space and Hungary in space. YMMV.
@abrokenjester @tomjennings @paninid The problem is the AI summary on top. The results below make it more or less clear that they are about a fictional game. The AI summary is misleading by leaving out any hint that it is summarizing fiction. And that is basically the issue with "AI" summaries: they don't get the context. They are convincing-sounding text manipulation without understanding. (Irrelevant fun fact: we did have a "real" colony in Asia for less than a decade.)
@yacc143 @abrokenjester @tomjennings @paninid It's a "featured snippet", an excerpt from a relevant page. Google has had that for years; it's not the same as the new "AI Overview" bullshit.
Please note that humans often get these clues from subtle things, like which website we find a certain text on, potentially even the URL, or what other links are offered on the page. So in a way the summary might be a perfect example of LLM AI at its best being used totally wrong. It's a bit like my personal position on computer translations: they are generally not good enough to translate something and hand it out to unsuspecting victims.
@abrokenjester @paninid The problem is that Google AI has repeatedly shown an inability to NOT post nonsense (or, in this case, game-wiki posts) wholesale as search results. Yes, the result on top is from a steampunk wiki. The problem is that it is on top, and from a steampunk wiki.
@WhyNotZoidberg @abrokenjester @paninid If you search for a fictional concept, why wouldn't said fictional concept be the top result? Showing anything about modern Austria OR Hungary in space would be less accurate.
@jonoleth @abrokenjester @paninid Because: 1. Not clearly marking it ON TOP as fiction automatically spreads disinformation and anti-facts. 2. I disagree; even if the top post were just a one-sentence "The Empire of Austria-Hungary ceased to exist before space travel was made possible." 3. The search is without the "-" and without the word "empire", meaning it might as well be "Austria, Hungary in space". In short, facts should always be on top, ads should be obliterated, and AI sucks.
@WhyNotZoidberg @jonoleth @abrokenjester @paninid How do you wish, without AI, that Google classifies the internet into fiction and non-fiction? The screenshot isn't even of AI; it's of the knowledge graph. Notice it doesn't say AI anywhere on it? Google has done this for years.
@jenzi @jonoleth @abrokenjester @paninid The way desktop Google does it is better, at least: it shows the source on top, which is the most important thing. As for classifying... seeing the track record of public, not exceptionally specialized AI (like in searching for cancer cells), I wouldn't trust an AI to classify anything, to be honest. It would be worse than leaving it unclassified.
@WhyNotZoidberg @jonoleth @abrokenjester @paninid Sorry, you wanted the world's information to be classified into what's true and what's not true. It seemed simple when you suggested it.
Disinformation is the biggest threat to humanity as we speak. Even if this particular search result wasn't a result of Google's abysmal AI, Google has admitted publicly that there is no way of making it stop making things up and presenting them as fact... and they are FINE with that.
@WhyNotZoidberg Thanks for repeating that to me instead of backing your original position that Google should be the arbiter of truth.
@jenzi @WhyNotZoidberg No one said that in this thread except you. What everyone here claims is that without a clear source, there is no way a user can distinguish between truth and fiction. Google now presents results as paragraphs that look like valid answers, but there's no way of knowing if they are true or not. It is not Google who has to be the arbiter of truth; it is the user. Google is removing that possibility.
@jenzi @WhyNotZoidberg @jonoleth @abrokenjester @paninid And no, Google hasn't done this for years. The fact that you are confusing the summaries some search engines build from reliable sources like Wikipedia with this kind of output only further proves the point that this kind of AI is not suitable or well trained for this use case.
@WhyNotZoidberg @abrokenjester @paninid This is less "Google's new AI initiatives suck" and more "Google should do way more than they ever have or even could do reliably".
@jonoleth @WhyNotZoidberg @abrokenjester I find it interesting that the original post didn't mention or reference "AI" at all… that was just y'all being irritated by something that was implied 🤷🏻♂️
@paninid @WhyNotZoidberg @abrokenjester I was thinking of this post https://mastodon.social/@WhyNotZoidberg@topspicy.social/113062021359578409 but that's a fair point. Guess AI is just hot on everyone's mind when it comes to Google's many screwups right now.
@jonoleth @paninid @abrokenjester Heh. Definitely. AI is NFTs on steroids, but it infects everything, not just techbro wallets.
@paninid I had to try this myself, and indeed! Who needs political serial liars if you have Google?!?
@paninid Wait, they are presenting the world of the Space: 1889 tabletop RPG as if it were real? LLMs have no concept of reality, only of text, so they can't really distinguish between reality, satire, and fiction.
@paninid This is a great find. I get a better result from Perplexity AI - better in the sense that it does not mention the steampunk fiction - but it also mixes things in a bit of a weird way. I suppose the problem is that neither 'engine' has the ability to grasp the human context (theory of mind?). They are mostly dumb search-result combinators.
@paninid The Austrians will certainly not dispute this story but point to it whenever anybody asks.
@paninid I think the results are accurate, since Austria-Hungary existed from 1867 to 1918.
@Gilgamesch @paninid So the user has to adapt to the technology in order for it not to lie? Sounds like something that is pretty useless.
@toriver @paninid But the technology doesn't lie in this particular case. Austria-Hungary obviously never went to space, so it gives you a search result from a very popular, thus very "SEO-friendly", steampunk fantasy site.
@toriver @paninid For example, this is the answer Google gives me, but I'm not logged into my Google account. Plus I'm using privacy tools like uBlock, cookie destroyers, and NoScript. A search result can be influenced by a lot of things, but these are the parameters one has to know when using a technology.
@alexshendi @toriver @paninid Well, maybe, but in this case no. Look what I get when I just use a conjunction.
@btuftin @toriver Of course they aren't. That's why you have to educate people. And I'm not saying that it's not getting worse with "AI". It is, because you don't have access to the source of the "knowledge".
If you were a member of that steampunk fandom you'd be super happy with that search result. 😄 Seriously though, that's just a page snippet from the highest-ranking match for that query. It's not from their AI engine.
@paninid IMHO Google should be used as a keyword search engine, not a fact checker. If fantasy is written, fantasy is what will be found.
@paninid Search for "2001 in space" and you mostly get stuff about some old movie... regardless of the search engine.
@patterfloof @paninid But Google is not inventing anything here? It's giving results from a fictional work that features Austria-Hungary in space, which seems to me the most logical kind of result for that query.
@Ash_Crow @patterfloof @paninid Indeed. There are many things to criticize about Google, but here there is nothing wrong.
@paninid Looks to me like a totally normal result for a query about anything fictional. It's not trying to pass it off as reality, as the nature of the quoted sources makes quite clear. A problematic result would be if their 'AI' results tried to list this as the first mission into space without context.
This is not an AI response; this is a match to your query. Austria-Hungary never had an actual space program, but there is a steampunk fiction match. It gave you exactly the right response - you can literally see the source. It's right there. Yes, Google AI Overviews are generally utter shit; they do not supply the source. And this is not that. *You* need to learn the difference.
@adaddinsane @paninid Fact and fiction used to be very different categories. The lines are getting blurry these days, but not in this case.
@paninid I used to work on this search feature at Google. I don't work at Google anymore. There is a line between "information retrieval machine" and "wish fulfillment machine" that Google Search crossed some time ago. There were too many incentives (growth, more user eyeballs if you tell people what they want to hear or entertain them) for it not to.
@paninid For the many, many people responding that this is a good top match for the query, try entering it in DuckDuckGo and see what happens. The terms are broken out, and you get a string of matches for recent Austrian space initiatives and Hungarian astronauts. That's what I would expect from a search engine. If I want to peg it to the 19th century or the Austro-Hungarian Empire, there are syntax tweaks I can use to do that, but that should be on me.
@paninid@mastodon.world That, or someone is going to retroactively try to claim territory on Mars.
@paninid@mastodon.world This is really just a bad query. It returns a fitting result, though not the one you want.
@lauren (who has deep connections inside Google and has worked with them) has been lambasting these "AI" summaries in search results since Google first added them. I'm with him; they are utterly useless, because even if they were correct, you can't just trust them to be so - you need to go back to the original sources and validate them yourself. In which case, why bother with the summaries in the first place? Just show the original sources as the search results.
I'm shocked that my Austrian school education kept that secret!! Especially since Austria-Hungary was portrayed quite positively. The vibe was "Due to evil nationalism, those other countries no longer wanted to be governed by us and destroyed our great Danube Monarchy diversity project :-(" I didn't learn anything about the horrible poverty in Galicia, for instance...
@paninid @kimlockhartga Absolutely love the "you're using it wrong" responses. You could certainly say Google is using LLMs wrong by applying them to the entire internet and returning responses without context.
@paninid Reminds me of the time GPT gave the population of Mars based on one of the Expanse books. This is probably going to really wind up some online conspiracy theorists. Some of that stuff ends up being mostly benign, or at least kind of contained, but there's no real predicting it, so it would definitely be better if Google wasn't throwing this fuel on the fire.
@paninid Google = bullshitting some ridiculous explanation about any subject, leaving next to no doubt about its inexactitude.
It's actually pretty crazy that the USA only made it to the moon 50 years later and then made such a big deal about it. 😂
@paninid @jwildeboer Things like this and others have made me want to get rid of Google and Android as soon as possible.
@maestrapaladin Sorry to be That Person® here, but I'm pretty sure the Galicia here refers to the Polish-Ukrainian one, which actually was part of Austria-Hungary.
@paninid@mastodon.world Oh, it took me a minute to catch what was wrong with the result.
@paninid Well... yes, it's a summary from the Steampunk Space Wiki... a *fictional* thing. Do I think it's a bit confusing if you don't know what you're looking at? Yes.
Yeah, #Google isn't as good as it used to be. My first choice for #WebSearch is #DuckDuckGo.
@paninid Google AI told me that Google AI will solve this problem before the Hitler clones in Antarctica are released.
This is not AI; it is a correct response to a ludicrous query. Strangely enough, the Austro-Hungarian Empire, roughly 1867-1918, like the *rest of the f-ing world*, did not have a space program. But clearly there is a steampunk story that matches the search criteria, and that's what you got. I'm all for bashing LLMs, but *you (a human?)* need to be able to distinguish an AI response from a proper query match. Hint: LLMs don't cite sources; this does.
@paninid I'm all for bashing Google, but in this case I'm not sure what you're complaining about, to be honest. The result seems spot on given what you asked for.