It's because it's grounded in search. OP doesn't show where the link goes; it points to the Oreo Twitter account, which says Oreo spelled backwards is Oreo. Regular search results do this too. The problem is that the internet isn't truthful, and AI Overview grounds its answers in the internet.
Yup, AI Overview just attempts to give an overview of the top search results related to the user's query. The search results for OP's query are just a bunch of webpages saying "Oreo spelled backwards is Oreo." A toy sketch of why that happens is below.
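Here's a toy sketch of why a "grounded" overview just echoes its sources; the `fake_search` and `fake_summarize` functions are made-up stand-ins, not Google's actual pipeline:

```python
# Toy sketch: a grounded summary can only restate what its sources say.
# fake_search and fake_summarize are hypothetical stand-ins, not Google's pipeline.
from collections import Counter

def fake_search(query: str) -> list[str]:
    # Pretend every top result repeats the same joke tweet.
    return ["Oreo spelled backwards is Oreo."] * 10

def fake_summarize(snippets: list[str]) -> str:
    # Return the most common claim among the retrieved snippets.
    return Counter(snippets).most_common(1)[0][0]

print(fake_summarize(fake_search("oreo spelled backwards")))
# -> Oreo spelled backwards is Oreo.
```

Garbage in, garbage out: if every retrieved page repeats the joke, the summary repeats it too.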
Seems like people don't understand what the word "overview" means.
Overview: a general review or summary
Overview + Google search results = a general review or summary of the search results. 😂
To be fair, these kinds of questions (reverse a word, count the letters in a word, tell me which number is bigger) are just very specific failure modes of LLMs due to the current tokenization scheme. It has nothing to do with accuracy on things like information retrieval or more important use cases. The labs haven't addressed this problem yet because they think people are smart enough to know this and not ask the models these questions, but obviously they aren't.
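If you want to see what the model actually "sees", run the word through a tokenizer. This uses the tiktoken library as just one example tokenizer (Gemini's own tokenizer will split words differently):

```python
# Why character-level tasks are awkward for LLMs:
# the model operates on integer token IDs, not individual letters.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
token_ids = enc.encode("Oreo")
print(token_ids)                              # integer token IDs
print([enc.decode([t]) for t in token_ids])   # sub-word chunks, not letters
# Reversing "Oreo" letter by letter isn't a native operation on these chunks.
```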
People get shit wrong all the time and the internet is filled with incorrect data. I would say an AI based on that is probably going to make mistakes until it gets much smarter. That doesn't mean it isn't incredibly useful.
Companies release software products with bugs all the time. Heck, nowadays it's practically the standard for games to come out with lots of bugs. Even cars have a ton more issues in their first generations.
Haha, the Google Search AI Overview should not be pulling from Facebook or Threads unless the question is about the platform or its content. I also imagine the AI Overview model is significantly weaker than Gemini Pro. AI is expensive, and these overviews run for free and automatically on every Google search they can apply to. This must be a crazy expensive feature for Google either way.
I'd guess Google is using an even smaller/cheaper model than Gemini Flash.
Even though Gemini Flash is relatively cheap and fast, think about how many Google searches people run each day. Adding Gemini 2.5 inference to every search would likely increase Google's cost per search by like... 300% or something.
Because this sub has devolved into nothing but people sitting around asking hundreds of ridiculous questions until they finally get a wrong answer so they can post here.
I wish there was a way to turn this "Feature" off. It really shouldn't be forced on users, especially since it's frequently wrong. Also, Google is supposed to be a search engine, not an answer generator.
Google's AI Overview is wrong so often, yet I see a ton of people posting screenshots of it and trying to use it as a source for their arguments. It's sad how many people blindly believe it just because it comes from Google. I've had to argue with people about how it's not a reputable source.
The LLM used in Google's search engine is EXTREMELY lightweight, and hallucinations like this are going to be hard to eliminate unless Google wants to pay a fortune per search query.
The other day, I searched for "Michael Ian Black Superman" and the AI overview said that "Michael B. Jordan will be playing the next Superman." Jesus H. Christ.....