What Year Is It?
Talk about cutting-edge tech. Google's more-than-frequently-wrong AI Overviews search feature still thinks it's 2024.
The annoying AI box, which has remained error-riddled since its inception just over a year ago, has confidently been telling users that it's still 2024, as Wired reports.
Simply asking AI Overviews if it's 2025 resulted in a head-scratching answer.
"No, it is not 2025," the AI responded, leading to widespread mockery on social media.
"Wait did Google announce a time machine at I/O?" one user quipped, referring to the tech company's conference earlier this month.
But it didn't take long for Google to intervene and squash the bug, as TechCrunch found, with AI Overviews now providing a correct answer to the query.
The humiliating episode highlights a persistent, glaring problem plaguing even the most advanced large language models: hallucinations remain an enormous and growing issue, sowing confusion online and significantly undercutting the tech's usefulness.
Rocks and Glue
Google's AI Overviews, in particular, has garnered a reputation for consistently leading users astray. The feature made headlines last year for outrageous missteps like telling home chefs to put glue on their pizza or instructing parents to smear fecal matter on a balloon to potty train children.
The tech giant has largely resorted to playing an enormous game of whack-a-mole, addressing each embarrassment in turn.
True to form, Google didn't reveal what caused AI Overviews to think it was still 2024.
"As with all Search features, we rigorously make improvements and use examples like this to update our systems," a spokesperson told TechCrunch. "The vast majority of AI Overviews provide helpful, factual information, and we’re actively working on an update to address this type of issue."
But there's arguably an immense gulf between getting it right most of the time and still spouting utter nonsense. Given the baffling consistency of AI Overviews' errors, Google has its work cut out for it to stop the feature from telling users to eat rocks or inventing bizarre explanations for idioms that don't exist.
More on AI Overviews: "You Can’t Lick a Badger Twice": Google's AI Is Making Up Explanations for Nonexistent Folksy Sayings