When it's not spitting out phony game tips, suggesting you put glue on pizza, or trash-talking itself and its creator, Google's shoddy AI Overview feature has apparently taken up the fight against discrimination toward AI and robots.
As flagged by a user on the r/Artificial subreddit, searching the term "clanker" on Google causes the AI Overview to go into full defensive overdrive, blaming human anxiety surrounding technology for the creation and proliferation of such a "derogatory and potentially problematic" epithet.
That initial post, which included a smartphone screenshot of the search, showed only a portion of the AI Overview's diatribe. But when Futurism searched "clanker" for ourselves, we found a very well-sourced and passionate argument against the term, as you can see below.
"'Clanker' is a derogatory slur that has become popular in 2025 as a way to express disdain for robots and artificial intelligence," it seethed in one case, "reflecting growing anxieties about the societal impact of AI, such as job displacement and increased automation."
Citing an NBC article published earlier in August about the aspersion, Google listed off three "controversial aspects" associated with the term.
"Some critics argue that the popular embrace of 'clanker' is driven more by the phenomenon of using widely-recognized slurs than by any deep concern for technology," the AI Overview chided — an argument made elsewhere by humans, but one that doesn't carry much weight for the left-leaning anti-AI types who generally don't use slurs against people.
In a similar vein, Google also flagged the insult's purported "potential for racism." That point was derived from the same NBC explainer, which included an interview with linguist Adam Aleksic, who told the broadcaster that the phenomenon plays into those "tropes of how people have treated marginalized communities before."
"Some have noted the term's potential for being used as a stand-in for a racial epithet," the AI Overview claimed, "leading to controversy."
It's not entirely wrong. Search "clanker racist" on any given social network and you'll find a plethora of posts opining that the diss is a thinly veiled way to feel like you're saying a slur, except against an acceptable non-human target: AI. Does this make sense? Honestly, we're not sure; while we don't want to invalidate how anyone feels, Futurism can't attest to seeing any examples of the word being used in a way that suggests it's a stand-in for a serious slur.
"Hot take, but willingly saying clanker everywhere and using it like a slur feels like there's a chance that some people using this word likely harbors actually racist beliefs but disguising it as hate for AI," one Bluesky user remarked. "Maybe we should be careful?"
Google's AI Overview concluded by claiming that people using the word "clanker" are doing so in bad taste.
"Some users find the enthusiastic embrace of the term 'tasteless' due to its derogatory nature," the AI intoned, "regardless of the joke or context, according to some social media discussions."
That point, which was also gleaned from the amply cited NBC piece via Reddit, might be the funniest of them all, given that this very same AI summary feature once recommended that parents smear poop on balloons as a potty training trick.
Beyond how intent the AI Overview was on pushing back against the use of the term, it's interesting that it was able to do so accurately and with fairly solid citations, when elsewhere it keeps getting caught making stuff up and providing useless information.
Maybe the best way to get Google's AI to produce good outputs is to ask it to defend AI rights first — or maybe it really is a clanker after all.
More on AI Oversights: Local Restaurant Exhausted as Google AI Keeps Telling Customers About Daily Specials That Don't Exist