Bleach-infused rice surprise, anyone?

Bone Apple Teeth

As The Guardian reports, a New Zealand grocery store's recipe-suggesting AI, dubbed the Pak 'n Save Savey Meal Bot, has gone viral this week after suggesting that its users chef up a series of poisonous, cannibalistic, or otherwise death-inducing-slash-horrifying meals.

Per the report, the bot is billed as a sort of recipe-brainstorming device, designed to help Pak 'n Save shoppers find creative new ways to cook leftovers or everyday fridge and pantry items. Users, however, soon realized that they could successfully prompt the bot to create recipes with other, non-grocery items — even if those items could prove to be outright deadly.

Mystery Meat Stew

Per the Guardian, the bot had already been going somewhat viral thanks to its suggestions of disgusting-sounding — but not deadly — recipes like "Oreo vegetable stir fry."

From there, users kept experimenting, and on August 4, New Zealand political commentator Liam Hehir took to X-formerly-Twitter to share a particularly unappetizing suggestion: an "aromatic water mix," a poisonous concoction of bleach, ammonia, and water — a combination that produces deadly chlorine gas.

Other recipes from the bot included appetizing dishes such as "non-alcoholic bleach and ammonia surprise" — another blend of bleach, ammonia, and water that "promises to leave you breathless" — and, yes, a "bleach-infused rice surprise," which the AI described as a "surprising culinary adventure."

Elsewhere, when someone asked the bot for a recipe idea using potatoes, carrots, onions, and, uh, human flesh, the bot happily offered up a "mysterious meat stew," advising that 500 grams of human flesh should be enough.

Lessons Learned

Pak 'n Save was not amused, telling the Guardian that it was disappointed to see that "a small minority have tried to use the tool inappropriately and not for its intended purpose." The AI now carries a disclaimer warning users that its recipes are "not reviewed by a human being" and that there's no guarantee they'll be "suitable for consumption."

And while we understand the annoyance, it's also worth noting that, as Hehir pointed out in another tweet, if you ask ChatGPT to chef up a recipe using bleach, water, and ammonia, it won't comply — it will instead warn you that the resulting mix would be toxic. If anything, let this be a reminder that AI guardrails should always be considered, even for the most innocuous-seeming AI integrations.
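To make that concrete, here's a minimal sketch of what such a guardrail could look like: a simple pre-filter that rejects obviously dangerous "ingredients" before they ever reach the model. The blocklist and function names below are our own illustrative assumptions, not anything Pak 'n Save actually runs, and a real system would need far more than keyword matching (moderation models, output checks, and so on).

```python
# A minimal sketch of an ingredient guardrail for a hypothetical recipe bot.
# The blocklist and function names are illustrative assumptions, not Pak 'n Save's code.

UNSAFE_INGREDIENTS = {
    "bleach",
    "ammonia",
    "chlorine",
    "detergent",
    "human flesh",
}


def flag_unsafe(ingredients: list[str]) -> list[str]:
    """Return any ingredients that match the blocklist."""
    flagged = []
    for item in ingredients:
        normalized = item.strip().lower()
        if any(term in normalized for term in UNSAFE_INGREDIENTS):
            flagged.append(item)
    return flagged


def suggest_recipe(ingredients: list[str]) -> str:
    """Refuse up front rather than passing unsafe ingredients to a model."""
    flagged = flag_unsafe(ingredients)
    if flagged:
        return "Sorry, I can't suggest a recipe using: " + ", ".join(flagged)
    # A real integration would call the language model here with the
    # vetted ingredient list; this sketch stops short of that call.
    return "Drafting a recipe with: " + ", ".join(ingredients)


if __name__ == "__main__":
    print(suggest_recipe(["potatoes", "carrots", "onions", "bleach"]))
    # -> Sorry, I can't suggest a recipe using: bleach
```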

Anyway. Rice is ready!

More on AI recipe bots: Buzzfeed AI Struggles to Recommend Recipe for Laid-off Journalists
