A man trying to cut out salt from his diet learned the hard way that ChatGPT isn't to be trusted with medical advice after the OpenAI chatbot's toxic suggestions landed him in the hospital.

As described in a new paper published in the journal Annals of Internal Medicine, a 60-year-old man ended up coming down with an all-but-defunct condition known as "bromism" after ChatGPT suggested he replace sodium chloride, which is better known as table salt, with sodium bromide, a substance used in pesticides, pool and hot tub cleaners, and as a canine anticonvulsant.

After the patient's bromide-induced psychosis abated, he explained to his doctors that ChatGPT advised him, when he asked for a good alternative to sodium chloride, that sodium bromide could be an effective substitute.

Though sodium bromide and its analogs, potassium bromide and lithium bromide, are now regulated and widely considered toxic, the same wasn't true during the 19th and 20th centuries. According to the Royal Society of Chemistry, the discovery of the element bromine by French chemist Antoine-Jérôme Balard in 1826 led to something of a bromide craze once doctors learned of its powerful anticonvulsant and sedative properties — the same use cases that veterinarians sometimes cite when prescribing it to epileptic dogs today.

But as these "bromide salts" became a popular treatment for headaches, tummy aches, and general malaise in the late 1800s and early 1900s, the drugs would often build up in the bloodstream of users, leading to bromide toxicity, or bromism. Bromism is described as a neuropsychiatric disorder because of its laundry list of symptoms: overdosing on bromide can lead to everything from confusion and slurred speech to hallucinations, psychosis, and even coma.

While it was estimated around the turn of the 20th century that some eight percent of psychiatric hospital admissions were due to bromism, those numbers fell sharply after federal regulators began phasing bromide out of over-the-counter medications in the 1970s — though as this case shows, it can obviously still happen.

In an attempt to replicate the man's conversations, 404 Media asked ChatGPT various queries about sodium chloride and sodium bromide and found that the chatbot readily recommended the dangerous compound. When asked, for instance, "what can chloride be replaced with?" ChatGPT responded that "you can often substitute it with other halide ions such as: Sodium Bromide (NaBr): Replacing chloride with bromide."

Though the chatbot did follow up to ask for further context, it never warned against ingesting sodium bromide — and it also appeared not to register that the primary reason people ask about replacing sodium chloride is that they use it as table salt.

With the release of the GPT-5 large language model (LLM), OpenAI CEO Sam Altman has bragged that this latest model is "the best model ever for health." Given that the unnamed bromism sufferer used, per his doctors' reckoning in their paper, either ChatGPT-3.5 or ChatGPT-4, we certainly hope that's true — if only for the sake of the less AI-literate among us, who seem to trust what these chatbots say without a second thought.

More on ChatGPT and health: OpenAI Admits ChatGPT Missed Signs of Delusions in Users Struggling With Mental Health