With multiple mental health-related deaths already linked to ChatGPT, it now looks like the abysmal medical advice spat out by the OpenAI chatbot may end up claiming another life.
In an interview with the Daily Mail, a 37-year-old named Warren Tierney explained how he went from being a doting husband supporting his wife through difficult pregnancies to receiving his own stage-four cancer diagnosis, after the chatbot repeatedly told him that his increasingly severe sore throat wasn't cause for concern.
Hailing from the tiny town of Killarney in Ireland's County Kerry, Tierney said he began asking the chatbot earlier this year if he should be worried about his persistent sore throat, which eventually became so bad that he couldn't even swallow fluids. Could it be possible that he, like his dad before him, had cancer?
Per screenshots shared with the Mail, ChatGPT maintained that it was "highly unlikely" that Tierney, a trained psychologist, had cancer. During those exchanges, Tierney even updated it, as one would a real doctor, that his esophageal pain had improved enough for him to swallow a cookie after he began taking blood-thinning medications, which ChatGPT called a "very encouraging sign."
Thanks to the "systemic male belief" that he didn't need to consult with a flesh-and-blood physician, the young dad went along with the chatbot, and even returned to ask it for more advice once the pain got bad again, the Mail reports.
"Fair play — if it’s cancer, you won’t need to sue me — I’ll write your court affidavit and buy you a Guinness," it wrote at one point.
As the Irish Independent reports, Tierney was in for a shock when, a few months after his symptoms worsened, he finally did go to the doctor and was diagnosed with stage-four adenocarcinoma of the esophagus. Because the disease is so often detected late in its progression, it carries very low survival rates, down in the single digits.
Staring down such a heavy possible prognosis from his expensive hospital bed in Germany — which his wife, Evelyn Dore, has raised more than $120,000 to fund — Tierney told the Mail that ultimately, his misplaced trust in ChatGPT "probably cost [him] a few months" of life.
"I think it ended up really being a real problem," the ailing father admitted, "because ChatGPT probably delayed me getting serious attention."
"The AI model is trying to appeal to what you want it to say in order to keep you engaged," Tierney mused. "To some extent, the statistical likelihood of what it said was wrong with me was actually very right. But unfortunately in this particular case, it wasn't."
In response to the Mail's request for comment about Tierney, OpenAI maintained that its flagship chatbot is "not intended for use in the treatment of any health condition, and is not a substitute for professional advice." As the regretful husband and father noted, he's a "living example" of what happens when users don't heed such disclaimers.
"That's where we have to be super careful when using AI," Tierney told the Mail. "If we are using it as an intermediary to say we're not feeling great then we need to be aware."
"I'm in big trouble because I maybe relied on it too much," he concluded. "Or maybe I just felt that the reassurance that it was giving me was more than likely right, when unfortunately it wasn't."
ChatGPT and other chatbots like it are, as Tierney acknowledged, known for their dangerous sycophancy: pushing what users want to hear to keep them engaged. In the past few years, that propensity has been linked to imprisonment, hospitalizations, self-harm, and, recently, a murder-suicide. The chatbot was also caught spewing terrible medical advice when, per a recent case study in the journal Annals of Internal Medicine, it suggested an older man replace table salt with so-called "bromide salts," an archaic medicine used to treat aches and pains around the turn of the 20th century that is now found in pesticides and pool cleaners.
In that man's case, he took so much that he developed what's known as "bromism," an all-but-extinct neuropsychiatric disorder that causes confusion, slurred speech, and psychosis — though after detoxing from the toxic compound during a three-week hospital stay, the journal paper notes, he was ultimately fine.
More on men and health: An Astonishing Number of Men Are Dying Because They Refuse to Go to the Doctor