An eating disorder helpline that fired its entire human staff and replaced them with a chatbot has already announced that it's bringing its humans back, Vice reports.

And yes, as it turns out, it's because replacing a human-managed crisis helpline with an AI-powered chatbot went extremely, extremely poorly. Who could've guessed?

As NPR first reported last week, the nonprofit National Eating Disorder Association (NEDA) — the largest eating disorder-focused nonprofit in the US, according to Vice's initial report on the debacle — had decided to entirely disband its heavily trafficked crisis helpline in favor of a human-less chatbot called Tessa, just four days after its human workers had unionized. Humans were supposed to stay online to field calls until June 1, when Tessa was scheduled to take over as NEDA's only interactive resource.

But that all changed when Sharon Maxwell, an activist, sounded the alarm that Tessa was offering wildly unhelpful advice — and even suggesting behaviors associated with disordered eating.

"Every single thing Tessa suggested were things that led to the development of my eating disorder," Maxwell wrote in a viral social media thread, posted to Instagram on Monday. "This robot causes harm."

In her harrowing Instagram post, Maxwell recounted that Tessa urged her to lose up to two pounds a week, explaining that the activist should regularly weigh herself, restrict certain foods, and aim to cut her caloric intake by 500-1,000 calories per day. In other words, a chatbot entrusted with giving advice to people with eating disorders ended up promoting disordered eating.

Maxwell's experience doesn't sound like an outlier, either.

"Imagine vulnerable people with eating disorders reaching out to a robot for support because that’s all they have available and receiving responses that further promote the eating disorder," wrote psychologist Alexis Conason in an Instagram post sharing screenshots providing similar advice as Maxwell received.

"To advise somebody who is struggling with an eating disorder to essentially engage in the same eating disorder behaviors, and validating that, 'Yes, it is important that you lose weight' is supporting eating disorders," Conason told The Daily Dot.

In a gross turn, NEDA — which had previously emphasized that because Tessa wasn't ChatGPT, it couldn't "go off the rails" — first went on the defensive, with the nonprofit's communications and marketing VP Sarah Chase commenting "this is a flat out lie" on Maxwell's Instagram carousel. Per The Daily Dot, Chase deleted the comment after Maxwell sent her screenshots.

NEDA has since taken down the bot, writing in its own Instagram post that Tessa will be offline until an investigation is completed.

"It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program," NEDA wrote. "We are investigating this immediately and have taken down that program until further notice for a complete investigation."

NEDA CEO Liz Thompson, meanwhile, assured Vice that "so far, more than 2,500 people have interacted with Tessa and until yesterday, we hadn't seen that kind of commentary or interaction."

"We've taken the program down temporarily," Thompson added, "until we can understand and fix the 'bug' and 'triggers' for that commentary."

And on that note, this alarming incident highlights a growing AI trend: get enough people pushing enough buttons, and the flaws in poorly understood AI systems — whether in their guardrails or underlying technology — start to leak out. In some contexts, like an eating disorder helpline, those failures can have dire consequences.

"If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED," Maxwell wrote on Instagram. "If I had not gotten help, I would not still be alive today."

More on absolutely terrible uses for AI: True Crime Ghouls Are Using AI to Resurrect Murdered Children

