Workers taking crisis hotline calls at the National Eating Disorders Association (NEDA) unionized — and just four days later, according to an NPR report, NEDA told its hotline staff that they would be fired and replaced by a chatbot.

Per NPR, the hotline is hugely active. NEDA is the largest eating disorder-focused nonprofit in the US, and its helpline fielded nearly 70,000 calls last year alone. But for all of that volume, staffing was astonishingly slim, with only six paid staffers and a few supervisors, who "train and oversee up to 200 volunteers at any given time," according to the report.

Unsurprisingly, NEDA experienced high volunteer turnover and burnout; on top of the staffing disparity, answering helpline calls is difficult emotional labor. As a result, the workers opted to organize.

"We asked for adequate staffing and ongoing training... we didn't even ask for more money," Abbie Harper, a former helpline associate and unionizer, wrote in a May 4 blog post. "When NEDA refused [to recognize our union], we filed for an election with the National Labor Relations Board and won on March 17."

But the nonprofit's leadership apparently didn't take well to the union push, announcing on a call just days after the election that it would wind down the crisis hotline entirely. Instead, it would introduce a "wellness chatbot" named Tessa, firing the human call-takers in the process.

"We will, subject to the terms of our legal responsibilities, begin to wind down the helpline as currently operating," NEDA board chair Geoff Craddock told the hotline's former employees in that March call, audio of which NPR obtained. "With a transition to Tessa, the AI-assisted technology, expected around June 1."

According to its website, Tessa, which has technically been in operation since 2022, isn't a crisis bot; in fact, when you log onto the service, that's the first thing it tells you. It's designed instead to deliver something called "Body Positive," described as "an interactive eating disorder prevention program."

"Through Body Positive," reads the site, "chatters learn about contributing factors to negative body image and gain a toolbox of healthy habits and coping strategies for handling negative thoughts."

Tessa's creators have mounted a staunch defense of the automated tool, arguing that because it can handle more volume than NEDA's former fleet of volunteers, it will be more effective.

"The chatbot was created based on decades of research conducted by myself and my colleagues," Ellen Fitzsimmons-Craft, a psychiatrist at Washington University and the leader on the team that built Tessa, told Vice. "I'm not discounting in any way the potential helpfulness to talk to somebody about concerns. It's an entirely different service designed to teach people evidence-based strategies to prevent and provide some early intervention for eating disorder symptoms."

It's certainly a grim turn for labor politics; it's easy to imagine a nightmare world in which employers start dangling automated systems like Tessa over their human employees' heads as leverage.

But Tessa's implementation also raises a whole other set of issues around responsibility. Sure, humans make mistakes, but at least a human can be held accountable. When a machine learning system makes a mistake, who's accountable?

Though NEDA and Tessa's creators promise that the bot isn't ChatGPT and, as a NEDA spokesperson told Vice, can't "go off the rails," any automated system can fail.

"We, Helpline Associates United, are heartbroken to lose our jobs and deeply disappointed that the National Eating Disorders Association (NEDA) has chosen to move forward with shutting down the helpline," Helpline Associates United told Vice in a statement. "We're not quitting. We're not striking. We will continue to show up every day to support our community until June 1st."

"A chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community," they added.

More on chatbots: Widow Says Man Died by Suicide After Talking to AI Chatbot

