
Content warning: This post contains references to suicide. A list of resources is available from the Suicide Prevention Resource Center, which is not the service discussed in this post.

In news that feels truly grim, Politico published an exclusive report this week finding that Crisis Text Line, one of the busiest suicide crisis hotlines, has been sharing data collected from people experiencing trauma and depression with a related for-profit entity, which uses the data to train customer service software.

According to Politico's reporting, the nonprofit has a for-profit spinoff called Loris.ai. Crisis Text Line told Politico that any data it shares with Loris is stripped of details that could trace it back to an individual, rendering it anonymous. The organization also claimed the information is used to improve the customer service AI that Loris.ai sells.

But no matter the intention, sharing such personal data is ethically dubious and potentially damaging.

"These are people at their worst moments," Jennifer King, privacy and data policy fellow at Stanford University, told Politico. "Using that data to help other people is one thing, but commercializing it just seems like a real ethical line for a nonprofit to cross."

In 2019, researchers from the UK published a study in the journal Nature Communications that found it's often surprisingly easy to re-identify people in anonymized datasets.

"As the information piles up, the chances it isn’t you decrease very quickly," Yves-Alexandre de Montjoye, one of the study’s authors, told MIT Technology Review at the time.

In addition, King told Politico that it's hard to say whether users of such a hotline, whom Crisis Text Line presents with a 50-paragraph privacy agreement, can give meaningful consent to those terms given their mental state in that moment.

Crisis Text Line's vice president and general counsel Shawn Rodriguez told Politico that the nonprofit believes everything it's doing is above board.

"We view the relationship with Loris.ai as a valuable way to put more empathy into the world, while rigorously upholding our commitment to protecting the safety and anonymity of our texters," Rodriguez told Politico. He added that "sensitive data from conversations is not commercialized, full stop."

Despite the reassurances, one big question remains: Even if the data did make the software more effective, is selling stuff through customer service bots really so important that we'd greenlight the exploitation of people in a mental health crisis?

More on questionable public safety: DARE, Which Apparently Still Exists, Is Furious at a TV Show for Dumb Reasons

