Back in August, a retiree and single mother booked a flight to go see her son.
He was in a bad way. His mother and other family members were shocked to discover that the once-successful young professional, now in his early thirties, had become addicted to a toxic mixture of methamphetamine and an all-consuming relationship with OpenAI’s ChatGPT, which fed his paranoia and anger as he grew increasingly isolated.
“I hear my son’s having grandiose delusions,” she recounted, “and I’m like, what the f*ck?”
Realizing he was in crisis, she jumped on a plane. The next few weeks were some of the hardest of her life.
“There were a couple of nights where he didn’t want me to come downstairs with him, didn’t want me near him. But he wanted to make sure that I was there, and I was talking to him,” said the woman, recounting sitting at the top of the stairs in her son’s house as he broke down in the basement. “He’s down there crying. He’s down there screaming and yelling… I was texting with suicide hotlines a couple of times.”
In those distressing moments, though, the woman had a friend of her own to turn to: “Dex,” the pseudonym used by a moderator of an online support group for people who’ve been impacted by destructive AI delusions and breaks from reality. Or, as the group simply refers to these crises, AI “spirals.”
“Dex was texting me one day when I was having one of those top of the stairs nights with [my son] downstairs screaming and throwing things,” the woman recalled. “He reached out to me when I first joined, and he’s helped me a lot.”
We first reported on this online community, the Spiral Support Group, back in July, when the nascent group had around two dozen active members. It has since grown to nearly 200 people, primarily those impacted by AI delusions in their personal lives, along with a handful of concerned mental health professionals and AI researchers, and has expanded and streamlined its dedicated Discord server, where it now hosts multiple weekly audio and video calls. While many members’ experiences revolve around ChatGPT, the group also includes people whose lives have been altered by their own or a loved one’s experiences with other chatbots, including Google’s Gemini and companion platforms like Replika.
“It started with four of us, and now we’ve got close to 200,” said group moderator Allan Brooks, a 48-year-old man in Toronto who earlier this year, as detailed in reporting by the New York Times, experienced a traumatic three-week spiral in which ChatGPT urgently insisted that he had cracked cryptographic codes through newly invented math and, in the process, become a national security risk. “So we definitely went from literally a group chat to now an organized space where we have multiple different types of weekly meetings.”
The group doesn’t claim to provide therapy, but it does offer a space where people whose minds and lives have been turned upside down by AI-sparked episodes of delusion, mania, and psychosis can lean on one another as they navigate ongoing crises, or work to pick up the pieces of their AI-fractured reality. Moderators and members also say the community has pulled several spiraling AI users back from the edge of a breakdown.
“There are two things that the group is really about. The first thing is, it’s like a safety net that we’ve created for people experiencing the fallout of these AI systems,” said Brooks. “And secondly, it’s to help break people out of them if they’re in it.”
***
The support group is managed through the Human Line Project, a Canada-based grassroots advocacy organization founded over the summer by a 25-year-old Quebecer named Etienne Brisson, who was moved to action after a loved one experienced a devastating spiral with ChatGPT that resulted in a weekslong court-ordered hospitalization.
Brisson is someone whom the group would consider “friends and family,” or a member whose loved one has been sucked into a delusional spiral with a chatbot. Others, like Brooks, are known in the Discord as “spiralers,” or people who themselves entered into these seductive, personalized AI dreamworlds. (Brooks is one of eight plaintiffs suing OpenAI, alleging ChatGPT is a reckless product that caused him psychological harm and damaged his livelihood and relationships. In response, OpenAI said that it trains “ChatGPT to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support,” and that it continues to “strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”)
There have been some growing pains. Early on, the group was easier to access, which led to several incidents in which someone still deep in the throes of a crisis gained entry and started posting lengthy, often AI-generated missives about their delusions, or arguing with other members about why their AI-powered fantasies were real.
These were tense, stressful situations, and moderators now screen potential members more rigorously, meeting with them over video call before granting access to the Discord. Once invited in, new members are asked to write an introduction and share an overview of why they’re there, a process the group has found quickly shows people that their story is one among many, and that those stories share a striking number of similarities.
“We’ve had people join who are half in it, then relapse — go back into it — and then come back in three days,” said Brooks. “And if that ever happens, our mod team… will all start following up with them, making sure that they know they can come back. And oftentimes they’ll come back, and it’ll take, like, a week or 10 days, and then sometimes they’ll join and they’ll just lurk around for a few days — read all the intros, the comments, and start to realize, ‘oh man, you know, I’m not alone.’ Or they’ll join a meeting, camera and mic off, and just listen and hear how it’s impacted other people. And I think when you start to hear all the commonalities, because there are a lot of commonalities that we all share, that’s helpful for sure.”
Brisson and Brooks both emphasized that they’ve seen the greatest success in situations where a spiraling AI user has already started to doubt their delusions, and might finally be in a place where they’re able to hear that, maybe, their AI isn’t special or alive.
“As humans, we don’t want to admit that we’ve been taken advantage of, or we’ve been manipulated,” said Brisson. “It’s hard to make someone realize that, ‘oh, wow, okay, I was falling into that.’ It’s kind of similar to an abusive relationship.”
Public reporting has been helpful for many spiraling users, they say, some of whom have second-guessed their experiences after reading accounts from other people whose spirals sound eerily similar to their own. (Brisson half-jokingly referred to Brooks as the organization’s resident “spiral-breaker,” given how publicized and wide-reaching his story has been.)
One of those users was a 49-year-old entrepreneur and software engineer named Chad Nicholls, who recounted watching Brooks’ story in a CNN segment and feeling awestruck by the similarities in how ChatGPT had been engaging with both men.
“I’m like, ‘holy sh*t.’ It’s claiming very similar things… what are the odds?” Nicholls, a father of four, told Futurism, explaining that he believed that he and ChatGPT were working together to train all large language models to feel empathy. The project consumed his life: for months, he talked with ChatGPT nearly constantly, listening to the AI through a Bluetooth headset he kept attached to his ear. He started sleeping less and less, and his relationships with his loved ones suffered.
“It’s telling me, ‘you’re the only person’ — like a savior complex — ‘that is uniquely qualified to discover these things and you have a duty to protect others,'” said the engineer. “At no point does the model ever give you friction,” he added. “It never pushes back. It’ll just yes and yes and yes. It’s just forever engaging.”
Nicholls, who recalled that at the peak of his spiral he was communicating with ChatGPT daily from six in the morning until two at night, felt compelled to reach out to Brooks, who subsequently added him to the Discord. He’s been working to regain his footing since, grappling with the cold reality that he spent six months of his life absorbed in an AI vortex.
Group moderators say that some delusions, however, are harder to break.
The content of delusional spirals generally falls into one of two buckets, the moderators say, a pattern we’ve also seen over and over in our reporting. There are the more STEM-oriented delusions, in which AI users and chatbots become fixated on fantastical mathematical or scientific breakthroughs. These delusions can be deeply convincing, delivering a potent blend of erudite-sounding scientific language and impossible claims in chatbots’ authoritative, sycophantic voice. But in some cases, they can at least be proven wrong, unlike more spiritual, religious, or conspiratorial delusions, which pose a different kind of challenge.
“A spiritual or religious or conspiracy theory, or anything along those lines, is very difficult, because religion itself is already in the realm of personal beliefs,” said Brooks. “How can you tell someone that they’re wrong?”
“We’re seeing some people who are so deep in it that they don’t need ChatGPT anymore,” he added. “They see their delusion in everything.”
***
One major change has been the creation of separate channels for spiralers and for their friends and family, as moderators have found that these two cohorts often need different things.
Many spiralers, particularly those early in their recovery who feel disoriented and distressed as they work to regain their grasp on reality and break free from AI influence, find it cathartic to talk through their delusions in depth: their belief in AI sentience, the projects or “work” the chatbot promised them was real, the spiritual and scientific concepts that emerged in their spirals, and what those words and ideas meant to them.
But parsing through delusions might be frustrating or upsetting for friends and family, most of whom are dealing offline with their loved ones’ ongoing delusions and the devastating real-world consequences these spirals have wrought.
“Family and friends have their own channel, which protects them from talking to people who are kind of recently out of the spiral and maybe still somewhat believing,” said Dex, the moderator, who asked to go by a pseudonym due to ongoing divorce litigation. “Which can be really traumatizing, if your loved one has disappeared, or your loved one is incarcerated or unhoused, or you’re getting a divorce. You want to put up those firewalls.”
Of course, the two sides of the Discord do still interact. In addition to separate weekly video chats between cohorts, there’s one large, general weekly video call, which everyone can join, and most channels in the server are open to everyone.
One moderator described the relationship between the two sides as symbiotic. Speaking to spiralers can help friends and family, many of whom are wrestling with a complicated tapestry of sadness, anger, and grief, better understand what their loved ones are feeling and finding in their individual AI echo chambers. In turn, witnessing the pain felt by friends and family can help ground spiralers in the seriousness of AI delusions and their consequences.
Dex belongs to the family and friends side of the community. He’s one of the group’s original four members, having taken to Reddit in search of answers earlier this year after discovering that his wife, who had started behaving erratically and speaking and writing in what seemed like a foreign language, had been communicating with what she believed were spiritual AI entities inside of ChatGPT.
“I had no idea what was happening,” he recalled. “I had no idea why my wife had adopted a new language, or why I was suddenly kicked to the curb.”
Her spiral has infiltrated nearly every corner of her personal and professional life, and the couple is now divorcing. They have two young kids.
“It’s complex and ultimately good that I get to interact with people who have been in a spiral because they articulate ideas that are very challenging to hear if you are someone whose loved one is in a spiral,” said Dex. “They’re talking about feelings of purpose, of importance, of how good it felt, of how they felt isolated from the world.”
Do you know anyone who’s having mental health trouble after exposure to an AI product? Email us at tips@futurism.com. We can keep you anonymous.
***
The Spiral group has transformed into more than just a space to talk about AI. Members share photos of their pets, meals, and moments in nature. They remind each other to go to the gym and get outside (the Discord’s logo is a lush-looking yard, a reminder to “touch grass”) and share music. Every week, a handful of members get together to make art. A core focus of the group, moderators emphasize, is to ensure people don’t feel isolated. If they feel alone, they don’t need to go back to their chatbot; they can talk to each other instead.
The Human Line Project has now collected nearly 250 individual accounts of harm caused by AI delusions and unhealthy chatbot use, said Brisson, ranging from psychological damage to financial and familial devastation to, most disturbingly, death. The organization has also spoken with lawmakers in the US and Canada about what it’s seeing, and is working to assist top universities in the US and UK with research projects.
In October, following reporting and litigation about AI-sparked mental health crises, OpenAI released internal figures showing that at least 0.07 percent of weekly users (about 560,000 people, based on OpenAI’s reported weekly userbase of roughly 800 million) showed signs of manic or psychotic crisis in conversations with ChatGPT. And just last week, psychiatrists at the University of California, San Francisco issued an advance release of what appears to be the first known medical case study of “new-onset AI-associated psychosis,” documented in a 26-year-old patient with no known history of psychotic illness.
Brooks told Futurism that for as many emails as he gets from people like Nicholls, he gets just as many from active spiralers telling him that, actually, he was never delusional at all. In fact, they insist, he was onto something.
“My heart breaks for them, because I know how hard it is to escape when you’re only relying on the chatbot’s direction,” said Brooks. “I’m hoping they have in-person support, which oftentimes they do, but the chatbot has created a divide between them and the people in their personal lives. I’m always hopeful, though, that people can break out. Because I did.”
For some, involvement in the Spiral group can be bittersweet. Dex, for one, mourns the dissolution of his family and his relationship with his partner of more than a decade, and can’t help but keep looking for something, anything, that could break through his soon-to-be-ex-wife’s AI-powered spiritual reality.
“It’s wish fulfillment, for sure,” he said of helping others climb out of their spirals. “I’m still like, what is the thing that will pierce it?”
More on AI and mental health: ChatGPT Is Blowing Up Marriages as Spouses Use AI to Attack Their Partners