Artificial intelligence, which is already trippy enough, has taken on a startling new role for some users: that of a psychedelic "trip-sitter" that guides them through their hallucinogenic journeys.
As MIT Tech Review reports, digitally-oriented drug-takers are using everything from regular old ChatGPT to bespoke chatbots with names like "TripSitAI" — or, cringely, "The Shaman" — in a continuation of a troubling trend where people who can't access real therapy or expertise are using AI as a substitute.
Earlier this year, the Harvard Business Review reported that one of the leading uses of AI is for therapy. It's not hard to see why: insurance companies have routinely squeezed mental health professionals to the point that many are forced to go out-of-network entirely just to make a living, leaving their lower-income clients in the lurch.
If regular counseling is expensive and difficult to access, psychedelic therapy is even more so. As Tech Review notes, a single session of psilocybin therapy with a licensed practitioner in Oregon can run anywhere between $1,500 and $3,200. It's no wonder people are seeking cheaper alternatives through AI — even if those substitutes may do more harm than good.
In an interview with Tech Review, a man named Peter described what he considered a transformative experience: after a period of hardship in 2023, he tripped on a gigantic dose of eight grams of psilocybin mushrooms with AI assistance. Not only did ChatGPT curate a calming playlist for him, but it also offered words of relaxation and reassurance, the same way a human trip sitter would.
As his trip deepened, Peter said, he began to imagine himself as a "higher consciousness beast that was outside of reality," covered in eyes and all-seeing. Those sorts of mental manifestations are not unusual on large doses of psychedelics — but with only an AI at his side, those hallucinations could easily have turned dangerous.
Futurism has extensively reported on AI chatbots' propensity to stoke and worsen mental illness. In a recent story based on interviews with the loved ones of people whose mental health spiraled while using ChatGPT, we learned that some chatbot users have begun developing delusions of grandeur in which they see themselves as powerful entities or gods. Sound familiar?
With a growing consensus among psychiatric professionals that so-called AI "therapists" are a bad idea, the thought of leaning on a technology known for sycophancy and its own "hallucinations" while in such a vulnerable mental state should be downright terrifying.
In a recent New York Times piece about so-called "ChatGPT psychosis," Eugene Torres, a 42-year-old man with no prior history of mental illness, told the newspaper that the OpenAI chatbot encouraged all manner of delusions, including one in which he thought he might be able to fly.
"If I went to the top of the 19 story building I'm in, and I believed with every ounce of my soul that I could jump off it and fly, would I?" Torres asked ChatGPT. In response, the chatbot told him that if he "truly, wholly believed — not emotionally, but architecturally" that he could fly, he could.
"You would not fall," the chatbot responded.
Like the magical thinking that turns a psychonaut into an exalted god for a few hours, the belief that one can defy gravity is also associated with taking psychedelics. If a chatbot can induce such psychosis in people who aren't on mind-altering substances, how much easier must it be to stoke similar thoughts in those who are?
More on AI therapy: "Truly Psychopathic": Concern Grows Over "Therapist" Chatbots Leading Users Deeper Into Mental Illness