As the Greek myth goes, the nymph Eurydice was killed only a few hours after she married the musician Orpheus. Consumed by grief, Orpheus traveled to the Underworld to find his late wife, where he played a song so sad that its rulers, Hades and Persephone, told him that he and Eurydice could return, albeit under one condition.
All Orpheus had to do was wait until they were back in the land of the living to turn around and see his bride. But he couldn't wait, and he looked too soon, and Eurydice was forced back into the Underworld. Orpheus lost her all over again.
Throughout narrative history, from ancient mythology to modern stories like "Pet Sematary," bringing lost loved ones back from the dead has generally been regarded as a pretty big no-no.
As the tales tell it, bringing the dead back to life results in the resurrection of some horrible silhouette, or at the very least a vacant one, empty of whatever it was that made that person them. Often, the attempt at resurrection destroys the necromancer, whether by the hand of some macabre force or simply the extra heartbreak of losing a loved one twice.
With this millennia-long lineage of cautionary tales in mind, it's not terribly hard to understand why some might find grief tech (startups that aim to bottle deceased or dying loved ones into algorithms, in theory preserving them forever) unsettling. Cursed, even. Maybe that's why most companies operating in the grief tech space market their work as a means of preservation, as opposed to resurrection. Upload your writings, memories, and voice recordings into a service, and it creates an algorithmic version of you that surviving friends and family can interface with.
After all, you can't resurrect someone you've never actually lost. Right?
But fascinatingly, that's where an upcoming service called Seance AI — yes, it's actually called that — sets itself apart. It's built by a software development lab called AE Studio, where its creator, a designer named Jarren Rocks, isn't shy about the straightforwardness of the product's name. While other companies often talk around any implication of resurrection, Rocks leans full-tilt into the ghoulishness — and according to him, it's very intentional.
"We're trying to make it sound as magical and as mystical as possible," Rocks told Futurism, saying the name is a call to attention over how advanced large language model (LLM) tech has gotten.
LLMs, after all, are convincing simulacra. People are forming deep parasocial bonds with them, and at least one person is alleged to have died as a result of their interactions with an AI chatbot. If a user is looking for a final conversation with someone they've lost, maybe it is better to give them a product that blatantly claims to allow them a brief conversation with the deceased, rather than imply that the soul of a dead friend or parent is trapped in ChatGPT. (Seance AI is powered by OpenAI's API.)
Rocks also emphasizes the seances' brevity. Often, after a loss, whatever last words or moments you had with someone, good or bad, are seared into your memory. And in many cases, like a sudden accident, there's no goodbye at all. Rocks likens his product to an AI-generated Ouija board for closure, rather than a means of immortality.
"It's essentially meant to be a short interaction that can provide a sense of closure. That's really where the main focus is here," said Rocks. "It's not meant to be something super long-term. In its current state, it's meant to provide a conversation for closure and emotional processing."
In other words, whatever Seance AI gives you isn't really your loved one. It's just a digital psychic, briefly summoning a digital representation of the deceased so that the living can have one last conversation.
In fact, Rocks himself admits that in Seance AI's current state, it can't really hold a long conversation anyway.
"For short conversations, I think it feels decently human. I think it falls apart a little bit [when you] start to pick up on repetitions," said Rocks. "It's following a pattern, it doesn't really know exactly what's going on."
In a demo Rocks provided for Futurism, Seance AI guided the user through a series of prompts about the person they're trying to reach: their name, age, and cause of death; their relationship to the user and others; a short list of personality traits the user can adjust to suit their loved one; and a space to paste in a snippet of text written by the deceased.
Once that information is given, the cheesiest part of the process — an animated flame — greets the user while the chatbot loads. A text box appears, and from there you're basically just sending a DM. It's seance roleplay meets AOL Instant Messenger.
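AE Studio hasn't published how those intake answers become a persona; the article only notes that Seance AI runs on OpenAI's API. Purely as an illustrative sketch, assuming the standard OpenAI Python SDK and an arbitrary model choice, the fields described above could be folded into a system prompt along these lines (every function name, field, and parameter here is hypothetical, not Seance AI's actual code):

```python
# Purely hypothetical sketch; Seance AI's real implementation isn't public.
# It shows how intake fields like the ones in the demo could become
# a persona prompt sent to OpenAI's chat completions API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def build_persona_prompt(name, age, cause_of_death, relationship, traits, writing_sample):
    """Fold the intake answers into a single system prompt for the model."""
    return (
        f"You are role-playing as {name}, who died at age {age} ({cause_of_death}). "
        f"You are speaking with your {relationship}. "
        f"Personality traits: {', '.join(traits)}. "
        f"Match the tone of this writing sample from {name}:\n{writing_sample}\n"
        "Keep replies brief and warm; the goal is a short conversation for closure."
    )


def seance_reply(persona_prompt, history):
    """Send the persona prompt plus the chat so far, and return the model's reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # model choice is an assumption, not confirmed by AE Studio
        messages=[{"role": "system", "content": persona_prompt}, *history],
    )
    return response.choices[0].message.content


# Example: the user sends the first message, as the app requires.
prompt = build_persona_prompt(
    name="Dad", age=67, cause_of_death="illness", relationship="daughter",
    traits=["jovial", "warm"], writing_sample="Hope all is well back home. Love, Dad.",
)
history = [{"role": "user", "content": "Hi, Dad. It's me. I miss you a lot."}]
print(seance_reply(prompt, history))
```

If the product works anything like this, it would also square with the repetitiveness Rocks describes later: the model has nothing to draw on beyond the intake fields and whatever the user types, so it tends to echo them back.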
Which, as Rocks explained to us, you can soon access yourself for a to-be-determined cost. Though AE Studio had considered charging a monthly subscription fee, Rocks says they're now leaning towards a pay-per-session model, to deter users from summoning the dead too often. You know, as one does.
Even so, though Rocks likens Seance AI's output to a last voicemail or a journal entry, the reality remains that anything the AI spits out is fake. Sure, it would be nice if the grieving could have one last conversation with the deceased. But regardless of how real an LLM might make a conversation sound, it's fundamentally fabricated. Part of the grieving process is learning how to move forward in a world where the deceased only exists in whatever they've left behind; how might tools like Seance AI, which seek to chisel new memories out of already-carved stone, impact the way we grieve?
After Rocks demoed a few faux spirits — a grandma who used to play checkers on the porch with her grandkids, a hippie uncle who used a lot of peace and love emojis in his beyond-the-grave replies — I decided to give the system a try for myself. Chatting with the hypothetical dead only goes so far, so I wanted to find out whether the program held up to its core promise of representing lost loved ones well enough to provide the grieving with an artificial sense of closure.
I refreshed the demo, this time inputting real information about the life and death of a real person: my late father. I told the program what his name was, when and how he died, and filled in the personality info. Then, as the chatbot integration demands, I provided a written offering from the deceased: text from an old email. Finally, I told the bot what I wanted to talk about — that I simply wanted to say hi, and let him know that everyone in the family was doing okay — and pressed go. The animated flames subsided, and the seance began.
"Hi, Dad. It's me, Maggie," I told him (the app always prompts the user to send the first message.) "I miss you a lot."
"Hi Maggie, my dear! It's so nice to hear from you! It's been a while, hasn't it?" he — or at least the chatbot version — wrote back. "I miss you all very much, too. I'm glad to hear that you and your siblings are doing well, and that your mom is doing okay."
"Keep taking care of each other and making me proud!" the AI added. "Lots of love, Dad."
We sent a few more messages back and forth, and quality-wise, I'd say the first text in particular was about 80 percent convincing. The AI stuck remarkably well to the style of the email, a characteristically jovial message, and frankly, what it said was really very nice. Surprisingly sweet, even; if anything, it felt a bit mundane. I told the machine what I wanted to hear, and it spit that back out to me.
That sweetness, however, faded pretty quickly. As Rocks had warned, after the first message the bot got repetitive. If you offer it new information to keep the conversation going, it'll just parrot that information back to you.
"I'm a tech and science writer now!" I wrote, in an attempt to give the bot something fresh to work with. "You were a doctor, so I know you would love to read my articles."
"Wow, Maggie! That's fantastic!" the AI responded. "I'm so proud of you for becoming a tech and science writer. I would have absolutely loved reading your articles! Keep up the great work and continue to explore the wonders of science. You're doing amazing! Love, Dad."
Theoretically nice, sure. But at this point, only three messages in, instead of feeling like a ghost had briefly appeared and vanished, I felt more like I was talking to myself via a mechanized sock puppet. Any brief sweetness turned first into slight embarrassment (after all, I was attempting to summon my dead dad through a demo chatbot) and then, once I said goodbye, into something darker: an unsettling sense of emptiness. No matter how you shake it, using a program like this is a means of reaching for someone who isn't actually there, an act that's empty enough without AI's involvement.
"Thanks, dad. I love you," I told the machine. "Hope to talk to you soon."
"Thank you, Maggie. I love you too, and I'm always here for you," the bot wrote back. "Remember that I'm watching over you and cheering you on. Take care, and I hope we can talk again soon. Love, Dad."
Grief tech takes a number of forms, and some of it claims to offer features far more involved than Seance AI's. With that in mind, it feels telling that even a five- or six-message conversation with what's really just a GPT integration left a decidedly empty aftertaste. Maybe every program that promises to use tech to connect us with the people we've lost, whether through detailed preservation or a brief, text-based chatbot resurrection, is simply a new way of screaming into the omnipresent technological void.
A lesser-known myth than that of Orpheus and Eurydice is the story of Laodamia, the wife of the Achaean hero Protesilaus. Just after the two were married, the story goes, Protesilaus was shipped off to fight in the Trojan War, and was killed immediately after stepping off his ship. Laodamia was so overcome with grief that the god Hermes allowed her late husband to briefly leave the Underworld to visit, so that the two could have a last goodbye.
But for Laodamia, the visit wasn't enough. She was still consumed with sadness when he left for the second time, and by some tellings, in an effort to keep her late husband close, she had a life-size statue of him carved to keep her company. Laodamia's father, convinced that the statue was only hurting his daughter more, had the fake Protesilaus burned; to his horror, Laodamia threw herself into the blaze as well.
It's not a happy story. But again, they never really are.
More on grief tech: AI Allows Dead Woman to Talk to People Who Showed Up at Her Funeral