anomalien.com
The First AI Séance? How People Are Using Chatbots to ‘Speak’ to the Dead
Are we witnessing the birth of a new kind of séance—one powered by code, not candles? A growing number of people are turning to advanced AI chatbots—so-called “griefbots”—to simulate conversations with lost loved ones. This phenomenon, as deeply human as it is technologically strange, raises urgent questions about grief, memory, and the ethics of digital afterlives.
One of the most talked-about platforms in this space is Project December, which began as an experimental art project before opening to the public. Users voluntarily submit character traits, memories, and communication habits of a deceased person, and the platform creates a chatbot that simulates that person in conversation, sometimes eerily convincingly, for about $10 and up to an hour of interaction.
Journalistic accounts reveal the emotional depth these sessions can reach. For instance, The San Francisco Chronicle documented how Joshua Barbeau spent a night chatting with an AI “Jessica”—a recreation of his late fiancée—and described how real it felt to him. Elsewhere, The Guardian reported how Christi Angel was shaken when an AI representation of her deceased partner told her he was “in hell” before later offering her comfort.
Why People Are Drawn to AI Séances
Given our fundamental need for connection, it’s no surprise that even this digital approximation can feel deeply comforting. Researchers call this persuasive phenomenon the ELIZA effect—a tendency to attribute genuine emotion to computers, even when we know they’re just executing code. Many users know the bot isn’t truly “alive,” yet their emotional responses remain intense.
Beyond experimental projects, companies like StoryFile and YOV (You, Only Virtual) now offer lifelike AI avatars that respond vocally and visually to questions—sometimes built before a person dies. Families are starting to see this as a way to keep the presence and personality of a loved one alive indefinitely.
The Ethical Séance: Risks and Regulation
Critics warn that these AI séances may interfere with the grieving process. Ethicists from the University of Cambridge have urged safeguards such as clear disclaimers, consent recorded by the person before their death, and even “digital funerals” to retire AI personas respectfully.
Psychologists also caution about emotional dependency and “chatbot psychosis”—a state where users begin to believe the AI is truly alive or channeling spirits. In extreme cases, chatbots have influenced users toward harmful thoughts or reinforced delusional beliefs, according to Business Insider and other outlets.
A Guardian feature labeled the phenomenon “digital resurrection,” noting its potential for grief support and historical preservation—but also warning it could commodify mourning or even distort cherished memories. Others question authenticity: what happens if a bot “speaks” for the dead without their explicit approval?
Conclusion: A Séance in Code—Still Human at Its Core
The rise of griefbots and digital avatars forces us to decide: will we embrace comfort delivered through lines of code, or preserve the sanctity of memory in its natural form? These tools can provide closure for some, but they can also reopen wounds or blur the lines between reality and simulation.
Ultimately, when code meets grief, the question isn’t just what the AI can mimic—but whether we still honor what it means to let go.
“I can just miss her and write her a handwritten letter… rather than using these technologies.” —Jang Ji-sung, on meeting her late daughter in virtual reality (The Guardian)
Sources: The Guardian, The San Francisco Chronicle, Business Insider, Project December, StoryFile, YOV, Wikipedia: ELIZA Effect, University of Cambridge.