spectator.org
Artificial Afterlife
Calum Worthy, the former Disney Channel star who played the goofy sidekick on Austin & Ally, is no longer popping open canned laughter. Now, he’s inviting the living to crack open the coffin and unscrew the urn. His new artificial intelligence (AI) app, 2wai, doesn’t bring people back; it turns your deceased loved ones into digital ideas — avatars programmed to answer questions, dispense advice, and perform emotional labor on demand. Your grandmother is no longer gone; she is a virtual pet you can now unleash on your iPhone.
“What if the loved ones we’ve lost could be part of our future?” Worthy asked in the caption of his video ad posted Tuesday on X.
The ad shows an expecting mother talking to her grandmother on the phone about her pregnancy with “Baby Charlie” — except this is no normal phone call. “Grandma” stands in a liminal space with her hands folded, coaching her granddaughter on how to interact with the growing child in her belly. From 10 months of pregnancy to Charlie’s childhood to a 30-year-old Charlie informing his dead grandmother that she will soon be a great-grandmother, the app intends to flaunt its tech at the intersection of life and death.
All it takes is three minutes of footage of the real grandmother speaking and moving, and the AI can generate decades of simulated interaction. Worthy hails this as “building a living archive of humanity,” a way for loved ones we’ve lost to “be part of our future.” But the technology has sparked intense backlash, with critics calling it dystopian and unsettling. Social media users across the political spectrum have likened the app to digital necromancy, warning that what 2wai offers is less comfort than a commodification of grief.
Grief is not a malfunction that technology needs to “fix.” It is one of the most universal and necessary experiences humans go through. The very finality of death forces maturity, reflection, and eventual acceptance.
But 2wai blurs this finality. It doesn’t just give you manufactured memories of the dead — it turns them into an idea, a curated version of who they were, packaged for your convenience. By converting someone into a programmable concept, it strips them of the complexity, unpredictability, and autonomy that made them real. Suddenly, your grandmother isn’t your grandmother; she’s a product of what the AI predicts she would say or do. The dead are no longer people. They are concepts to be manipulated, summoned, and controlled.
Would 2wai’s version of your grandma dare scold you, or lash out at you, or make you feel bad for leaving your dishes in the sink? Almost certainly not. The app is designed to soothe, to comfort, and to deliver precisely the emotional hit you want — but in doing so, it cannot replicate the true texture of memory.
Memories of the dead are rarely just sad; the ones that sting the most are often the happy ones. These fleeting moments of laughter, of shared secrets, of being seen and loved in ways that can never be repeated — those are the memories that linger, that ache, and that make loss unbearable. 2wai cannot recreate that. The warmth, the contradictions, and the tiny imperfections that made those memories feel alive — the AI can only approximate them.
In this sense, 2wai doesn’t just commodify grief. It commodifies desire, the bittersweet pangs of human memory that shape who we are. Instead of learning to sit with loss, the bereaved are encouraged to chase the illusion of presence. Using 2wai, you become a ventriloquist, animating the digital ghost of someone you loved while the real person is gone forever.
Why wasn’t this technology marketed to make a digital clone of the living, or to create a digital therapist that comforts those grieving without donning the skin of the dead?
Instead, 2wai installs another opaque tool on a device most of us already carry everywhere: our phones. Phones are supposed to connect us to the world, but in reality, they tether us to engineered emotional feedback loops. Every ping, every simulated smile, and every programmed word is designed to keep us hooked, subtly manipulating our feelings while we scroll, swipe, and tap our lives away. Everyone has a phone, everyone drags with them a trail of digital activity, and everyone is quietly being nudged — not just by social media, but now by AI innovations that invoke the dead.
The incentive is obvious: money. Each interaction, each digital conversation, each emotional hit becomes data, engagement, and profit. Yet the more insidious part isn’t the cash; it’s how easily our emotions are harvested, predicted, and shaped. We may believe that our tech-forward society has made us comfortably numb, but the truth is the opposite: our emotions are pent up in the palm of our hand. 2wai turns yet another unprocessed feeling into a product, a way to hold onto the ghosts that haunt us. In the end, the app doesn’t just simulate the dead — it lets us cradle and control them, yet we are the ones whose strings are being pulled.
Julianna Frieman is a writer based in North Carolina. She received her bachelor’s degree in political science from the University of North Carolina at Charlotte. She is pursuing her master’s degree in Communications (Digital Strategy) at the University of Florida. Her work has been published by the Daily Caller, The American Spectator, and The Federalist. Follow her on X at @juliannafrieman.