AI After Death: Can AI Ghosts Help People Grieve?
AI has made great strides in replicating humanity in both form and behavior. But just how far is too far?
Today, our host, Carter Considine, explores the emerging world of AI ghosts, digital avatars crafted from personal data, and the ethical quandaries they present. He digs into the intricate processes behind creating these virtual replicas, how they affect the grieving journey, and the second- and third-order consequences AI ghosts introduce.
With diverse cultural rituals shaping how we honor those who've passed, we’re questioning whether AI ghosts can seamlessly fit into these traditions or if they simply risk disrupting sacred customs.
The problems with AI ghosts aren’t just ethical; there are legal challenges to consider as well. Chief among them are the thorny issue of consent, especially on behalf of the deceased, and the lack of laws protecting the personal data used to create AI ghosts.
It’s important in today’s world to talk to our loved ones about including technology preferences in wills. Who truly has the right to decide whether an AI ghost should exist, and is this a path we'd want for ourselves? We’re probably still a ways away from definitive answers, but the hope is that this conversation becomes a powerful first step toward finding them.
Key Topics:
New ways to use AI have taken the world by storm in the last couple of years. One industry gaining momentum involves creating AI ghosts: artificially intelligent replicas of people who have passed away. It’s not just a Black Mirror episode anymore; companies across the globe already offer AI ghost creation as a service.
However, there are a lot of ethical questions that haven’t been explored much yet. Many people have questions about data rights, prolonging grief, and how this technology can be misused. We’ll talk about each of these concerns, but first, here’s some background information on how AI bots for the deceased work.
Generally speaking, to create an AI mimic of someone, data about the person is fed to an AI model, which uses that data to create a digital avatar. The data usually includes audio and video recordings of the person so the model can replicate their voice and image, although the imitation isn’t perfect: the avatar’s facial expressions and gestures are often limited, and some patterns of speech and inflection are lost.
The model also requires information about the person’s life and personality. Some companies have their clients fill out questionnaires about the person’s life, while others are more open-ended and ask for a general description of the person. The more detailed this data is, the better the model can imitate the person, responding with accurate details about their experiences and infusing the responses with some of their personality.
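None of these companies publish their exact pipelines, but at its simplest, the "personality" side of an AI ghost can be thought of as turning questionnaire answers into instructions for a conversational model. Here’s a minimal, hypothetical Python sketch of that step; the PersonaProfile fields and build_persona_prompt function are illustrative assumptions, not any vendor’s actual API:

```python
from dataclasses import dataclass, field

@dataclass
class PersonaProfile:
    """Questionnaire answers about the person being simulated (all fields hypothetical)."""
    name: str
    relationship: str                 # who the user was to them, e.g. "daughter"
    biography: str                    # free-form life summary
    speech_habits: list[str] = field(default_factory=list)
    memories: list[str] = field(default_factory=list)

def build_persona_prompt(profile: PersonaProfile) -> str:
    """Flatten the questionnaire data into instructions for a chat model."""
    habits = "; ".join(profile.speech_habits) or "no notable habits recorded"
    memories = "\n".join(f"- {m}" for m in profile.memories) or "- none recorded"
    return (
        f"You are role-playing {profile.name}, speaking with their {profile.relationship}.\n"
        f"Life summary: {profile.biography}\n"
        f"Speech habits: {habits}\n"
        f"Shared memories you may reference:\n{memories}\n"
        "Stay in character. If asked about something not covered above, "
        "answer vaguely rather than inventing specifics."
    )

# The resulting string would be passed to a conversational model as its
# system prompt; voice and video synthesis are separate layers on top.
profile = PersonaProfile(
    name="Alex",
    relationship="daughter",
    biography="Retired carpenter from Ohio; loved fishing and bad puns.",
    speech_habits=["calls everyone 'kiddo'"],
    memories=["built a treehouse together in 2009"],
)
print(build_persona_prompt(profile))
```

A real service would layer voice and video synthesis on top of this text layer, which is where most of the cost differences described below come from.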
The complexity of the AI ghost depends on the company and how much someone pays for the service. One company offers a one-time, text-based conversation with an AI simulation of a deceased person for as little as $10 and the time it takes to fill out a questionnaire. More sophisticated avatars that mimic a person’s voice and video range from about $150 to thousands of dollars, and prices keep dropping as the technology becomes more refined.
Now that you have an understanding of what AI ghosts are, let’s consider whether this development helps people through the grieving process or whether it causes harm and reopens old wounds.
For some people, the thought of speaking to a simulation of a loved one after they’ve passed away seems like a form of denial, and continuing to interact with an AI bot would only stretch out that denial and prevent someone from moving on from the loss. This argument, though, misses some of the nuance in how people grieve.
One of the most common myths about grieving is that everyone passes through five chronological stages after a loss. In reality, that isn’t true for many people, and the myth leaves some thinking they’re experiencing grief wrong. Grief is different for everyone and can involve complex feelings that evolve over time. There’s no single linear process, so the idea that talking to an AI bot stalls the process may not hold for a lot of people.
There’s also the idea that speaking to the dead as if they’re still alive is a form of denial, which, again, isn’t necessarily the case. In fact, some cultures use speaking to those who have passed on as a way to feel connected with and revere older generations. Some people in China burn incense and have small shrines to people they’ve lost so they will be remembered. Mexican culture includes el Día de los Muertos, or the Day of the Dead, where families celebrate their deceased ancestors once a year.
Speaking to an AI chatbot imitating a loved one is a little different, however. It’s a lot more interactive than speaking to a memorial portrait of someone or leaving a letter on a grave. There isn’t a lot of information on how people experience grief while interacting with the AI bots yet, and it’s possible that some people might develop unhealthy relationships with the AI ghost that negatively impact their lives.
It’s also possible for AI ghost bots to have an impact on people who aren’t using them. The loss of a loved one usually affects a group of people, and if one of them is heavily interacting with an AI bot imitating the person, it might be upsetting for others to hear them talk about the experience.
There is also another serious ethical consideration that can affect entire families and close friends: the matter of who consents to the creation of the AI ghost in the first place. Since this is such a new industry, the approach companies take towards getting consent varies drastically depending on the location of the company, any legal protections that may be in place, and how the individuals running the company decide to handle the question.
Ideally, the person who is being imitated by the bot would be able to provide consent to have their data used. Some companies have been following this approach, including Eternos, a US-based AI-powered legacy platform that spent two months working with a man with a terminal illness to create a simulation of him for his wife. However, things get more complicated when the person the bot is mimicking has already passed away.
Some companies get the consent of every close family member before creating an AI replica of a deceased loved one, while others ask only for the consent of the person planning to interact with the bot.
To make things more complex, the amount and type of data each organization uses varies. Some of the simpler simulations only ask for responses to a questionnaire, which is far less personal than an entire autobiography or extensive audio and video recordings. Determining where the boundary lies for what information is private, and would therefore require more consent, is a tricky problem.
One way to explore the issue is to examine what laws currently protect people’s personal data. The European Union has rules restricting the use of personal data in deepfakes, which would also cover AI simulations of the deceased, but many countries haven’t established legal precedent or passed laws to protect people’s data rights in this situation.
For now, you might consider adding your preferences about using your data to create AI ghosts to your will, and talking to your loved ones about what you’d like them to do when you pass away. This also opens the conversation to discussing their preferences.
So far, we’ve talked about considerations for when this technology is used as intended, but what about when people misuse it? The ability to create an AI replica of a deceased person, one with a similar voice and even a video avatar, has serious potential to cause harm.
One of the first examples that might come to mind is fraud. Someone could use an AI ghost to pose as the deceased and gain access to bank accounts if the institutions involved lack security measures to protect against it. This technology could take identity theft to a new level, and since the person whose identity was stolen isn’t around, the theft could be harder to notice.
Another misuse is harassment. Anyone with access to a bot, or the means to create one, could use it to torment people who were close to the person it imitates, sending messages in that person’s voice or referencing events from their life. This has the potential to cause serious psychological distress.
A longer-term problem is the use of these bots for targeted advertising. If the data behind them were accessible to unethical marketers, it could be used to shape the ads someone sees, whether something subtle, like ads for a place that held significance, or something less subtle, like a simulated loved one’s voice in an ad.
While some of these situations may be unlikely, it’s worth keeping in mind how technology can be misused when deciding how and when to protect people’s data. It ties back to the question of who consents to data being used for AI bots: who is responsible for that decision, and who could be harmed if things don’t go according to plan?
We’ve talked about what AI ghost bots are, and a few different problems that come up with their increased use: whether they are helpful for people experiencing grief, issues around consent and data rights, and how the technology can be misused. So what does the future look like?
As more companies grow around the concept of using AI in the grieving process, the legal landscape will evolve to account for it. The technology may also get incorporated into people’s lives on a large scale; someday it might seem normal to carry AI ghosts of your ancestors in your pocket. The concept behind el Día de los Muertos, celebrating and speaking with your ancestors once a year, could become a reality in some ways.
Generations from now, people could talk to simulations of people who passed away a long time ago to learn about their perspectives and how the world has changed over the years. Imagine a history class where you could talk to the people you are learning about and ask them questions directly. Imagine having that experience accessible to everyone instead of a select few, and not just for history but every subject. People could learn from experts in every field with fewer of the difficulties some people have accessing education.
What do you think? Will AI become a helpful tool for people to feel connected to those they’ve lost? Will it do more harm than good? Who gets to choose if a bot is created of someone after they’ve passed away? Is that something you’d consider for yourself? What do you think the future of AI ghosts could be? One of the best ways to explore a nuanced issue like this is to discuss it with other people, so ask your friends what they think!