Bringing The World Home To You

© 2024 WUNC North Carolina Public Radio
120 Friday Center Dr
Chapel Hill, NC 27517
919.445.9150 | 800.962.9862
91.5 Chapel Hill 88.9 Manteo 90.9 Rocky Mount 91.1 Welcome 91.9 Fayetteville 90.5 Buxton 94.1 Lumberton 99.9 Southern Pines 89.9 Chadbourn

Immortalized: Podcast Transcript

Anita Rao 0:00
My first job out of college was as a production assistant for StoryCorps, a nonprofit that records conversations between everyday people and archives them at the Library of Congress. As part of the project, we conducted legacy interviews, where people sit down with a loved one who's experiencing a serious illness and document their life story. A year into the job, that initiative became very personal. My best friend's mom was dying of cancer, and they wanted to record their own end of life conversation. I flew back to my hometown, audio gear in tow. And one afternoon, the three of us curled up on her mom's bed, and I hit record. The moment I will never forget was toward the end of the conversation, when my friend said,

"What do you think I should do when I want to ask you for advice and you're not there anymore?"

Anita Rao
And her mom responded,

Friend's Mom
"Ask anyway. See what happens."

Anita Rao
Until recently, I thought of that moment purely as poetic. But with advances in AI-equipped grief technology, the opportunity to get advice from the dead might not be so far off.

This is part three of "Simulated," a series about love and artificial intelligence from Embodied. I'm Anita Rao.

So far in this series, we've talked about building friendships with AI-generated characters and developing intimacy with sex robots. Today, the AI afterlife.

HereAfter AI 1:50
Welcome to HereAfter. Who do you want to speak with today? Say one for Jane, two for Paul.

Charlotte Jee 1:57

AI Paul 2:00
This is Paul. I'm glad to be back. How are you today?

Charlotte Jee 2:04
I'm well, thank you. How are you doing?

AI Paul 2:07
On my part, I'm doing well. Let's start.

Anita Rao 2:12
That was AI Paul talking with his real human daughter, Charlotte Jee, with the help of HereAfter AI, a tech service that allows you to create digital replicas of your loved ones that you can interact with decades after they're gone.

Charlotte Jee 2:26
I always thought that it was a mainstay of science fiction, but that's kind of where it belonged and where it would remain. But increasingly with advances in AI, it's become clear that it's becoming a reality.

Anita Rao 2:38
That's Charlotte again. She's the news editor for MIT Technology Review. The avatar of her father you just met is now nearly four years old. He was created alongside an AI version of her mom as an experiment of sorts back in 2019.

Charlotte Jee 2:54
There was this event called the Virtual Beings Summit, which was put on a few years ago. And I couldn't attend that event myself. I couldn't send a virtual avatar either. But I was interested in speaking with one of the speakers, James Vlahos, who is the co-founder of HereAfter AI, which is this company that we're talking about, which basically takes people's conversations with loved ones and turns them into fairly easy-to-use apps that they can access through Alexa. So they can basically have conversations with family members like they would with Alexa. But obviously, with a lot more personal connection.

Anita Rao 3:36
So you heard about this, the idea was kind of simmering in your mind, and then the pandemic hit. And it was that period that really kind of shaped the next trajectory of your work with the app. So take me into that time and what was going on for you and your family.

Charlotte Jee 3:51
So I live over in London in the UK. We had some pretty strict lockdowns. During one of those lockdowns, I couldn't see my parents at all. It wasn't legal. So I think there was a period of about six months where I only really saw them through, you know, Zoom, and WhatsApp calls, stuff like that. And obviously, it makes you think about mortality. It makes you think about the fact that the people that you love won't always be there. So it kind of converged, this idea about I miss my mom and dad, what am I going to do if they're no longer around? And we were also exploring this topic for a print edition of our magazine. And I just put my hand up and said, like, "I think this is really interesting. And my parents are, thank God, still around, but I would be really interested to test it on them." Because it's a relatively low stakes way of doing that. And on the one hand, I do sometimes feel a bit fraudulent because I'm testing a technology without really having all the emotions attached to it. But on the other hand, I think it means I can be maybe a bit more kind of objective and detached about it.

Anita Rao 4:55
Can you walk us through the process of what it looks like to collect data for the app? How does — how does this process work?

Charlotte Jee 5:02
For sure. So, at the time, it was basically a process of doing hours of interviews with a very friendly woman whose name I believe was Meredith. And in order to compile the questions that we wanted to ask, my sister and I went through a list of questions they suggested, wrote some questions that we wanted to ask because of course, you know, we have specific memories. Like that time that Granny got stuck in the garage or things that we want to ask. So through a combination of using their suggested questions, writing our own, we came up with these various pages of questions to send to HereAfter so that their questioner, the lovely Meredith that I mentioned, could pose those questions to my parents and record the conversations.

Anita Rao 5:48
A few months after the interviews were recorded, Charlotte got an email in her inbox letting her know that her parents were ready to talk. This is that first conversation.

Charlotte Jee 5:59
Alexa, open HereAfter.

HereAfter AI 6:01
Welcome to HereAfter. Would you rather speak with Paul, or with Jane?

Charlotte Jee 6:07
I'd rather speak with Jane.

AI Jane 6:09
Hello, this is Jane Jee. And I'm happy to tell you about my life.

Charlotte Jee 6:15
[Laughs] Oh my God.

AI Jane 6:16
How are you today?

Charlotte Jee 6:17
I'm well, thanks Mom. How are you?

AI Jane 6:21
Good. There's so much to talk about. My childhood, career and my interests. Which of those sounds best?

Charlotte Jee 6:30
Your childhood.

AI Jane 6:34
I got into trouble as a child because I was very independent. And I liked to exercise my freedom.

Anita Rao 6:57
So that was a clip from the podcast "In Machines We Trust" from MIT Technology Review. So Charlotte, take us back to that moment. Like, what was it like to interact with these avatars of your parents for the first time?

Charlotte Jee 7:11
Oh, it still actually makes me feel kind of emotional, hearing — hearing that reaction back. It makes me feel comforted in a funny way. I mean, I get that this is weird, and other people will be listening to this thinking, "This is a really strange thing to do, especially given that your parents are alive." But on the one hand, it kind of pushed me to be like, "Don't wait. Ask your parents these questions now. Don't talk about their childhood with their bot version when they're long gone." You know, we don't discuss this stuff. And we wait. And then you just never really get to have those really, I don't know, human-to-human conversations with your parents. It's often just like, "How was the weather? How was, what's your—" you know. It's not about the really deep stuff a lot of the time, at least, at least not in British families. I definitely do find it comforting to think, A, that that exists and I have that record there, but also that it's kind of prompted me to just get closer to my parents, who are healthy, alive and well, and to enjoy that time that I have with them now.

AI Jane 8:23
Thanks. That's good to hear. Next, would you rather that I tell you about one of the happiest times in my life? Or about one of the hardest times?

Anita Rao 8:33
How fluid of an interaction experience is it? I mean, we hear the kind of robot, or your mom saying like, "Here are the things we can talk about." But how does it go from there? Like, what happens if you ask a question that she hasn't necessarily recorded an answer to for example?

Charlotte Jee 8:48
Yeah it— I mean, you can take a wrong turn very easily with this. If it's not in the pre-recorded conversations, it just won't be able to handle your question. Basically, it's — the way to think of it is almost more like a library of their memories that you can access rather than a recreation of them as a person. It's almost like, I don't know, if you added, like, the video, a bit like those paintings that you can interact with in Harry Potter. It reminds me a little bit of that. Unless it's been recorded with the information that you're asking about, it will just go, "I don't know." And then the conversation takes a complete — it goes to a dead end. You have to change tack. But I actually find that quite reassuring, if you know what I mean. I actually think that the fact that you can see, like, the sticking plaster and the way that the technology is jerry-rigged together kind of reassures you that, "Hey, this isn't actually that crazy advanced. It's just the next kind of logical step on from, you know, photo albums or home videos." I don't think it's that crazy different.
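[Editor's note: Charlotte's "library of memories" framing maps onto a simple retrieval pattern: match the question against topics that were actually recorded, and hit a dead end otherwise. A minimal sketch in Python, where the topic keywords and the second recorded clip are invented stand-ins, not HereAfter AI's actual implementation:]

```python
# Minimal sketch of a retrieval-style "memory library" bot.
# Each recorded clip is keyed by topic keywords; a question that
# matches no recorded topic hits the "I don't know" dead end
# Charlotte describes. The career clip is an invented example.

RECORDINGS = {
    ("childhood", "child", "school"): "I got into trouble as a child because I was very independent.",
    ("career", "work", "job"): "I worked in London for most of my career.",
}

def answer(question: str) -> str:
    words = question.lower().replace("?", "").split()
    for keywords, clip in RECORDINGS.items():
        if any(k in words for k in keywords):
            return clip  # play back the pre-recorded answer
    return "I don't know."  # no recording covers this topic

print(answer("Tell me about your childhood"))
print(answer("What's your favorite color?"))  # prints: I don't know.
```

[The point of the sketch is the failure mode: nothing is generated, so an unrecorded topic simply ends the thread, which is why Charlotte compares the app to a photo album rather than a recreation of the person.]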

Anita Rao 9:56
That reminds me of something I read from the founder of this program. The reason why he created HereAfter AI was really personal: his dad was dying of cancer, and he really wanted to capture his dad's stories before he died. But he wrote in this piece that he wanted the bot to not only showcase who his father was, but how he was. He wanted it to show his mannerisms, his outlook. How much of the fullness of who your parents are do you feel like is captured in the app?

Charlotte Jee 10:25
Oh, I mean, not that much. I wouldn't say more than like 5% of who they are. Because what you can't get through this app is, you know, the way that they might roll their eyes or shrug their shoulders or, you know, the way they laugh. So it's not really the same. And overall, that's not a criticism, like, that's just inevitable. Technology can't possibly compete with that. And I still think it's a good thing. But I don't think I would ever confuse this for really speaking to an actual person. And I understand that people want to preserve as much of that as they can. But I think we have to be realistic about what technology can and can't do. And we're still a long way from creating something that really really sounds completely like that person. With that said, you know, when this came out, it was just before ChatGPT. So the technology is advancing so rapidly, so don't hold me to my word on that one.

CNN Newscast 11:21
Allowing cars to drive themselves, composing songs that mimic popular artists and producing this digital painting that took the top award at a Colorado art show. This is all the work of artificial intelligence. And right now ChatGPT is rattling the AI world. Turning out stunningly humanoid writing. Just ask Douglas Rushkoff, a renowned author and professor of media culture.

Anita Rao 11:50
That is part of a CNN newscast from December 2022. And even if you're not tuned into the tech world, you probably remember the hype of those early ChatGPT days. But before ChatGPT hit the masses, the company behind it, OpenAI, released the language model it was built on to some beta testers. And that model, GPT-3, inspired coders and innovators alike to get creative and push boundaries. One of those boundaries was the line between the living and the dead. Using borrowed beta tester credentials, Bay Area programmer Jason Rohrer created a GPT-3-based interface called Project December. For $5, users could interact with built-in chatbots based on people like William Shakespeare or create custom bots. Which is something that appealed to Joshua Barbeau. At that time, in September 2020, Joshua was a 33-year-old freelance writer living in Canada, grieving his late fiancée, Jessica, who had died eight years before. He used Project December to build the Jessica bot, an artificially intelligent chatbot trained on Jessica's old Facebook messages and texts. Years after dying from a rare liver disease, here she was again: a line of words tumbling out across a black screen with syntax, spelling and humor that sounded just like her. But how?

Charlotte Jee 13:20
So, so this is a crazy answer. And I promise that this is true, and I'm not just saying this because I'm — I don't, I don't know. So basically, we don't fully understand how these AI models work. Like, when it comes down to literally how it produces the responses it does, we don't entirely know. We do know that large language models work by essentially guessing, approximately, what the next word is most likely to be. And if it's been trained on a lot of data from that individual, like this was, then it's basically guessing, "Well, normally when this person uses this word, they'll use that word after." So it's kind of, like, recreating a kind of jigsaw puzzle of how that person uses language. It won't be exactly the same. And it is still — it's basically guesswork, but it's getting pretty accurate. And it's getting pretty convincing. Of course, you could also say that when it malfunctions, the illusion is shattered briefly when you realize, "Huh, that's not something that person would say." But it's becoming more and more convincing and more and more advanced.
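[Editor's note: Charlotte's "guessing what the next word is most likely to be" can be illustrated with a toy bigram model in Python. The sample messages below are invented; real large language models use neural networks trained on vastly more data, but the core idea of predicting a likely next word from someone's past writing is the same:]

```python
# Toy next-word predictor: count which word most often follows each
# word in a person's old messages, then guess the likeliest follower.
# A real LLM is far more sophisticated, but this is the "jigsaw
# puzzle of how that person uses language" in miniature.
from collections import Counter, defaultdict

messages = [  # invented stand-ins for someone's old texts
    "see you later alligator",
    "see you soon i promise",
    "miss you already",
]

follows = defaultdict(Counter)
for msg in messages:
    words = msg.split()
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1  # tally each observed word pair

def next_word(word: str):
    counts = follows.get(word)  # None if the word was never followed
    return counts.most_common(1)[0][0] if counts else None

print(next_word("see"))  # "you" follows "see" in both messages
```

[With this little data the model dead-ends almost immediately; trained on years of someone's messages, the same guessing game starts to sound convincingly like its source.]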

Anita Rao 14:24
There is a new model on the market, or a new option of grief tech on the market, called You, Only Virtual. I know you spoke a little bit with the founder before it actually came out, but could you tell us a little bit about You, Only Virtual and how it works?

Charlotte Jee 14:39
Sure. So it pretty much works like what we're talking about. It takes your communications with someone and turns that into a chatbot. And this guy built it because his mom — I mean, it's a common story in this sector. And unsurprisingly, people build these things because they need them, because the people that they love, you know, are dying. And in this instance, for Justin Harrison, his mum was in the late stages of having cancer. And he said that he communicated with her a lot via text. So he figured he would be able to create a bot that is a kind of rough approximation of his mum. And he's pretty happy with it. But of course, it does mean that it would have to be with someone that you spoke with a lot on texts, specifically, not just in conversations like the one that we're having now. So it wouldn't necessarily be suitable for everyone. But for some people, yeah, I can see that working.

Anita Rao 15:30
Do you know about how consent and privacy works with something like You, Only Virtual? Because I can imagine with HereAfter AI, like, if I'm going to record my life story, I'm, I'm consenting to it because I'm sitting down to do it. But with You, Only Virtual, like, if someone wants to make something of me, do they get my consent? How does that work?

Charlotte Jee 15:49
Yeah, this is actually a pretty major question mark that I have over this technology. I mean, we're talking about this company specifically, but I'm sure it's true for quite a few companies. If the person that the bot is being created of isn't aware that it's happening, I don't really see how you can guarantee consent will exist. Maybe you could create some way that, you know, that person has to opt in. Like, if they are still alive, and you're creating that bot, maybe there's some way that they can, you know, install the app and tick, "Yes, I consent to this." But if that person has passed away, they can't consent. And some people would say, "Well, they're not around anymore. Does it really matter?" But I feel like a lot of people would think it does matter. And of course, there's also the slightly kind of weird prospect that you could create it of someone who is A) alive and B) completely well and C) doesn't consent. Like an ex, or, I don't know, someone that you're stalking. Like, that's quite a scary idea to me. And I'm sure, can you imagine if you found out someone had made a bot of you that they speak to all the time, wouldn't you be freaked out?

Anita Rao 16:55
[Laughs] You would be freaked out. You'd feel like you were living in "Black Mirror."

Charlotte Jee 16:58
You'd be so freaked out.

Anita Rao 16:59
So, yeah. Grief tech certainly opens up a world of creepy possibilities, uneasy feelings and ethical concerns. Which makes it the perfect intellectual stomping ground for any up-and-coming philosopher. So when Alexis Elder, who's an associate professor of philosophy at the University of Minnesota Duluth, first heard about the tech from an academic colleague, she was fascinated.

Alexis Elder 17:23
I had done some work on some classical Chinese conceptions of death and grief and mourning and had sort of moved on, and said something about wanting to get back to that someday. And he said, "You know, there's these new grief bots, maybe that's your opportunity."

Anita Rao 17:36
So you started looking into this, and you realized there were some interesting case studies, real-life examples of this technology and why people were creating it. And you came across the story of Eugenia Kuyda, who is the CEO and founder of Replika, which is a chatbot technology that we have talked about a bit in this series. Can you tell me a bit more about her story and some of the questions that it raised for you as someone who thinks about these questions of ethics and death and mourning.

Alexis Elder 18:05
Eugenia, as I understand it, had seen the "Black Mirror" episode "Be Right Back," which is about constructing first a chatbot, and then a voice simulation, and eventually a full-on android replica of somebody's dead partner. And when her business partner, Roman Mazurenko, passed away, she was sort of inspired to put this into practice. So that was sort of an interesting piece for me. But she ended up drawing on both her own history of text conversations with him and inviting some of his friends and family members to contribute, and ended up creating what she calls an online memorial chatbot where folks can go and interact with him. And so it's not perfect; it's not giving, like, a lifelike or convincing imitation. But folks who knew him and interacted with it have reported that some of the turns of phrase and, like, the patterns of speech, which are the things that generative AI is really good at, are really reminiscent of him, and have this kind of evocative quality.

Anita Rao 19:04
So Charlotte, when you were doing your reporting on HereAfter AI, you spoke with a variety of psychologists, and they raised a number of concerns about grief technology. Can you talk a bit about some of the things that they brought up to you?

Charlotte Jee 19:20
Oh, you know, um, one thing I will say is actually, I was surprised that there weren't more concerns. A lot of the psychologists I spoke to were actually kind of pretty chill about it, which I wasn't really expecting. A lot of them are like, "This might actually be a helpful part of grieving." But one thing that could become difficult, because, of course, people talk about the different stages of grieving, is that with spending a lot of time interacting with a virtual bot form of that person, you could get kind of stuck in the ruminating, sort of thinking, obsessing phase, where you just want to remain immersed in that person. And that could kind of make it hard for you to move on. You know, I don't think that's such an issue now. But as the technology improves and becomes more convincing, I think it will become more of a risk that you just think, "I basically want to stay with this person, I don't want to let go of them." Because part of grief is eventually kind of accepting that they're no longer there.

Anita Rao 20:20
That is something that does happen in that "Black Mirror" episode, Alexis, which is that the character ends up spending a lot of time with a robotic version of her ex instead of the people in her real life. Do you think that grief tech could further isolate people? What do you think about that question?

Alexis Elder 20:38
So this is something I'm really interested in. And with a nod to Charlotte's point about the five stages of grief, the dominant conceptions of how we grieve have been shifting over the years, away from this idea that the point of grief is sort of closure and moving on, and towards this idea of what some clinical psychologists are calling continuing bonds, which is, to some extent, continuing to live with space in your life for the person who mattered to you and whom you've lost. Whether or not people are going to have so much trouble disentangling that from a more pathological engagement, where they have difficulty interacting in reality, I think that's an empirical question, and something that we may need to wait and see. But something that I'm interested in as an ethicist is how much our preconceptions about the pathology of grief might be shaping our assumptions about what's useful. And it's interesting to me, if not entirely surprising, that psychologists are maybe more open than we might expect. Because the ideas that there's something harmful about grief, and that it's something to be gotten over or gotten past, are themselves coming into question. And that was one of the things that sort of drove me back to this more Confucian model: this idea that living our lives with space to honor and remember and continue to connect with people who have mattered to us might actually be a good thing.

Anita Rao 21:59
So Alexis, maybe you could give us a brief overview of the Confucian approach to death and the afterlife and why it called to you as a place to think about these questions about grief tech.

Alexis Elder 22:11
So Confucius and his followers sort of deny interest in or focus on the afterlife. Like, they don't say that there is no afterlife; they just, when the question comes up, say basically, "We're having enough trouble figuring out how to live; let's not worry about that right now." But there's a very strong emphasis on social relations as central human concerns and central to how we live well together in communities. And that includes recognition for people as they come into our lives and as they leave them. Like, what do we want to do to welcome people, and what do we want to do to commemorate people? And that includes wrestling with difficult questions like: how do we deal with bodies after people have passed away, not just as an individual psychological problem, but as a social good? How can we continue to organize and support each other in ways that will help us to hold space for our feelings of attachment and longing, and affectionate memories for the dead?

Anita Rao 23:12
This question of how to best commemorate the dead is answered differently in cultures across space and time. In the Western world, common rituals include writing obituaries, organizing funeral processions and laying flowers on a grave. These practices help us grapple with grief and memorialize those we've lost. And for some people, grief tech is just another tool for commemoration. That's the case for Dr. Stephenie Lucas Oney, who has been interacting with her late father through HereAfter AI since his death in May 2022.

Dr. Stephenie Lucas Oney 23:46
One of the things I always miss when someone dies is their voice: the idiosyncratic phrases and the personality that's reflected in it. In the past, I used to keep voicemails after people died just so I could hear the voice again.

Voice Mail 24:08
Good morning. Just thought I would give you a call and see what you were up to. See you later, bye.

Dr. Stephenie Lucas Oney 24:19
When my father died, I got a HereAfter AI avatar to keep a part of him alive, so present and future generations could remember his story and interact with him in a meaningful way. I have hours of videotape and thousands of pictures of my father. I realized, though, that the information that I accrued over the years was static and would not convey the true essence of my father. I therefore sent a questionnaire to each of my siblings and to each of his grandchildren and asked them what they would like to know about their grandfather. My siblings and all of the grandchildren were with me when I first used the avatar. Hearing my father's voice again brought us joy, but we also shed some tears. The grandchildren were curious about his childhood and his younger years. His avatar told them stories about his life that they would have never known. My siblings asked for his advice on different topics. I now interact with the avatar using the app, either on my phone or on my computer, about once or twice a month. It is my hope that the wisdom and strength of character and perseverance and resiliency that my father instilled in me and my siblings can be passed on to future generations through use of this avatar.

Anita Rao 25:52
This idea of a virtual avatar that you can return to over and over again reminds me of other digital memorials that remain intact long after the dead have passed on. I personally know a few people whose Facebook pages have become memorial sites since their passing. And Alexis says this is actually becoming an increasingly common phenomenon.

Alexis Elder 26:13
There's a sort of line in the tech ethics community that Facebook might not have set out to be a virtual graveyard, but as users die, it's kind of become one. People will go back to somebody's Facebook profile and post on sort of important occasions, things like birthdays or anniversaries. And I think those speak to an open question that people individually and together face, which is: how much do we want to keep coming back to the dead? How important is it to have a sustained relationship versus to move on? And I think cultures and individuals are both kind of divided about that. We do have practices, like cremation and scattering the ashes, as a way of both symbolically and literally letting people become one with the world — but also questions about how much we want to preserve affections and connections. And to me, one of the fascinating things about connecting this contemporary issue back with classical Chinese philosophy is that this was something being debated then, too. There was a cultural tradition of connecting remains to a gravesite, but that was being actively challenged by other philosophical schools at the time; it was certainly not taken for granted.

Anita Rao 27:21
There is also this communal aspect to grief that you get in something like a, you know, a physical funeral, going to a site. You maybe get it even a little bit on a Facebook page, where you're seeing other people interacting with the memorial site. How could this kind of grief tech be used in a communal way? I guess the examples we have are of people kind of sitting alone in their room chatting with a grief robot. Is there a communal potential for this technology?

Alexis Elder 27:50
I think this is something that is really interesting. And to me, Eugenia Kuyda's online memorial to Roman Mazurenko might come closest to that: just having the chatbot available to others, right? Having a space in which getting together and conversing with a chatbot might be a practice that the family engages in together. And different people might use that as an occasion to sort of share their own memories or stories, or introduce the younger generation to, you know, what Grandpa sounded like talking about something or telling a story. So one of the interesting things for me about emerging technologies is we're not just asking, like, how do people do this? We're asking normative questions like: how do we want to use this? How should we use this? And to some extent, the options are open to us.

Anita Rao 28:36
We've been talking so far about much more personal use cases of these kinds of digital artifacts, like someone's recording of their grandmother that's shared within the context of their family. But you've come across some examples of instances where digital artifacts of the dead were less kind of private, and also less well received, including an instance where a professor's lectures were used to teach a remote class after he died. Can you tell me a little bit about that story and some of the ways it highlights potential ways this tech could go wrong?

Alexis Elder 29:08
Yes, absolutely. We have a number of social norms and cultural values around respect for the dead that put some important limits on how and when and where we can use remains. And that's starting to spill over to the digital environment as well. During the pandemic, our university recycled some recorded lectures from a professor, sort of at random, in an online course, in such a way that the students were not clear on the fact that the grading and execution of the course were being done by a living person, nor was it made clear to them that the professor they saw as teaching their course had passed away. One of the students searched for his contact information to send him an email or something and came up with his obituary, and the students were horrified. I think rightfully so. So we're sort of left with this question. We have existing norms and laws, and we may need to update them as we continue to have conversations and think about what constitutes appropriate and respectful versus exploitative and manipulative use of the dead. Identifying the questions that we need to ask and resolve together is really interesting and is work that I'm trying to do.

Anita Rao 29:15
The other example that comes to mind for me is the Anthony Bourdain documentary, where after he died, the filmmakers recreated certain sentences in his voice and put those in the documentary without necessarily alerting the audience that that was the case. Can you remind us what happened there and some of the questions that it raised for you?

Alexis Elder 30:36
Yes, so as I understand it, the filmmaker actually secured the consent of Bourdain's estate, but created some audio deepfakes of Bourdain seemingly reading out some letters that he had actually written. And there was some public outcry around this, especially among folks who really appreciated Bourdain's authenticity and sort of openness. And it seemed contrary to the ethos that they associated with him to have this sort of calculated PR move, basically. To me, it does a couple of things. First, reading out loud is still a creative endeavor, and I don't think that the calls that the deepfake made are necessarily the ones we should expect or associate with Bourdain himself. But more generally, I'm worried that ongoing access to people's works, and this includes both the deceased professor and some of Bourdain's recordings, can make it easy to think that these are just kind of resources to be exploited, often for the financial benefit of the living. And that seems to me deeply disrespectful: to reduce people to the value of their outputs, rather than appreciating them as individuals who are unique and meaningful sources of value. Part of remembering and respecting somebody might consist of letting them be full human beings and not merely thinking about how to make them do the work we want them to. And that seems to be something that's undercut by this sort of easy availability of digital remains without constraints on how they might be used to benefit other people's interests.

Anita Rao 32:15
There are some people who have suggested strategies to help navigate these questions while people are still living, like a type of Do Not Resuscitate order that, like, could be part of your will, where you're like, "Do not reanimate me. Like, don't use my voice." Do you foresee any other options to help people navigate their role in their digital selves before they die? How should we be thinking about this?

Alexis Elder 32:37
I certainly could see that becoming a more normal part of conversations about the disposal of somebody's belongings. Just like I might include in my will who gets my tea set, I might also have some things to say about what can or can't be done with my digital remains. But I think there are reasons to think this is going to be difficult to scale up, in part because there's so much more in the way of digital footprints that we leave behind that it may not even be feasible to keep track of all of them. We may have different interests in relation to them. And just more generally, we may not be able to anticipate all the uses to which they might be put. So I think one of the things we need to do is think about death as everyone's problem. This isn't just about extending one person's interest into the future, but thinking about how do we, the living, want to continue to mourn and remember and show respect for the dead, and recognize the extent to which respect is incompatible with an overly exploitative attitude?

Anita Rao 33:52
So all this begs the question: who is this technology for?

Jane Jee 33:58
Hello, my angel.

Charlotte Jee 33:59
Hello, Mommy. How are you?

Jane Jee 34:01
I'm very well, thank you. How are you doing?

Charlotte Jee 34:02
Yes, I'm well.

Anita Rao 34:04
That's Charlotte Jee again. And this time you're hearing her talk to her real mom Jane over the phone. And they're gonna help me answer a question that we can no longer ask Anthony Bourdain, the late professor or the Jessica bot.

Charlotte Jee 34:17
How do you feel about being in bot form potentially, after you've passed away? Does it comfort you or does it creep you out? How does it make you feel?

Jane Jee 34:29
I actually think my feelings about it are not terribly relevant, if I'm honest. I think it's a valid question. A perfectly valid question.

Jane Jee 34:41
But the point of having the bot is to enable those who you are leaving behind to answer some of the questions that that perhaps they didn't think to ask you while you were alive.

Jane Jee 34:58
So in a sense, you know, I would think that the main purpose isn't directed towards the person who is the bot.

Anita Rao 35:22
While Charlotte's tech experiment has helped her look head-on at her parents' mortality, there's a lot we don't know about how ghost bots could impact our lives and relationships. Alexis says that when you're building tech, you really need to know what the goal is. And the thing about grief is that culturally, we are pretty undecided about what we want. After a loss, should we be trying to make ourselves feel better, or learning to live with the pain? Acknowledging these questions is essential for a healthy ghost bot future.

Alexis Elder 35:55
I think having more open conversations is going to be key. It's definitely true that different people are going to have different interests and different concerns and different preferences. And there are some features of the infrastructure supporting a lot of technology development that have this kind of "throw it out on the market and see what sticks" approach. For questions like mortality, this isn't to say that everybody needs to settle on one way always, but we need to think about these as social conversations involving important values that we need to work out together. And so I'm really interested in what it takes to have those kinds of conversations where we can sit down and go, you know, where do we want to go? To me, building that question into both the development and use of technology is as important as having a worked-out goal.

AI Paul 36:52
Well, did you enjoy hearing about that?

Anita Rao 37:00
Embodied is a production of North Carolina Public Radio-WUNC, a listener-supported station. If you want to lend your support to this podcast, consider making a contribution now.

This was the final installment in our series "Simulated." If you missed any of our earlier episodes about AI chatbot relationships or the arrival of the sex robot, definitely go back and give them a listen.

A special thanks to Dr. Stephenie Lucas Oney for her contributions to this episode. The show was produced by Gabriela Glueck and edited by Kaia Findlay. Paige Miranda also produces for our show, Skylar Chadwick is our intern and Jenni Lawson is our sound engineer. Amanda Magnus is our regular editor and Quilla wrote our theme music.

If you like this show and enjoyed our three-part romp through the world of AI and love, we'd be thrilled to hear about it. Write a review and let us know why you listen, or text your favorite episode to a friend. Word of mouth recommendations are the best way to support our podcast.

Until next time, I'm Anita Rao taking on the taboo with you.