Using A.I. to Talk to the Dead

Dr. Stephenie Lucas Oney is 75, but she still turns to her father for advice. How did he deal with racism, she wonders. How did he succeed when the odds were stacked against him?

The answers are rooted in William Lucas’s experience as a Black man from Harlem who made his living as a police officer, F.B.I. agent and judge. But Dr. Oney doesn’t receive the guidance in person. Her father has been dead for more than a year.

Instead, she listens to the answers, delivered in her father’s voice, on her phone through HereAfter AI, an app powered by artificial intelligence that generates responses based on hours of interviews conducted with him before he died in May 2022.

His voice gives her comfort, but she said she created the profile more for her four children and eight grandchildren.

“I want the children to hear all of those things in his voice,” Dr. Oney, an endocrinologist, said from her home in Grosse Pointe, Mich., “and not from me trying to paraphrase, but to hear it from his point of view, his time and his perspective.”

Some people are turning to A.I. technology as a way to commune with the dead, but its use as part of the mourning process has raised ethical questions while leaving some who have experimented with it unsettled.

HereAfter AI was introduced in 2019, two years after the debut of StoryFile, which produces interactive videos in which subjects appear to make eye contact, breathe and blink as they respond to questions. Both generate answers from responses users gave to prompts like “Tell me about your childhood” and “What’s the greatest challenge you faced?”

Their appeal comes as no surprise to Mark Sample, a professor of digital studies at Davidson College who teaches a course called Death in the Digital Age.

“Whenever there is a new form of technology, there is always this urge to use it to contact the dead,” Mr. Sample said. He noted Thomas Edison’s failed attempt to invent a “spirit phone.”

StoryFile offers a “high-fidelity” version in which someone is interviewed in a studio by a historian, but there is also a version that requires only a laptop and webcam to get started. Stephen Smith, a co-founder, had his mother, Marina Smith, a Holocaust educator, try it out. Her StoryFile avatar fielded questions at her funeral in July.

According to StoryFile, about 5,000 people have made profiles. Among them was the actor Ed Asner, who was interviewed eight weeks before his death in 2021.

The company sent Mr. Asner’s StoryFile to his son Matt Asner, who was stunned to see his father looking at him and appearing to answer questions.

“I was blown away by it,” Matt Asner said. “It was unbelievable to me about how I could have this interaction with my father that was relevant and meaningful, and it was his personality. This man that I really missed, my best friend, was there.”

He played the file at his father’s memorial service. Some people were moved, he said, but others were uncomfortable.

“There were people who found it to be morbid and were creeped out,” Mr. Asner said. “I don’t share in that view,” he added, “but I can understand why they would say that.”

Lynne Nieto also understands. She and her husband, Augie, a founder of Life Fitness, which makes gym equipment, created a StoryFile before his death in February from amyotrophic lateral sclerosis, or A.L.S. They thought they could use it on the website of Augie’s Quest, the nonprofit they founded to raise money for A.L.S. research. Maybe his young grandchildren would want to watch it someday.

Ms. Nieto watched his file for the first time about six months after he died.

“I’m not going to lie, it was a little hard to watch,” she said, adding that it reminded her of their Saturday morning chats and felt a little too “raw.”

Those feelings aren’t uncommon. These products force consumers to face the one thing they are programmed not to think about: mortality.

“People are squeamish about death and loss,” James Vlahos, a co-founder of HereAfter AI, said in an interview. “It could be difficult to sell because people are forced to face a reality they’d rather not engage with.”

HereAfter AI grew out of a chatbot that Mr. Vlahos created of his father before his death from lung cancer in 2017. Mr. Vlahos, a conversational A.I. specialist and journalist who has contributed to The New York Times Magazine, wrote about the experience for Wired and soon began hearing from people asking if he could make them a mombot, a spousebot and so on.

“I was not thinking of it in any commercialized way,” Mr. Vlahos said. “And then it became blindingly obvious: This should be a business.”

As with other A.I. innovations, chatbots created in the likeness of someone who has died raise ethical questions.

Ultimately, it is a matter of consent, said Alex Connock, a senior fellow at the Saïd Business School at Oxford University and the author of “The Media Business and Artificial Intelligence.”

“Like all the ethical lines in A.I., it’s going to come down to permission,” he said. “If you’ve done it knowingly and willingly, I think most of the ethical concerns can be navigated quite easily.”

The effects on survivors are less clear.

Dr. David Spiegel, the associate chair of psychiatry and behavioral sciences at the Stanford School of Medicine, said programs like StoryFile and HereAfter AI could help people grieve, like going through an old photo album.

“The crucial thing is keeping a realistic perspective of what it is that you’re examining — that it’s not that this person is still alive, communicating with you,” he said, “but that you’re revisiting what they left.”