A month after she died at the age of 87, Marina Helen Smith spoke at her own funeral.
Smith, the co-founder of the U.K.’s National Holocaust Centre and Museum, addressed her friends and family last week through a prerecorded video. Yet Smith was able to answer some questions during the memorial service, too. After her son, Stephen Smith, asked what she’d say at her funeral, she delivered a brief speech about her life and spirituality. She also answered questions about loved ones who attended the ceremony, creating the illusion of a real-time conversation.
Smith’s interactive video was made using tech from her son’s startup, Los Angeles-based StoryFile. Launched in 2017, the company lets people create videos that can reply to viewers’ questions, using artificial intelligence (AI) to play relevant video clips as responses. Initially conceived as a way to preserve stories of Holocaust survivors and talk to other historical figures, StoryFile’s videos are now showing up at funerals, CEO Stephen Smith said. After losing his mother, Smith understands why.
“I don't find it in the least bit weird to bring up my mother's StoryFile and talk to her,” he told dot.LA. “It's strange to say that but it feels, actually, very natural.”
StoryFile is part of an emerging tech trend practically pulled from the plot of a sci-fi novel. Tech companies have made tools that let people talk to the dead, from digital memorials to chatbots impersonating the deceased. In addition to helping people mourn and remember loved ones, such programs can serve as educational tools. But experts warn similar tech has been deployed for nefarious purposes, like spreading misinformation.
In June, Amazon showed how its Alexa virtual assistant could read a bedtime story in a dead relative’s voice, based on a short audio recording of the deceased person. It’s unclear whether that capability will become an Alexa feature, as it remains in an “exploratory research” stage, a company spokesperson told dot.LA.
Microsoft has also shown an interest in virtually reviving the dead, patenting a chatbot that pulls data from a person’s social media posts. More recently, the software giant said it would restrict customer use of AI that can impersonate someone’s voice, noting the tech “has exciting potential” but could be used to “deceive listeners.”
Tech bringing the dead back to life has shown up in Hollywood, too, and not just as a storyline for Netflix’s “Black Mirror.” In 2019, Disney included late actress Carrie Fisher in a “Star Wars” film by combining real footage of her face with a completely digital character.
StoryFile’s videos aren’t that creepy. At least not yet. The 40-person startup doesn’t put words in anyone’s mouth or try to create new facial expressions like so-called “deepfake” videos. The StoryFile clips use only prerecorded answers for a limited—but still long—list of possible questions. If you pose a question the subject doesn't have a recorded answer to, they’ll encourage you to ask something else.
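StoryFile hasn’t published how its matching works, but the basic behavior it describes—map an incoming question to the closest prerecorded answer, or fall back to a polite redirect—can be sketched in a few lines. The Python below is a hypothetical illustration, not StoryFile’s code; the clip names, similarity scoring and threshold are invented for the example.

```python
# Hypothetical sketch of routing a viewer's question to a prerecorded clip.
# Not StoryFile's implementation; clip data and scoring are invented.

def tokenize(text: str) -> set[str]:
    """Lowercase a question and split it into a set of words."""
    return set(text.lower().replace("?", "").split())

# Each prerecorded clip is indexed by the question it answers.
CLIPS = {
    "where did you grow up": "clip_childhood.mp4",
    "what would you say at your funeral": "clip_funeral_speech.mp4",
    "tell me about your faith": "clip_spirituality.mp4",
}
FALLBACK = "clip_ask_me_something_else.mp4"  # the "ask me something else" prompt

def pick_clip(question: str, threshold: float = 0.3) -> str:
    """Return the clip whose indexed question best overlaps the viewer's,
    or the fallback clip when nothing is close enough."""
    asked = tokenize(question)
    best_clip, best_score = FALLBACK, 0.0
    for indexed_question, clip in CLIPS.items():
        indexed = tokenize(indexed_question)
        # Jaccard overlap: shared words / total distinct words.
        score = len(asked & indexed) / len(asked | indexed)
        if score > best_score:
            best_clip, best_score = clip, score
    return best_clip if best_score >= threshold else FALLBACK

print(pick_clip("Where did you grow up?"))       # -> clip_childhood.mp4
print(pick_clip("What's your favorite color?"))  # -> fallback clip
```

A production system would presumably rank clips with semantic embeddings or a trained language model rather than word overlap, but the key property is the same: the system only ever plays back answers the subject actually recorded.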
That said, StoryFile’s Smith showed dot.LA demos of its more advanced “digital recreations,” which would let people talk to historical figures like Elvis and Albert Einstein, who obviously were unavailable for interviews. Smith believes such videos could be educational, letting students of the future learn physics from a digital Einstein.
The StoryFile idea can be traced to 2010, when Smith’s wife, Heather Maio-Smith, was creating a historical exhibit about Holocaust survivors. She wanted to create “engaging conversations” that would let people interview survivors rather than simply hear a “linear” oral history, Stephen Smith said. They developed the interactive interviews through a partnership with the University of Southern California. Eventually, the Smiths launched StoryFile to capture stories from historical figures and everyday people. Heather serves as StoryFile’s chief visionary officer.
“How do you communicate best with the past? It's when you ask questions about the past,” Stephen Smith said. “It becomes a relationship with the past, not just that history that's told to you.”
StoryFile’s Conversa AI has been used to create interactive interviews with the still-living likes of actor William Shatner and, more recently, Clarence Jones, the personal counsel of Martin Luther King Jr. The company also has commercial clients using interactive video for customer service or employee training. StoryFiles popping up at funerals, however, was a total surprise, Smith said. Late actor Ed Asner’s memorial notably included such a video.
Other startups see a market opportunity in interactive digital memorials. El Cerrito-based HereAfter AI pairs user photos and audio interviews to similarly let family members talk to recordings of loved ones on their computers, smartphones or smart speakers. Cofounder and CEO James Vlahos got the idea after creating “Dadbot,” a chatbot that shared his father’s life story and personality when he was diagnosed with terminal lung cancer.
Actor William Shatner is interviewed for an interactive video inside StoryFile's L.A. studio. (Photo courtesy of StoryFile)
Some HereAfter AI customers have recorded interviews for up to 10 hours, Vlahos told dot.LA. “Once you get people rolling, talking about their lives, they have a lot to say,” he noted. The market for this kind of product is still relatively small, with only a handful of direct competitors, he said.
Like StoryFile, HereAfter AI doesn’t use its tech to generate answers to questions that weren’t asked during an interview. Vlahos called that a “sensitive area.” On one hand, letting AI form its own responses would make the chat experience more flexible and powerful. On the other, synthesizing what grandpa might have said starts “crossing that line,” Vlahos said.
“We might get it wrong, and that might be really hurtful to someone,” he said. “Or if nothing else, it might be kind of creepy.”
More advanced “deepfakes,” which use AI to create convincing video and audio hoaxes of someone’s likeness, have gained widespread attention and criticism. Recently, a fake clip of Ukrainian President Volodymyr Zelenskyy made it look like he surrendered to Russia. Fraudsters could deploy similar programs to steal someone’s identity, too, experts said. Unlike the living, dead people can’t correct the record if a video is bogus, creating a unique set of ethical and philosophical questions.
Legitimate commercial ventures deploying the tech make sure users know they’re not talking to a real or living person, said Arizona State University professor Subbarao Kambhampati, who teaches computer science. “But the real issue is, what if you don't know? And that can be very easily done,” he added. As such tech becomes more ubiquitous, Kambhampati predicts more people won’t trust their eyes and ears.
“I think in the longer term, we will get used to it. We will no longer just directly trust what we are hearing and seeing,” he said. “But it's the transition that's going to be tricky, because many people can be taken because we still tend to believe what we hear, what we see, so that skepticism has to increase.”
The tech could also be applied in the metaverse, a nascent vision for the internet where we might work, shop and socialize inside 3D virtual environments. Students may one day strap on virtual reality headsets and watch Abraham Lincoln deliver the Gettysburg Address—then ask the president some follow-up questions.
“I think there's a lot of good that will come out of this,” said Majid Abai, founder and CEO of Seena AI, a software and app development firm in Los Angeles. “Now, not only can I see a video, but I can also put a headset on and meet up with a loved one in the metaverse.”
In January, Stephen Smith asked his mother scores of questions over two days to record her StoryFile. She spoke candidly about her upbringing in India and childhood difficulties transitioning to England. Smith said he learned new things about her past, such as when she had measles as a child and was put in quarantine for four months, separated from her parents.
“I didn’t know that story. She never mentioned it,” Smith said. “Things that she'd [previously] not revealed were revealed. I was grateful for that information about her.”