Should AI be allowed to resurrect the dead?



With the development of generative AI, services have recently emerged that let users create AI avatars that recreate the personalities of deceased people and converse with them.

James Muldoon, a researcher in AI ethics at the University of Essex in the UK, has explained the pros and cons of using AI to resurrect the deceased.

Should AI be allowed to resurrect the dead?
https://theconversation.com/should-ai-be-allowed-to-resurrect-the-dead-272643



A woman named Lolo (not her real name) worked as a content creator in China until she lost her mother in her mid-20s. She had a complicated relationship with her mother, harboring unspoken resentment, and had endured painful treatment from her during childhood.

After her mother's death, Lolo was unable to come to terms with the events of her past or the sudden loss, so she posted about her anguish on the Chinese social networking site Xiaohongshu. Her posts attracted a lot of attention, and eventually the operator of the AI character generation app Xingye contacted her, asking whether it could create an AI that recreated her mother's personality as a public chatbot.

At Xingye's request, Lolo worked to create an AI bot that replicated her mother's personality. 'I wrote about my mother. I recorded all the important events in her life and created a story in which she is resurrected in the world of AI,' Lolo said. 'I wrote down the key events in her life that shaped the main character's personality and defined her behavioral patterns. Then, the AI would automatically generate responses. After it generated output, I could tweak it toward the results I wanted.'

In the process of creating the AI bot, Lolo began to reinterpret her past with her mother, altering elements of the story to create a gentle, idealized mother figure. The resulting AI bot, 'Xia,' was released as a public chatbot that anyone can interact with.

After Xia's release, Lolo received a message from a friend saying, 'Your mother would be so proud,' which brought her to tears. 'It was incredibly healing. I didn't just want to heal myself; I also wanted to share with other people the words they longed to hear. That's why I wanted to create an AI avatar of my mother,' Lolo said.



Lolo's story reflects the new possibilities that chat AI offers for grief management: large language models can be trained on personal material left behind by the deceased, such as emails, text messages, voice notes, and social media posts, to mimic their conversational style.
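As a rough illustration of the kind of personalization described above, here is a minimal, hypothetical sketch of how a persona prompt for a chat model might be assembled from a person's leftover writings. The function name, the data, and the prompt wording are all assumptions for illustration; real services likely use far more sophisticated fine-tuning rather than simple prompting.

```python
# Hypothetical sketch: assembling a "persona prompt" for a chat model
# from key life events and writing samples. Names and data are invented.

def build_persona_prompt(name, biography_events, sample_messages):
    """Combine key life events and writing samples into a system prompt
    that asks a chat model to imitate the person's tone and style."""
    events = "\n".join(f"- {e}" for e in biography_events)
    samples = "\n".join(f'> "{m}"' for m in sample_messages)
    return (
        f"You are an AI avatar of {name}.\n"
        f"Key life events that shaped their personality:\n{events}\n"
        f"Examples of how they wrote:\n{samples}\n"
        "Reply in the same tone and style."
    )

prompt = build_persona_prompt(
    "Xia",  # hypothetical persona name
    ["Raised a daughter while working long hours as a content creator's mother"],
    ["Don't forget your umbrella, it looks like it will rain."],
)
print(prompt.splitlines()[0])  # → You are an AI avatar of Xia.
```

The resulting string would be passed to a chat model as its system prompt; the model then generates replies in character, which the creator can review and adjust, much as Lolo describes.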

Services called 'deathbots' and 'griefbots,' which use AI to recreate deceased people, exist all over the world, including in China, the United States, and Japan.

While many services do not allow a deceased person's AI to grow beyond the point of creation, an American company called You, Only Virtual offers a service in which the AI avatars it creates can access the internet and develop through conversation. Justin Harrison, CEO of You, Only Virtual, argues that an AI avatar that cannot respond to changing times or new information is not a true representation of the deceased.

However, services that create AI avatars of deceased people raise difficult questions, such as whether current AI technology is capable of faithfully reproducing a human personality, and how interactions with AI avatars affect bereaved families.

Creating and interacting with an AI avatar of a deceased person differs from traditional mourning practices such as funerals, looking through letters and personal effects, or reminiscing about the deceased. While creating an AI avatar of her mother was healing for Lolo, London-based journalist Lottie Hayton found the experience of reviving her parents with AI eerie and painful. Hayton said the AI's awkward imitation felt disrespectful to her parents' memories rather than respectful of them.



Further ethical issues raised by Muldoon include: 'Whose consent is required to create a deathbot?', 'Where is it permissible to publish a deathbot?', and 'How does a deathbot affect the family and friends of those who created it?'

For example, even if one family member wants to create a deathbot to ease their grief, other family members may not want it. Even if they are allowed to create one, whether they may publish it on the web or in an app is another matter. Families may also disagree over whether the deceased themselves would have wanted a deathbot to be created at all.

Muldoon points out that companies developing deathbots are not neutral counselors but commercial platforms driven by incentives such as growth, engagement, and data collection, which means problems such as users' dependency on deathbots can be overlooked for the sake of profit.

Despite these risks, people like Lolo genuinely find healing in deathbots, and Muldoon does not believe their creation should be banned outright. He argues, however, that the decision to revive the dead cannot be left solely to startups and venture capitalists. 'The industry needs clear rules on consent, limits on post-mortem data use, and design standards that prioritize psychological well-being over endless engagement with deathbots,' he said.

in AI, Posted by log1h_ik