The downside of using AI to 'revive' the dead

When his uncle died suddenly while working away from home, Yang hired someone to "revive" him so he could talk to Yang's 90-year-old grandmother.

Yang's grandmother was in poor health, so no one wanted to tell her of the death, fearing the shock would be too much for her. Yang, a 30-year-old Internet industry professional in Nanjing, spent 10,000 yuan to have AI create a virtual version of his uncle.

During the call, Yang's mother and other relatives kept their distance, afraid they would not be able to contain their emotions. Yang asked the service team to keep the conversation brief so the ruse would not be exposed.

Yang's grandmother did not notice anything unusual. "Because I have some understanding of AI technology, I can tell it apart. But someone who knows nothing about the technology can easily be taken in," he said.

But the practice of using AI to "revive" the deceased raises many ethical questions, and Yang said there is no satisfactory answer. As someone in the industry, he accepts it easily; for outsiders and the elderly, it is harder to accept.

Yang's grandmother on an AI video call with his "uncle". (Photo: Sixthtone)

Shen Yang, a professor of journalism and communication at Tsinghua University, said AI companionship can help maintain stability in family life. "If an AI version of a deceased person is created, their loved ones can still believe they are alive and feel they still have a companion in their life," Shen said.

But the professor also warned of potential ethical issues, such as whether people would consent, before death, to an AI version of themselves existing. "People should clearly state their wishes before they die. In the future, we may need to clarify these issues in civil law," Shen said.

The person Yang hired was Zhang Zewei, founder of Super Brain, an AI content creation company. Zhang said he went from teaching AI courses to building a business "reviving" the dead with AI.

According to Zhang, he initially had no intention of making a profit and worked for free for the first dozen or so people who came to him. He said he did not expect the service to become a huge success within barely a year. He now charges from a few thousand to tens of thousands of yuan per case, with a profit margin of about 50-60 percent.

Zhang screens customer inquiries to assess whether their needs are genuine, and accepts fewer than half of the orders. As of mid-March 2024, his studio had received nearly a thousand requests and earned millions of yuan in revenue.

Zhang's studio was the first in China to build a business "reviving" the dead with AI, and it remains the largest to date. Among his clients, cases like Yang's are typical: someone in the family has died or been imprisoned, and the news must be hidden from elderly family members or children in the house.

In addition to AI video calling, Zhang also offers role-playing sessions with the deceased to help heal psychological trauma. Last year, he received a request from a woman who wanted to say a final goodbye to her boyfriend, who had died suddenly two years earlier.

"We had dinner together just two days before he suddenly passed away. Two years have passed, but I still can't move on," the woman said. "I tried therapy and even did some extreme things. The life we planned together completely fell apart overnight. I wanted to have a conversation with him to say goodbye properly. Maybe then I can start my life again."

When the video call started, the woman couldn't hold back her tears when she saw her boyfriend's face.

Fan clubs of the late Hong Kong singers Leslie Cheung and Coco Lee are currently in talks with Zhang about creating digital avatars of their idols, and he is in the process of seeking permission.

Zhang Zewei demonstrates how a face is reconstructed. (Photo: Sixthtone)

For services involving deeper psychological interaction, Zhang uses AI to imitate the deceased's appearance and voice, and also hires psychologists to play the role of the deceased. Unlike Yang's grandmother, customers of this service understand that the person on the screen is not real; in essence, it becomes a therapeutic role-playing exercise.

Aware of the potential ethical concerns of his service, Zhang signs agreements with customers in advance to protect their privacy, and stipulates that the digital AI avatars may not be used for any illegal or inappropriate purpose. Customers who want to "resurrect" a deceased person must provide proof of their relationship.

Zhang initially felt conflicted about providing such services. He said he had experienced little suffering of his own, yet suddenly had to absorb the pain of hundreds of customers. "It had a big impact on me. Their stories haunted me every night when I lay down," he said. Over time, he said, he has "become numb."

Professor Shen also warns that good intentions do not always lead to good outcomes. Although such uses of AI are intended to provide psychological support, AI simulations of loved ones cannot fully replace real people. The service may lead some clients to believe they can actually communicate with their deceased loved ones, which can cause trauma and dependence or hinder the natural grieving and recovery process.

"The long-term effects of such services must be considered, including the possibility of developing unhealthy attachments to the deceased," Shen said.