Can apps that recreate people who are no longer with us help with grieving?

In 2022, a virtual version of Marina Smith, an 87-year-old woman from Nottinghamshire, attended her own funeral at Babworth by video, answering a series of questions posed by her son and by other friends and relatives in attendance. Smith, a retired teacher and co-founder of the only museum in England dedicated to the Holocaust, had agreed a few months earlier, at the suggestion of her son Stephen Smith, CEO of the Californian company StoryFile, to “train” artificial intelligence software capable of creating virtual people from personal data. In January she had therefore spent a few hours a day, over a couple of days, talking about her life in front of a computer that filmed her through a webcam.

The rapid progress in software that simulates human conversation, of which ChatGPT is the best-known example, has long drawn attention to the many social and professional spheres it could influence in the future, or has already partly changed. For a few years now, several companies have been exploring how useful and profitable artificial intelligence might also be in the context of grief.

One of the most recent technological developments in managing the experiences of bereaved people is the “griefbot” (from grief, “mourning”, and bot). Griefbots are chatbots that simulate conversations with a digital version of a deceased loved one, based on artificial intelligence software built on large language models (LLMs). The data used to train the software can include emails, text messages, voice notes, videos and other records of the deceased.
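Broadly speaking, and purely as an illustrative sketch rather than a description of any specific product, a griefbot can be approximated by conditioning a general-purpose language model on a “persona” distilled from that kind of personal data. The example below uses the OpenAI Python client; the persona text, model name and conversation handling are assumptions for illustration, not how any of the companies mentioned here actually work.

```python
# Minimal sketch of a persona-conditioned chatbot, assuming the OpenAI
# Python client. The persona text and model name are illustrative only;
# a real product would use far richer data and likely retrieval over
# the person's actual writings.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A "persona" distilled from personal data (emails, messages, transcripts).
PERSONA = (
    "You are speaking in the voice of a specific person, reconstructed "
    "from their letters and recorded conversations. Stay in character, "
    "use their typical phrases, and answer warmly but briefly."
)

# The system message carries the persona; the rest is the running dialogue.
history = [{"role": "system", "content": PERSONA}]

def chat(user_message: str) -> str:
    """Send one user turn and return the model's in-persona reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any chat model works
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("Tell me about the summer we spent at the lake."))
```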

The companies that offer this type of service, currently mostly American, usually do not explicitly mention bereavement in the descriptions of their apps as a condition in which they might be useful. In part this is because they are in fact aimed at a broader audience, one that partly overlaps with users of apps that allow conversations and relationships with virtual partners built from instructions provided by the user. But they also avoid mentioning grief in order to sidestep underlying legal and ethical issues, relating not only to the processing of personal data but also to the risks such apps pose to people in particularly vulnerable and emotionally fragile conditions.

– Read also: There are those who are very attached to their “virtual partners”

Describing an app called Vera AI, which claimed to let users create imaginary copies of friends and family and is currently no longer available, the technology site Futurism cited critical reactions from several social media users to the availability of services of this type in app stores. It also noted that, although it was not clear what kind of data the app used, an earlier version of its description page on Google Play asked for permission to access “USB storage” and the “photos, multimedia content and files” on the device.

A California company called AE Studio provides an even more explicit service, Seance AI, based on large language models and described as “the meeting of artificial intelligence and the afterlife,” offering the possibility that “your loved ones speak to your heart once again.” Another service called YOV, an acronym for You, Only Virtual, allows users to create virtual people (“versions”) and is described by its provider as a way to keep “your loved one with you forever” so that you “never have to say goodbye”.

One of the aspects that complicates the already complicated issue of artificial intelligence applied to bereavement is that the technology behind services of this type feeds an industry primarily oriented toward profit. Most apps for creating virtual people with artificial intelligence software are paid. Providers that explicitly invoke the afterlife, or, more allusively, the impossibility of talking to the real versions of people, limit themselves to recommending temporary use of the apps to get through difficult moments. And business opportunities proliferate in a legislative gray area where the lines between health apps and wellness apps are not clearly and rigidly drawn.

Even assuming that companies are not solely motivated by commercial interests, the main problem with grief apps based on artificially reproducing relationships with the deceased is that there is no scientific evidence of the benefits of this practice. On the contrary, there is a widespread suspicion that they may increase the risk of counterproductive and unhealthy routines, interfering with the psychological processes that underlie mourning.

– Read also: How long should mourning last?

Maintaining some form of psychological bond with a deceased person is not in itself a pathological phenomenon or a behavior to be avoided, assuming it would even be possible to avoid it. Studies have long suggested that the mourning process actually involves a progressive evolution of that bond and a remodulation of the meaning attributed to it, in ways that favor adaptation to the loss. Keeping a photograph of a deceased loved one at hand, or seeing them again on video or hearing them in old recordings, can for example help some people cope with and overcome the grief of that death.

In certain circumstances, however, a lasting bond can become “maladaptive”: this is the main long-term risk facing users of artificial intelligence apps employed not so much to remember deceased people as, in a certain sense, to imitate them. Rather than developing these apps irresponsibly and without criteria suggested by specialists, the companies that produce them should talk to the people who think they might need this technology, so as to address specific needs, Carla Sofka, a professor of social work at Siena College, New York, and an expert in grief and technology, told the magazine Undark.

“Every person is different in the way they process grief,” added Sofka, who does not rule out that griefbots could become a new tool for some people to deal with it. For other users, however, they could create the illusion that the loved one is not dead, and force them to face a second loss if they decide to stop using the service. The more concrete concern, shared by several public health and technology experts, is that social chatbots in general can trigger mechanisms of emotional dependence on virtual conversations, engaging people for long periods, especially those in mourning, and cutting them off from social life.

For a research project, a group of scholars from the University of Kent, in the United Kingdom, and the Kyoto Institute of Technology, in Japan, conducted a series of interviews in 2023 with ten people who had used apps for creating virtual people after a bereavement. Most of them continued the virtual conversations for less than a year, using the apps especially in the early stages of grief. The people interviewed also said that their goal was not to create virtual versions of the deceased in order to maintain a lasting relationship with them over time.

According to the research group, further studies will be needed to understand how griefbots might influence grief processing in the future. In the absence of public data from the companies that make them, the group’s difficulty in recruiting people who had used them suggests that they are not currently a widespread application of artificial intelligence. But that could change, especially if the growing availability of such services coincides with a growing shortage of qualified mental health professionals.
