Case study: Generative Ghosts and the Grieving Process
From The Embassy of Good Science
What is this about?
This case study examines griefbots, also called generative ghosts, which are AI chatbots designed to simulate the personality and speech of a deceased person. Using large language models and retrieval-augmented generation, these systems can draw on digital traces such as messages, photos, and recordings to imitate how someone spoke and behaved. They allow grieving individuals to interact with a digital representation of the deceased, creating new forms of remembrance and communication.
Why is this important?
This case is important because griefbots may significantly change how people experience grief and maintain connections with deceased loved ones. While such systems may offer emotional comfort, they also raise ethical concerns about identity, consent, and psychological well-being. Questions arise about whether a digital simulation can faithfully represent the deceased and whether interacting with it helps or hinders the grieving process. Responsible discussion and safeguards are needed before these technologies become widely available.
For whom is this important?
Health Care Professional, Technology Developer / Engineer, IT / Technical Professional, Data Protection Officer, Policy Maker, Regulator
What are the best practices?
Best practices include ensuring transparency that griefbots are AI simulations and not real continuations of a person. Developers should require clear consent for the use of the deceased person’s digital data. Psychological guidance or support should be available for vulnerable users interacting with such systems. Ethical oversight and cultural sensitivity are also important, as attitudes toward death and remembrance vary across societies. Data protection and respectful design should guide the development of griefbots.
