A case on deepfake AI-based psychotherapy for processing trauma and grief
From The Embassy of Good Science
What is this about?
This case study examines the use of deepfake AI technology in psychotherapy to help patients process trauma and grief. AI-generated synthetic media can simulate conversations with a deceased loved one or a perpetrator of trauma, allowing patients to confront unresolved emotions. The approach aims to support emotional closure in situations where real interactions are impossible. Although early research shows potential therapeutic value, the technology raises concerns about psychological safety, consent, and the ethical use of sensitive personal data.
Why is this important?
This case is important because deepfake therapy introduces a new form of AI-mediated psychological treatment that could help patients process trauma or grief when traditional therapy methods are insufficient. However, the technology also raises complex ethical questions about emotional manipulation, privacy, and the boundary between reality and simulation. Ensuring careful clinical supervision, informed consent, and strong safeguards is essential to protect patients and maintain trust in emerging AI-based mental health interventions.
For whom is this important?
Health Care Professional, Researchers, Research Ethics Committee Member, IT / Technical Professional, Regulator, Clinical Researcher
What are the best practices?
Best practices include using deepfake therapy only under strict clinical supervision and when other therapeutic options have been exhausted. Therapists should obtain informed consent from patients and ensure transparency about the simulated nature of the interaction. Sensitive data used to create deepfakes must be securely stored and handled responsibly. Mental health professionals should also evaluate the patient’s psychological readiness and monitor potential risks such as re-traumatization or emotional overattachment to the simulated character.
