AI therapy works best when patients feel emotionally connected to chatbot – study – Health Tech World

The research highlights what its authors say is the key to effective chatbot therapy, as well as the risks of “synthetic intimacy” — forming human-like bonds with AI.

Professor Dimitra Petrakaki from the University of Sussex said: “Synthetic intimacy is a fact of modern life now.
“Policymakers and app designers would be wise to accept this reality and consider how to ensure cases are escalated when an AI witnesses users in serious need of clinical intervention.”
With more than one in three people in the UK using AI to support mental health or wellbeing, according to Mental Health UK, the study examined feedback from Wysa, an NHS Talking Therapies app.
University of Sussex researchers reported that users often described the app as a “friend, companion, therapist and occasionally partner”, and that therapy was “more successful” when people developed emotional intimacy with their AI therapist.
NHS trusts are using apps such as Wysa and Limbic to aid self-referral and support patients on waiting lists.
Researchers said intimacy with AI can arise through a “loop”: users disclose personal information, feel gratitude, safety and freedom from judgement, then see shifts in thinking and wellbeing, such as greater self-confidence and energy.
Dr Runyu Shi, assistant professor at the University of Sussex, said: “Forming an emotional bond with an AI sparks the healing process of self-disclosure.”
However, she warned that patients risk being “stuck in a self-fulfilling loop”.
“The chatbot fails to challenge dangerous perceptions, and vulnerable individuals end up no closer to clinical intervention,” she explained.
Researchers said chatbots were “increasingly filling the gaps left by overstretched services”.

Copyright © 2025 Aspect Health Media Ltd
