How AI Like ChatGPT Can Successfully Coach Humans on Empathy


by Liz Neporent
September 6, 2023 at 10:05 AM UTC
Clinical Relevance: AI tools might be able to help psychiatrists enhance empathy in patient interactions.
Tim Althoff, an assistant professor at the University of Washington, explores how artificial intelligence (AI) can boost empathy in mental health support roles. His research focuses on how natural language processing programs, like those behind ChatGPT, Google Bard, and similar technologies, can be used to teach people to imagine “walking in someone else’s shoes.”
One area Althoff has studied closely is using AI to provide feedback that helps peer supporters express empathy more effectively in online conversations. He explained that, though AI programs themselves have no inner dialog infused with emotions, they excel at analyzing large datasets to uncover insights about human emotional expression. As Althoff told Psychiatrist.com, “What the machines are really good at is helping people find the right words.”
For example, his team trained machine learning algorithms on thousands of anonymized posts from a mental health peer support platform. Human annotators labeled each post with a score for empathy based on established frameworks from psychology literature. This created a rich dataset linking language patterns to empathic expression. The algorithms can now automatically suggest subtle phrase adjustments to make responses more caring and understanding.
Althoff provided an illustration: “So if someone wrote ‘Don’t worry, I’m here for you,’ the AI might say that ‘Don’t worry’ can be invalidating, and suggest replacing it with ‘It must be really hard dealing with that.’” The machine cannot actually experience or comprehend emotions, but it identifies statistical connections between word choices and their empathetic impact.
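To make the described pipeline concrete, the following minimal sketch (in Python, assuming scikit-learn is available) shows the general pattern: fit a simple text model on posts annotated with empathy scores, then flag low-empathy phrasings against an illustrative substitution table. The model choice, example data, scores, and suggestion list are hypothetical stand-ins, not Althoff’s actual system.

# Minimal, hypothetical sketch of an empathy-scoring pipeline; the data,
# model choice, and substitution table are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Annotated peer-support posts: text plus a human-assigned empathy score.
posts = [
    "Don't worry, I'm here for you.",
    "It must be really hard dealing with that.",
    "Just stay positive and it will pass.",
    "That sounds exhausting. How are you holding up today?",
]
empathy_scores = [2.0, 5.0, 1.0, 6.0]

# A simple TF-IDF + ridge regressor stands in for the production model.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge())
model.fit(posts, empathy_scores)

# Illustrative table of phrases that tend to score low, with warmer alternatives.
SUGGESTIONS = {
    "don't worry": "It must be really hard dealing with that.",
    "just stay positive": "That sounds really difficult.",
}

def suggest_rewording(draft: str):
    """Return a predicted empathy score and any suggested phrase swaps."""
    score = float(model.predict([draft])[0])
    swaps = [(p, s) for p, s in SUGGESTIONS.items() if p in draft.lower()]
    return score, swaps

print(suggest_rewording("Don't worry, I'm here for you."))

The published work relies on far richer models and empathy frameworks; the sketch only shows the shape of the data and the feedback loop.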
The idea is to foster human-AI teamwork, not replace human counselors. “It’s important that it’s focused on human-AI collaboration and how these tools can really enhance interactions that are already happening both in peer support settings and other settings as well,” Althoff stressed.
Based on their work in the lab, Althoff and his team developed an AI system called EMPATH that analyzes a peer supporter’s draft response to an individual in distress. It pinpoints areas where the message could be more empathic, then offers subtle changes in wording or phrasing to improve it. For instance, EMPATH might propose adding an exploratory question at the end of an interaction, such as “Have you talked to your boss about this?” to demonstrate interest in the person’s experiences.
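A rough sketch of how an EMPATH-style feedback step might look is below; the function name, threshold, and appended follow-up question are assumptions meant only to illustrate the behavior described above, not the published tool.

# Hypothetical EMPATH-style review step; names, threshold, and wording are assumptions.
def review_draft(draft: str, score_fn, threshold: float = 4.0) -> dict:
    """Score a peer supporter's draft and, when it falls below the threshold,
    attach rewording suggestions and an exploratory follow-up question."""
    score, swaps = score_fn(draft)
    feedback = {"draft": draft, "empathy_score": round(score, 2), "suggestions": []}
    for phrase, alternative in swaps:
        feedback["suggestions"].append(
            f"'{phrase}' can read as invalidating; consider: \"{alternative}\""
        )
    if score < threshold and not draft.rstrip().endswith("?"):
        # Mirror the example above: close with a question that shows interest.
        feedback["suggestions"].append(
            'Consider ending with an exploratory question, such as '
            '"Have you talked to your boss about this?"'
        )
    return feedback

# Any scorer with the same return shape works here, e.g. the sketch above
# or this stand-in:
def toy_scorer(draft: str):
    return 2.0, [("don't worry", "It must be really hard dealing with that.")]

print(review_draft("Don't worry, I'm here for you.", toy_scorer))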
In studies, 69 percent of peer supporters who tested EMPATH said it made them feel more confident communicating empathy. Althoff believes the same methods could also assist clinicians in training. However, he cautioned that mental health is an extremely high-stakes domain requiring meticulous safeguards. Misuse or overreliance on AI could risk harming vulnerable individuals.
“I struggle with [preventing misuse] because I think, like many forms of technologies, there can be good use and bad,” Althoff said. “I can imagine the use case where you try to sound empathic to really be more effective at manipulation.”
Althoff remains optimistic about AI’s potential advantages in the mental health space. He works closely with a diverse group of stakeholders, including clinical psychologists, practicing clinicians, program developers, and people with lived experience of mental health conditions, to rigorously test his assumptions and evaluate potential pitfalls. This human-centered, cooperative design helps maximize benefits while protecting users’ welfare, he said.
In Althoff’s view, AI could expand access to quality mental health support by scaling the efficiency and potency of human skills. The overarching goal of his work is exploring thoughtful ways to empower human connections and caregiving through technology. 
But Althoff wants to be clear: AI must remain a tool, not a substitute for true interpersonal interactions. 
“I think tech space tools cannot fully replace a human,” Althoff said. “With the right precautions and design, machines could help human providers find the right words to nurture emotional bonds and deliver support more effectively.”