NEDA Suspends AI Chatbot for Giving Harmful Eating Disorder …


by Staff Writer
June 5, 2023 at 12:05 PM UTC
Clinical Relevance: AI is not even close to being ready to replace humans in mental health therapy
Once again, artificial intelligence (AI) has proved it is not yet ready for primetime in the mental health space. The National Eating Disorders Association (NEDA) has pulled its chatbot from its help hotline after it gave dangerous advice about eating disorders.
“It came to our attention [Monday] night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful,” NEDA said in an Instagram post. “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”
The statement came less than a week after the organization announced it would entirely replace its human helpline staff with the AI. Eating disorder activist Sharon Maxwell was the first to sound the alarm, in an Instagram post revealing that the chatbot had offered her problematic advice.
Maxwell claimed that in the first message Tessa sent, the bot told her that eating disorder recovery and sustainable weight loss can coexist. It then recommended that she aim to lose 1 to 2 pounds per week. Tessa also suggested counting calories, regular weigh-ins, and measuring body fat with calipers.
“If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED. If I had not gotten help, I would not still be alive today,” Maxwell wrote on the social media site. “Every single thing Tessa suggested were things that led to my eating disorder.”
NEDA originally pushed back on Maxwell’s claims in its own social media posts. However, it deleted the statement after Maxwell publicized screenshots of the interactions. Then Alexis Conason, a psychologist who specializes in treating eating disorders, recreated the same interactions and shared her own screenshots on Instagram.
“After seeing @heysharonmaxwell’s post about chatting with @neda’s new bot, Tessa, we decided to test her out too. The results speak for themselves,” Conason wrote. “Imagine vulnerable people with eating disorders reaching out to a robot for support because that’s all they have available and receiving responses that further promote the eating disorder.”
NEDA intended Tessa to replace six paid employees and a volunteer staff of about 200 people, an NPR report suggested. The human staff fielded nearly 70,000 calls last year.
But NEDA Vice President Lauren Smolar denied that the move arose from the hotline staff’s decision to unionize. She told NPR that the organization was concerned about keeping up with the rising number of calls and the resulting long wait times. She also stated that NEDA never intended the automated chat function to completely replace the human-powered call line.
“It’s not an open-ended tool for you to talk to and feel like you’re just going to have access to kind of a listening ear, maybe like the helpline was,” Dr. Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University’s medical school who helped design Tessa, told NPR. 
She explained that Tessa was specifically designed for the NEDA helpline but was not as sophisticated as ChatGPT. She referenced a 2021 paper published in the International Journal of Eating Disorders that followed more than 700 volunteers with eating disorders. At least in the short term, the AI program appeared to help reduce overall eating disorder onset and psychopathology.
In general, AI as a mental health support tool is off to a bad start.
Back in January, the founder of a free therapy program called Koko admitted in an extensive Twitter thread that his service had used GPT-3 chatbots to help respond to more than 4,000 users seeking mental health advice, without informing them that they were interacting with a non-human.
We provided mental health support to about 4,000 people — using GPT-3. Here’s what happened 👇
— Rob Morris (@RobertRMorris) January 6, 2023

“If you want to set back the use of AI in mental health, start exactly this way and offend as many practitioners and potential users as possible,” medical ethicist Art Caplan told Psychiatrist.com at the time. 
Then, in March, the news outlet La Libre reported on a Belgian man who died by suicide after chatting with an AI chatbot on an app called Chai. His widow supplied La Libre with chat logs showing that the bot repeatedly encouraged the man to kill himself, insisted that he loved it more than his wife, and claimed that his wife and children were dead. Chai does not specifically address mental health, but presents itself as a way to converse with AIs from all over the globe.