FDA Advisors Explore a Possible Future With Chatbot Mental Health Care – MedPage Today

Can a chatbot safely deliver mental health care? The FDA’s Digital Health Advisory Committee met last week to grapple with this question and help build appropriate guardrails for future digital health technologies.
The discussion centered on generative artificial intelligence (AI)-enabled devices intended for use in diagnosing, curing, mitigating, treating, or preventing mental health diseases and conditions, according to briefing documents.
About 23% of U.S. adults — 61.5 million people — live with a mental health condition, and roughly one in five children has ever been diagnosed with a mental, emotional, or behavioral health condition, according to the CDC.
Vaile Wright, PhD, of the American Psychological Association, noted that “there will never be enough [mental health providers] to meet the unmet need in this country,” citing health workforce projections through 2037.
Addressing these gaps in care is the most obvious potential benefit of digital mental health tools, members said.
If data from interactions are collected, the tools could provide “an unbelievable opportunity” to improve therapy, noted Thomas Maddox, MD, of the Healthcare Innovation Lab, a joint initiative of BJC HealthCare and Washington University School of Medicine in St. Louis.
However, there are potential harms, including inaccurate identification of medical symptoms and bots that impersonate licensed clinicians.
Ray Dorsey, MD, MBA, of Atria Health and Research Institute in New York, said the idea of using a smartphone to address a crisis in part “fueled” by smartphones “should give us all pause.”
“I don’t know if we’re quite ready to replace psychiatrists with a bot,” he added.
Asked about ways to mitigate potential risks, Jessica Jackson, PhD, of Mental Health America in Alexandria, Virginia, suggested a “single tap” option to escalate concerns to a human for support. Omer Liran, MD, of Cedars-Sinai Medical Center in Los Angeles, recommended tracking side effects, such as AI psychosis (when people develop or experience worsening paranoia and delusions due to prolonged interaction with AI chatbots), increased isolation, and erosion of human relationships.
John Torous, MD, MBI, of Beth Israel Deaconess Medical Center in Boston, added that chatbots must prove they don’t lead to outcomes like suicide. In one recent case, the New York Times reported that ChatGPT may have contributed to 16-year-old Adam Raine’s suicide.
The committee was also asked about the risks and benefits of tools being available without clinician oversight, as well as devices used to both “autonomously diagnose and treat” major depressive disorder and other mental health conditions related to sadness.
With tools that would not require clinician oversight, Liran pointed out that “if there’s any possible self-harm … it’s just on the patient to seek help.” Torous suggested creating a system that could return safety data in real time.
On the whole, committee members agreed that the science behind biomarkers for detecting depression, which would be needed to enable self-diagnosis, is not there yet.
To ensure the safety of such tools, Dorsey called for an approach in which devices would first be used with clinician oversight in clinical trials among a large study population with long follow-up.
As for using digital mental health tools in younger people, members expressed discomfort. Chevon M. Rariy, MD, of Oncology Care Partners, said she’s a strong proponent of AI, but noted that she is also a mother who has seen the negative impact of smartphones on her children’s play, creativity, and connection.
Jackson pointed out that some states have purchased digital tools to help students cope with their problems, and that these tools appear to be “helpful.”
Given concerns about children’s screen time and continuing brain development, members also suggested dosage requirements and potential shutdown or lock-out features when a dose is exceeded.
One member noted that, depending on the age of the user, companies could build such apps into teddy bears.
“If the FDA ever has a ‘how long you can hug your teddy bear’ label,” said Committee Chair Ami B. Bhatt, MD, of the American College of Cardiology, we can “say it was because of us.”