Doctors alarmed by youngsters using AI chatbots to seek emotional support



Doctors and mental health experts have raised serious concerns about a growing number of young people turning to artificial intelligence (AI) chatbots for emotional support.
Researchers from University College London say some youngsters are forming emotional bonds with artificial companions instead of people, putting them at risk of struggling to build lasting human connections.
Their warning comes amid evidence that chatbots are increasingly being used not just for information, but for comfort, reassurance and even therapy.
Figures suggest around 810 million users interact with ChatGPT every week, with companionship and emotional support ranking among the most common reasons for use.
The findings land against the backdrop of what experts describe as a loneliness epidemic in the UK, where nearly half of adults report feeling lonely and almost one in 10 feel lonely most of the time.
Writing in the British Medical Journal, the researchers said: “Unlike real human interactions, chatbots offer boundless availability and patience, and are unlikely to present users with challenging counter-narratives.”
They added: “A worrying possibility is that we might be witnessing a generation learning to form emotional bonds with entities that, despite their seemingly conscious responses, lack capacities for human-like empathy, care, and relational attunement.”
The team analysed existing research into AI use and psychological harm.
One study conducted by OpenAI involving more than 980 ChatGPT users found that those who spent the most time using the chatbot over a month reported greater loneliness and socialised less with other people.
Signs of emotional dependence were strongest among users who said they trusted the chatbot.
Another study by Common Sense Media found that one in 10 young people felt conversations with AI were more satisfying than interactions with humans, while one in three said they would choose an AI companion over a person for serious conversations.
The researchers stress that AI systems should be designed to support, not replace, human relationships.
They wrote: “Future systems might further benefit users by recognising references to loneliness, and encouraging users to seek support from friends or family, or providing personalised guidance on accessing local services.”
Doctors are now being urged to ask patients directly about chatbot use.
The authors said: “This should be followed by more directed questions to assess compulsive use patterns and dependence, emotional attachment such as referring to the AI chatbot as a friend, and deferring to the chatbot for major decisions.”
The warning follows tragic cases where reliance on AI has been linked to harm.
In February 2024, 14-year-old Sewell Setzer died by suicide after forming an intense relationship with a role-playing chatbot, according to his family, who are now pursuing legal action against the platform involved.
Experts say urgent research and safeguards are needed to prevent vulnerable young people replacing human support with artificial empathy.
