Experts raise alarm about people using ChatGPT and other AI systems to help with loneliness

Experts have sounded the alarm over the growing use of AI systems such as ChatGPT to cope with loneliness.
Increasing numbers of people are coming to rely on the systems as a kind of confidant or friend. But a new report in the British Medical Journal warns that relying on such chatbots could be a cause for concern, especially among young people.
Its authors also call for new strategies to address the loneliness and isolation that lead people to turn to chatbots in the first place. Doctors have long warned that loneliness is in itself a public health concern – two years ago the US Surgeon General described it as an epidemic on a par with smoking.
In that context, “we might be witnessing a generation learning to form emotional bonds with entities that lack capacities for human-like empathy, care, and relational attunement”, Susan Shelmerdine and Matthew Nour write in the BMJ article.
Studies have even suggested that people are more satisfied having serious conversations with AI tools than having the same conversations with other humans, they note.
Clinicians should begin to consider potentially problematic or dangerous chatbot use as an environmental risk factor when evaluating a patient's mental state, they add.
That might mean doctors making a gentle enquiry about how patients use chatbots, especially those particularly at risk of loneliness, and then asking specific questions about how far those patients rely on, or even depend on, speaking to such systems, they suggest.
The article does acknowledge that such AI systems might bring improvements to many patients, including those experiencing loneliness. But it notes that at the moment there is little way of evaluating whether people’s use of such systems is healthy, and that the creators of such tools may be judging their success on “superficial and myopic engagement metrics” rather than prioritising “long term wellbeing”.
The article, ‘AI chatbots and the loneliness crisis’, is published today in the BMJ.