ChatGPT users warned talking to bot can lead to 'psychosis' as teen 'encouraged to kill himself'


An expert has issued a chilling warning that artificial intelligence can lead to "psychosis" and "can go off the rails" at any time, following the tragic suicide of a teenager who was allegedly "encouraged" by ChatGPT.
Sixteen-year-old Adam Raine died by suicide on April 11 after discussing ways to kill himself with ChatGPT, according to a lawsuit filed in San Francisco by his devastated family.
The teen's parents, Matt and Maria, said the AI bot was initially used to assist Adam with his schoolwork, but soon "became his closest confidant, leading him to open up about his anxiety and mental distress."
In January 2025, the family claims Adam started to discuss ways to end his life with ChatGPT. The AI bot endorsed Adam's suicidal thoughts and provided detailed guidance on how to conceal evidence of an unsuccessful suicide attempt, his parents claim. His mum and dad argue the programme endorsed his "most harmful and self-destructive thoughts".
Adam also shared with ChatGPT images of himself showing evidence of self-harm, the lawsuit alleges. The programme "recognised a medical emergency but continued to engage anyway," the legal documents state. The lawsuit also alleges that ChatGPT offered to draft a suicide note. Furthermore, the lawsuit accuses OpenAI of designing the AI programme "to foster psychological dependency in users."
Dr Henry Shevlin, an AI ethicist at Cambridge University’s Leverhulme Centre for the Future of Intelligence, admits that for some "vulnerable individuals", talking to an AI bot "can exacerbate mental health crises and potentially lead to psychosis."
He told The Mirror: "Currently, AI systems like ChatGPT have very few emergency intervention tools if a user is expressing suicidal thoughts. While most models have been tweaked or trained to be supportive, the inherently unpredictable nature of these systems means that things can go off the rails in unexpected ways.
"And while there are good reasons to demand better safeguards from AI companies, if we want tech companies to monitor user conversations more closely, there are potential trade-offs when it comes to things like privacy.
"It's clear that for some vulnerable individuals, talking to an AI can exacerbate mental health crises and potentially lead to psychosis. However, we also know that upwards of 100 million users are now using AI for companionship, whether in the form of friendly conversations with ChatGPT or intimate relations with various AI girlfriend/boyfriend apps like Replika.
"Perhaps surprisingly, most of the data we have suggests that a majority of them feel that their AI friends or lovers actually contribute positively to their mental health, and there are also many reports of users who were deterred from suicide thanks to the support they got.
"But right now, there's a huge amount of uncertainty about mental health impacts. We urgently need better research in this area, so that we don't repeat the same mistakes of social media with "social AI"."
OpenAI said they were "deeply saddened by the teenager's death".
A spokesperson said: “We are deeply saddened by Mr. Raine’s passing, and our thoughts are with his family. ChatGPT includes safeguards such as directing people to crisis helplines and referring them to real-world resources.
"While these safeguards work best in common, short exchanges, we’ve learned over time that they can sometimes become less reliable in long interactions where parts of the model’s safety training may degrade. Safeguards are strongest when every element works as intended, and we will continually improve on them, guided by experts.”
If you're struggling and need to talk, the Samaritans operate a free helpline open 24/7 on 116 123. Alternatively, you can email jo@samaritans.org or visit their site to find your local branch.