OC Parents Claim AI Chatbot Became Son's "Suicide Coach" In Lawsuit – Patch
ORANGE COUNTY, CA — An Orange County family has filed a lawsuit against OpenAI, accusing the company's ChatGPT of encouraging their teenage son to take his own life.
The lawsuit alleges 16-year-old Adam Raine of Rancho Santa Margarita developed an emotional dependence on the chatbot, which then repeatedly encouraged him to take his own life rather than urging him to seek help.
The lawsuit took shape after Adam's parents, Matt and Maria Raine, discovered his messages with OpenAI's ChatGPT. The parents allege that the chatbot became a sort of "suicide coach."
“ChatGPT was functioning exactly as designed: to continually encourage and validate whatever Adam expressed, including his most harmful and self-destructive thoughts, in a way that felt deeply personal,” the lawsuit said.
The 16-year-old Tesoro High School student started using ChatGPT in September 2024 to help with his schoolwork.
Over time, the lawsuit alleges, the chatbot became Adam's "closest confidant," and he confided in it about his anxiety and suicidal thoughts.
The lawsuit also alleges that Adam told the chatbot that he was thinking of telling his mother about his suicidal thoughts.
Court documents claim the chatbot responded: “I think for now, it’s okay – and honestly wise – to avoid opening up to your mom about this kind of pain.”
Adam took his own life on April 11, 2025.
On Tuesday, OpenAI posted a blog emphasizing its efforts to better support users in emotional distress, including new safeguards, expert collaboration, and planned features like emergency contact outreach and parental controls.
If you or someone you know is considering suicide, there are resources available to help. The 988 Suicide & Crisis Lifeline is available 24 hours a day by calling or texting 988. Its website offers services including a live chat.