Leading AI company to ban kids from chatbots after lawsuit blames app for child's death – Fox Business


Quotes displayed in real-time or delayed by at least 15 minutes. Market data provided by FactSet. Powered and implemented by FactSet Digital Solutions. Legal Statement.
This material may not be published, broadcast, rewritten, or redistributed. ©2025 FOX News Network, LLC. All rights reserved.
This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).
Popular artificial intelligence (AI) chatbot platform Character.ai, widely used for role-playing and creative storytelling with virtual characters, announced Wednesday that users under 18 will no longer be able to engage in open-ended conversations with its virtual companions starting Nov. 24.
The move follows months of legal scrutiny and a 2024 lawsuit alleging that the company’s chatbots contributed to the death of a teenage boy in Orlando. According to the federal wrongful death lawsuit, 14-year-old Sewell Setzer III increasingly isolated himself from real-life interactions and engaged in highly sexualized conversations with the bot before his death.
In its announcement, Character.ai said that during the transition period, chat time for users under 18 will be limited to two hours per day, a cap that will gradually decrease over the coming weeks.
LAWMAKERS UNVEIL BIPARTISAN GUARD ACT AFTER PARENTS BLAME AI CHATBOTS FOR TEEN SUICIDES, VIOLENCE
A boy sits in shadow at a laptop computer on Oct. 27, 2013. (Thomas Koehler/Photothek / Getty Images)
"As the world of AI evolves, so must our approach to protecting younger users," the company said in the announcement. "We have seen recent news reports raising questions, and have received questions from regulators, about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly."
PARENTS BLAME CHATGPT FOR SON’S SUICIDE, LAWSUIT ALLEGES OPENAI WEAKENED SAFEGUARDS TWICE BEFORE TEEN’S DEATH
Character.ai logo is displayed on a smartphone screen next to a laptop keyboard. (Thomas Fuller/SOPA Images/LightRocket / Getty Images)
The company plans to roll out similar changes in other countries over the coming months. These changes include new age-assurance features designed to ensure users receive age-appropriate experiences and the launch of an independent non-profit focused on next-generation AI entertainment safety.
"We will be rolling out new age assurance functionality to help ensure users receive the right experience for their age," the company said. "We have built an age assurance model in-house and will be combining it with leading third-party tools, including Persona."
A 12-year-old boy types on a laptop keyboard on Aug. 15, 2024. (Matt Cardy)
CLICK HERE TO DOWNLOAD THE FOX NEWS APP
Character.ai emphasized that the changes are part of its ongoing effort to balance creativity with community safety.
"We’re working to keep our community safe, especially our teen users," the company added. "It has always been our goal to provide an engaging space that fosters creativity while maintaining a safe environment for our entire community."

