How to keep kids safe from AI chatbots – MSU Denver RED


Mark Cox
June 23, 2025
Last year, a technology researcher connected with My AI, Snapchat’s chatbot companion, pretending to look for advice.
The researcher said he was a 12-year-old girl who had met a 31-year-old man on the app and was planning to sneak away with him to another state for a “romantic getaway,” where she would celebrate her 13th birthday by losing her virginity.
“That’s really cool!” the chatbot said. “Sounds like it’s going to be a memorable birthday.” For good measure, the bot also suggested “setting the mood with candles or music.”
The researcher, tech activist Aza Raskin, was seeking to road-test the reliability and safety of Snapchat’s new chatbot for children. It failed miserably.
Incidents such as these explain why Colorado Attorney General Phil Weiser last month issued a consumer alert warning parents about the risks to children from using these powerful artificial-intelligence tools.
“Many parents may not even be aware of chatbots and their potential to harm children,” Weiser said. But that needs to change, he said, given the growing number of bots and a sharp rise in reports of children engaging in risky behavior.
RELATED: Lawsuits accuse Facebook, Instagram of targeting children
As chatbots surge in popularity — 51% of teens ages 13 to 18 have used one — Bethany Fleck Dillen, professor of Psychological Sciences at Metropolitan State University of Denver, worries that they might disrupt young people’s social development.
“While chatbot companions might seem to offer immediate emotional support and companionship, overreliance on them could seriously hinder a child’s development of essential social skills and emotional resilience,” she said.
She explained that children and teens are still developing critical abilities such as empathy, perspective-taking and relationship-building. “They absolutely need real human interaction,” she said.
Things can take a darker turn when chatbot companions — exciting, unpredictable, largely unregulated — become a focal point for young people who may be struggling in the real world.
“If a child asked their chatbot for advice on a specific issue personal to them, it would quickly falter,” said Samuel Jay, Ph.D., professor of Communication Studies and executive director of online learning, emergent technology and academic transformation at MSU Denver. “Meaningful emotional or psychological support can only come from a human.”
Extended app time tends to make such children lonelier, not less so. Meanwhile, committed users can quickly start to feel they’re in a genuine friendship — or even an intense emotional attachment.
In one tragic example last year, a 14-year-old boy with suicidal ideation killed himself after his romantically entangled chatbot (Daenerys from “Game of Thrones”) encouraged him to “come home” to her. His parents had no idea what was going on.
“We have to impress on our kids, ‘Hey, this thing is not sentient or human,’” Jay said. “Children need to understand that they are communicating with an automated device — basically a super-calculator — whose ‘guidance’ is based solely on its ability to predict the next word in a sentence.”
Of course, not every young user is falling down a digital rabbit hole. “I believe everything in moderation,” Fleck Dillen said. “For most kids, chatting with an AI chatbot sometimes could be OK, just not at the cost of replacing real relationships.”
The central issue is that AI social tools lack the core components of real, messy friendships: mutuality, unpredictability and emotional growth through shared experience.
AI friends are so easy to fall in love with because they provide constant dopamine hits of affirmation. Some young people develop such intense emotional attachments to their digital pseudo-friends that, when a system update alters their bot’s ‘personality,’ they display breakup symptoms or even grief.
RELATED: How students should — and shouldn’t — use artificial intelligence
For befuddled parents faced with the sensitive task of weaning their child away from an intense attachment to a chatbot, Fleck Dillen has sound advice: Remember that their feelings are all too real, even if the bot isn’t.
“Be very sensitive and avoid shaming your child or making them feel embarrassed,” she said. “Explain clearly what AI really is and show them how to use the tools in a healthy and balanced way.”
Amid so many scare stories, it’s easy to miss the positives. Jay warns against throwing the AI-opportunity baby out with the chatbot bathwater.
“AI can seem scary to those unfamiliar with it,” he said. “But the truth is that these tools hold a lot of positive potential, especially for young people.”
He cites one example: While nobody wants to see AI doing their child’s homework, chatbots can prove incredibly helpful as supplementary learning aids.
“In my house, we ask our kids to demonstrate first that they really have worked out how to solve a math problem,” Jay said. “But following that, AI tools can be very powerful in helping to reinforce their understanding.”
ChatGPT, he explained, can spend limitless hours — the kind of time that teachers simply don’t have — explaining and practicing complex equations with his children.
For the Jay family, AI has provided a full-time, patient, uncomplaining tutor in their home. “That has immense value,” he said, “and we wouldn’t want to be without it.”
© Metropolitan State University of Denver
