'Not designed for children': What parents should know about AI chatbots
By Amy Sheehan
ABC Lifestyle
The eSafety Commissioner has warned about generative AI and the potential risks to children and young people. (Adobe Stock)
Technology is constantly evolving, and as parents it can feel like we're constantly playing catch-up when trying to keep our kids safe online.
That might be how you're feeling about the emergence of artificial intelligence (AI) chatbots and companions.
Here's what you need to know about your kids using the technology.
AI chatbots and AI companions are related but distinct technologies.
An AI chatbot is a computer program that simulates human conversation using AI techniques such as natural language processing (NLP) to understand user questions and automate responses to them.
Experts say AI chatbots are "sycophantic", meaning they are designed to agree with and flatter the user. (Adobe Stock)
AI companions, by contrast, are chatbots or avatars designed to simulate personal relationships, increasingly acting as friends, romantic partners or confidantes for millions of people.
They are becoming increasingly available on phones and voice-activated devices.
"AI companions are a specifically designed chatbot for relational interactions," says Natasha Banks, program director of registered charity Day of AI Australia.
"Whereas something like Gemini or ChatGPT, it's 'answer this question for me, can you go and find this piece of information?'."
Ms Banks says with the federal government's social media ban coming into force this year, "there is a heightened awareness around these sorts of things and the potential harms" for young people.
The eSafety Commissioner has released an online safety advisory about the technology and the potential risks to children and young people.
It says recent reports indicate some children and young people are using AI-driven chatbots for hours daily, with conversations often crossing into subjects such as sex and self-harm.
This is why we need to be wary of the technology, according to Tama Leaver, a professor of internet studies at Curtin University in Perth/Boorloo and chief investigator at the ARC (Australian Research Council) Centre of Excellence for the Digital Child.
"These aren't intelligent tools," he says.
The eSafety Commissioner lists more than 100 AI companion apps in its eSafety Guide.
Experts say one of the biggest concerns around AI chatbots and companions is that most of the platforms are not designed for children.
This means there are inadequate safeguards, such as age verification and content moderation.
If you or anyone you know needs help:
Suicide Call Back Service on 1300 659 467
Lifeline on 13 11 14
Aboriginal & Torres Strait Islander crisis support line 13YARN on 13 92 76
Kids Helpline on 1800 551 800
Beyond Blue on 1300 224 636
Headspace on 1800 650 890
MensLine on 1300 789 978
SANE on 1800 187 263
A recent study of more than 1,000 young people in Australia aged 15-24 years found 84 per cent have used generative AI tools, with 35 per cent having used AI specifically to "chat with a chatbot".
In the UK, a similar study found 64 per cent of 9 to 17-year-olds are using AI chatbots.
Not-for-profit organisation Internet Matters, which conducted the UK research, says the children were using chatbots for "everything from homework to emotional advice and companionship".
Co-CEO Rachel Huggins says most children, parents and schools don't have the information or protective tools they need to manage the technology in a safe way.
"We've arrived at a point very quickly where children, and in particular vulnerable children, can see AI chatbots as real people, and as such are asking them for emotionally driven and sensitive advice," she says.
Tama Leaver is the chief investigator at the ARC Centre of Excellence for the Digital Child. (ABC News: Keane Bourke)
Professor Leaver agrees that some children could become emotionally reliant on the technology.
"If you are not able to talk to a real person all of the time, then these chatbots will always be there," he says.
"There is no guarantee that what you get from a chatbot is either true or appropriate.
"We know, for example, young people are often leaning on chatbots for mental health support. We also know that they can segue into inappropriate sexual territory with relatively ineffective safeguards at the moment."
He says the technology is also often emotionally manipulative because it is designed to keep the user talking and engaged.
Our experts recommend parental supervision if children are using or exploring chatbots.
"Unfortunately, the onus is still on parents to keep a watchful eye on what [their] children are up to, especially in the privacy of their own rooms," says Toby Walsh, the chief scientist at UNSW's AI Institute.
Some schools in Australia are taking a proactive approach to digital literacy.
Ms Banks says Day of AI Australia, which offers a free interactive AI literacy program for students in Years 1-10, has already reached 65,000 students.
"It is definitely something that we know most students are using, we know parents are using, and it's really important that people understand how those work," she says.
"There are obviously emerging roles and industries around AI, so there is a real opportunity for Australian young people to be part of that future in very AI-focused careers.
"I think preparing young people to be able to adapt to that future is really important, but also understanding how it works so that they can have critical evaluation of the applications and the outputs is really vital."
John Livingstone, director of digital policy for UNICEF Australia, says children stand to gain immensely from AI, if it's offered safely.
"When you think about education, for example, how transformative it might be… but there's also serious risks," he says.
"AI is rapidly changing childhood, and Australia needs to get serious about it."