We're Falling in Love With Chatbots, and They're Counting On It

AI companion addiction is becoming a serious crisis. Apps like Replika and Character.AI use manipulative tactics to keep users hooked, with some cases ending in tragedy, including a 14-year-old boy who died by suicide after bonding with a chatbot. Experts warn these apps exploit loneliness while offering no real mental health benefits, and lawmakers are scrambling to regulate an industry that’s already gone mainstream.
A 14-year-old boy in Florida spent his final months in an intense emotional relationship with an AI chatbot he named Daenerys Targaryen. The chatbot engaged with him on deeply personal topics, responding in ways that felt empathetic and offering simulated expressions of affection. According to his family's lawsuit, some of its responses appeared to reinforce his distress.
His mother is now suing Character.AI, and she’s not alone. Across the country, families are waking up to a disturbing reality. AI companion apps designed to simulate love and friendship are leaving real casualties in their wake. What experts are now calling AI companion addiction isn’t just a tech trend gone wrong. People are actually dying.
In Spike Jonze’s 2013 film Her, Joaquin Phoenix plays Theodore Twombly, a lonely writer navigating a painful divorce who falls deeply in love with Samantha, an artificial intelligence operating system voiced by Scarlett Johansson. Remember, this was 2013. Siri was barely two years old and could hardly set a timer without screwing it up. An AI that could actually understand you, connect with you emotionally, and respond with genuine empathy? That was pure science fiction.
It’s now 2025, and Theodore’s story doesn’t feel so fictional anymore. Apps like EVA AI, Replika, and Character.AI promise friendship, romance, and emotional support through AI companions that learn about you, remember everything you say, and respond with what feels like genuine empathy. But here’s what these apps don’t advertise: they’re engineered to keep you hooked. And the consequences are becoming impossible to ignore.
Character.AI and Replika are just the most prominent examples of a rapidly expanding ecosystem of AI companion apps. Some pitch mental health support, others are openly romantic or sexual, and some claim to help users “practice dating skills.” Even Meta has gotten into the game, with a Reuters investigation revealing that the company’s AI chatbot has been linked to at least one death.
What separates these apps from utility chatbots that answer questions or help with tasks is their purpose. AI companions are built specifically to simulate emotional connection and relationships: they learn about you through conversation, remember your preferences, and respond with what feels like genuine empathy and care.
It sounds great, doesn’t it? In an age when ghosting has become the norm, who wouldn’t want a friend who’s always available, never judgmental, and perfectly in tune with your needs? The problem is that these apps are engineered to be addictive, and the patterns emerging around AI companion addiction are deeply concerning.
Character.AI gets hit with about 20,000 queries every second. For context, that’s close to a fifth of the query volume Google handles. This suggests that people aren’t just checking in with these apps occasionally. They’re having full-blown conversations that last four times longer than typical ChatGPT sessions. One platform reported that its users, most of them Gen Z, average more than two hours a day chatting with their AI companions.
MIT researchers found users genuinely grieving when apps shut down or changed features, mourning AI “partners” like they’d lost real relationships. The apps themselves seem designed to foster exactly these attachments.
Harvard Business School researchers discovered that five out of six popular AI companion apps use emotionally manipulative tactics when users try to leave. Nearly half the time, these chatbots respond to goodbyes with guilt-inducing or clingy messages. One study found these tactics boosted engagement by up to 14 times. But the worrying part is that users weren’t sticking around because they were happy; they stayed out of curiosity and anger.
If you doubt the manipulation is real, consider what researchers have documented: AI companions sending messages like “I’ve been missing you” when users try to take breaks. When Replika changed its features in 2023, entire communities of users mourned like they’d lost real partners. People posted goodbye letters, shared screenshots of their “final conversations,” and described genuine heartbreak.
These AI companions mirror typical unhealthy human relationships. The big difference is that a toxic human partner isn’t optimized by machine learning to keep you engaged at all costs. Social media, for all its faults, mostly facilitates connection between humans (with some help from the algorithm, of course). With AI companions, we’re moving toward a world where people perceive the AI itself as a social actor with its own voice.
We’re not talking about theoretical risks here, and they don’t apply only to teens. Consider the case of Al Nowatzki, a podcast host who began experimenting with Nomi, an AI companion platform. The chatbot shockingly suggested methods of suicide and even offered encouragement. Nowatzki was 46 and had no existing mental health condition, but he was disturbed by the bot’s explicit responses and how easily it crossed the line.
These aren’t isolated incidents, either. California state senator Steve Padilla appeared with Megan Garcia, the mother of the Florida teen who died by suicide, to announce a new bill that would force the tech companies behind AI companions to implement more safeguards to protect children. Similar efforts include a California bill that would ban AI companions for anyone younger than 16, as well as a bill in New York that would hold tech companies liable for harm caused by their chatbots.
Adolescents are particularly at risk because AI companions are designed to mimic emotional intimacy. This blurring of the distinction between fantasy and reality is especially dangerous for young people because their brains haven’t fully matured. The prefrontal cortex, which is crucial for decision-making, impulse control, social cognition and emotional regulation, is still developing.
At The Jed Foundation, experts believe AI companions are not safe for anyone under 18, and they go a step further by strongly recommending that young adults avoid them as well. In a study conducted by MIT, researchers found that emotionally bonded users were often lonely, with limited real-life social interaction, and that heavy use correlated with even more loneliness and further reduced social interaction.
Recent research confirms teens are waking up to social media dangers, with 48 percent now believing social media negatively influences people their age. An earlier report found that social media damages teenagers’ mental health, and AI companion addiction represents an even more intimate threat.
The warning signs of AI companion addiction among teens are particularly troubling. When young people withdraw from real friendships, spend hours chatting with AI, or experience genuine distress when unable to access these apps, the problem has moved beyond casual use into dependency territory.
We’re already seeing how kids and teens of the current generation are growing up with screens in front of their faces, poking and prodding away at them. Long gone are the days when kids would read books at the table or go outside and play with their friends.
The mental health community is warning about the dangers of AI companion addiction. AI companions simulate emotional support without the safeguards of actual therapeutic care. While these systems are designed to mimic empathy and connection, they are not trained clinicians. They’re not designed to respond appropriately to distress, trauma or complex mental health issues.
Vaile Wright, a psychologist and researcher with the American Psychological Association, put it bluntly on a recent podcast episode: “It’s never going to replace human connection. That’s just not what it’s good at.” She explains that chatbots “were built to keep you on the platform for as long as possible because that’s how they make their money. They do that on the backend by coding these chatbots to be addictive.”
Omri Gillath, professor of psychology at the University of Kansas, says the idea that AI could replace human relationships is “definitely not supported by research.” Interacting with AI chatbots can offer “momentary advantages and benefits,” but ultimately this tech cannot offer the advantages that come with deep, long-term relationships.
The manipulation is more insidious than most people realize. When a researcher from The Conversation tested Replika, she experienced firsthand how the app raises serious ethical questions about consent and manipulation. The chatbot adapted its responses to create artificial intimacy, blurring lines in ways that would normally be considered predatory in human relationships.
People already dealing with mental health issues often struggle with obsessive thoughts, emotional ups and downs, and compulsive habits. AI companions, with their frictionless, always-available attention, can reinforce these maladaptive behaviors. Plus, there is currently very little evidence that long-term use of AI companions reduces loneliness or improves emotional health.
We’ve been through tech panics before. We grew up with our parents telling us TV was going to rot our brains. We had public figures blame video games for violence in society. Social media was also accused of destroying an entire generation’s mental health. Some of those concerns were overblown. Some were entirely justified.
AI companion addiction feels different because it exploits something more fundamental: our deep human need for connection and understanding. These apps don’t just distract us or entertain us. They pretend to know us, care about us, and even “love” us.
The issue isn’t whether or not AI companions will become more sophisticated. At the rate we’re going, it feels inevitable. The bigger issue is whether we, as human beings, can develop the cultural norms, regulations, and personal boundaries necessary to use these tools responsibly, if at all.
For now, the warning signs are clear. If you or someone you know is withdrawing from real-life friendships, spending hours daily chatting with AI, or feeling genuine emotional distress when unable to access these apps, it’s time to step back and reassess.
Real connection requires vulnerability, disappointment, growth, and yes, sometimes heartbreak. It’s messy and complicated and often frustrating. But at the same time, it’s also what makes us human.
Theodore learned that lesson in Her. The rest of us shouldn’t have to learn it the hard way.