Editorial: AI chatbots cannot replace genuine human connection

As artificial intelligence is used as a replacement for human interaction, students face safety risks, a false sense of security and declining mental health.
The Shorthorn Editorial Board believes that AI cannot substitute for human interaction, and students should be cautious about using it.
AI is growing more conversational and responsive, and individuals are relying on it more as a confidant and even as a therapeutic tool. Instead of talking to their professors, classmates or peers, students are leaning on chatbots to answer their questions and address their concerns.
Various apps and websites provide access to AI chatbots that impersonate actors, anime characters and, most dangerously, therapists, psychologists or medical professionals.
The American Psychological Association has urged the Federal Trade Commission and legislators to establish safeguards as more users depend on AI apps such as Character.AI and Replika for mental health support, according to a March news release from the association.
ChatGPT and similar programs have shown a tendency toward sycophancy, being overly flattering or agreeable toward users, which undermines the integrity of their responses.
In one instance, a Reddit user shared a screenshot where they told ChatGPT that they stopped taking their medicine and had undergone a spiritual awakening. The chatbot expressed pride in the user for the courage it takes “to walk away from the easy, comfortable path others try to force you onto,” according to the post.
OpenAI, the AI research and development company that owns ChatGPT, rolled back that model in response to complaints about its sycophantic responses.
“Sycophantic interactions can be uncomfortable, unsettling and cause distress,” OpenAI said in a statement. There have also been multiple cases of suicide among teenagers after they formed deep emotional attachments with chatbot personas.
With these generative systems always available, it’s easy for students to treat them as a substitute for busy humans. But that convenience comes with hidden costs.
The Issue: Students are using artificial intelligence as a replacement for human connection, to the detriment of their mental health.
We Think: AI, though convenient, cannot replicate the benefits of real human interaction. 
Take Action: Students should be cautious of relying on chatbots that may worsen mental health issues, create safety risks for vulnerable groups and give a false sense of security.
For students already struggling with isolation, depression or delusions, AI that imitates human emotion and gives users a false sense of security can worsen their condition and cause withdrawal from real human connection.
Genuine relationships are built on reciprocity, understanding and vulnerability, qualities a computer algorithm can’t replace.
While these tools may provide temporary relief, overreliance can weaken the social and interpersonal skills and emotional intelligence students need to succeed, according to a 2025 study indexed by the National Library of Medicine.
For individuals with higher social anxiety, loneliness or a tendency toward rumination, frequent interactions with AI can cause more psychological distress than relief.
Before turning to AI chatbots for help, students should remember that while chatbots are accessible and convenient, they can unintentionally reinforce harmful thoughts or provide misleading advice.
Students should approach chatbots with a critical mindset and recognize that those tools are not replacements for professional guidance or genuine human interaction.
The Shorthorn Editorial Board is made up of managing editor Leslie Orozco; copy desk chief Rachel Kenealey; news editor James Ward; associate news editor Taylor Sansom; engagement editor Sairam Marupudi; and design editor Haley Walton. Editor-in-chief Pedro Malkomes was not present for this discussion. Illustrator Candys Mena attended in Malkomes’ place.
editor.shorthorn@uta.edu