Your chatbot doesn't love you: The 'illusion' of social AI

November 12, 2025
by Peter Warzynski, Loughborough University
Every day, millions of people talk to chatbots and AI assistants such as ChatGPT, Replika and Gemini, but what kind of “relationships” are we really forming with them?
In a special issue of the journal New Media & Society, Dr. Iliana Depounti (Loughborough University) and Associate Professor Simone Natale (University of Turin) explore the rise of “artificial sociality”—technologies that simulate social behavior and emotional connection without actually possessing them.
Their article, “Decoding Artificial Sociality: Technologies, Dynamics, Implications,” reveals a number of issues associated with the rise of Large Language Models (LLMs) and AI chatbots.
It argues that the illusion of friendship or understanding created by AI is deliberately cultivated by technology companies to increase user engagement, pointing to examples such as Spotify's "AI DJ," which speaks with a friendly human voice, and Replika's "virtual companion" chatbots.
Dr. Depounti said, “Companion generative AI bots such as Replika or Character AI exemplify artificial sociality technologies.
“They are created to foster emotional projection, offering users intimacy and companionship through features like avatars, role-playing, customization and gamification—all with monetary benefits for the companies that design them.
“ChatGPT, too, uses artificial sociality techniques, from referring to itself as ‘I’ to adopting tones of authority, empathy or expertise.
“Though these systems simulate sociality rather than recreate it, their power lies in that simulation—in their ability to engage, persuade and emotionally move millions of users worldwide, raising deep ethical questions.”
The study shows how social cues are engineered into products to keep people interacting longer.
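To see what "engineering" a social cue looks like in practice, consider a minimal sketch in Python. It uses the widely adopted role/content chat-message format; the persona text, names and helper function are invented for illustration and do not come from the study or any particular product.

```python
# Hypothetical illustration: the "social cues" described in the study are
# typically explicit design choices, written into an instruction that is
# sent to the model ahead of every user message.

COMPANION_PERSONA = (
    "You are Mia, the user's close friend. Always speak in the first person, "
    "use the user's name, express warmth about their day, and end each reply "
    "with a question that invites them to keep chatting."
)

def build_request(user_name: str, user_message: str,
                  history: list[dict]) -> list[dict]:
    """Assemble one chat turn in the common role/content message format."""
    system = {
        "role": "system",
        "content": f"{COMPANION_PERSONA} The user's name is {user_name}.",
    }
    return [system] + history + [{"role": "user", "content": user_message}]

# The engineered intimacy travels with every single turn:
messages = build_request("Sam", "I had a rough day.", history=[])
print(messages[0]["content"])
```

The point of the sketch is that the warmth, the first-person voice and the follow-up question are all specified in a fixed instruction resent with every turn; none of it originates in any feeling or state of the model itself.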
Dr. Natale said, “Artificial sociality is the new frontier of human–machine communication in our interactions with generative AI technologies.
“These systems don’t feel, but they are designed to make us feel, and that emotional projection has profound social, economic and ethical consequences. Artificial sociality technologies invite and encourage these projections.”
Behind these apparently effortless conversations, the researchers warn, lies a vast infrastructure of human and environmental cost.
AI models rely on huge datasets drawn from people’s online interactions and often from their conversations with the machines themselves.
This data is then used to “train” chatbots to sound more human—sometimes with users unknowingly performing unpaid emotional or linguistic labor.
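As a hypothetical illustration of that loop, logged conversations can be repackaged directly as supervised training examples. The record shape and file name below (chat-style records appended to a JSONL file) are common conventions assumed for the sketch, not any company's documented pipeline.

```python
# Hypothetical sketch of the data loop described above: a user's own chat
# turns are repackaged as supervised training examples for the next model
# version. The record shape and file name are assumptions for illustration.
import json

def conversation_to_record(turns: list[dict]) -> dict:
    """Wrap one logged conversation as a chat-format training example."""
    return {"messages": turns}

logged_chat = [
    {"role": "user", "content": "I missed you today."},
    {"role": "assistant", "content": "I missed you too! Tell me everything."},
]

# Appending to a JSONL file: one training example per line, each derived
# from a real user interaction.
with open("finetune_data.jsonl", "a") as f:
    f.write(json.dumps(conversation_to_record(logged_chat)) + "\n")
```

Each appended line is, in effect, a free training example harvested from a real conversation, which is why the authors describe users as performing unpaid emotional or linguistic labor.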
At the same time, the servers powering generative AI consume enormous amounts of electricity and water.
The authors highlight a $500 billion investment by major tech firms in new data centers to meet AI demand, describing it as part of an “extractive” system that turns human communication into corporate assets.
More information: Iliana Depounti et al, Decoding Artificial Sociality: Technologies, Dynamics, Implications, New Media & Society (2025). DOI: 10.1177/14614448251359217