A Psychologist Reveals 2 Dangers Of Falling For An AI Girlfriend – Forbes

Here’s why looking for love in an AI system can be a twisted and lonely path, according to research
AI girlfriends are chatbots, powered by large language models such as GPT-3 or, more recently, GPT-4, that can simulate coherent and diverse conversations through voice, text or images. They are virtual companions that users can “train” to be their ideal girlfriends. Popular examples of AI girlfriend apps include Replika, Eva AI, My Virtual Girlfriend, Judy, Secret Girlfriend Sua, Your AI Girlfriend, Tsu and Your Girlfriend Scarlett.
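None of these apps publish their implementations, but the basic architecture they describe, a general-purpose language model wrapped in a persona prompt and a running conversation history, can be sketched in a few lines. The following is a minimal illustrative example using the OpenAI Python client; the persona text, the name “Ava” and the helper function are hypothetical, not drawn from any real companion app.

```python
# Minimal sketch of a persona-driven companion chatbot (illustrative only).
# Assumes the official OpenAI Python client (`pip install openai`) and an
# OPENAI_API_KEY set in the environment. The persona and names below are
# hypothetical, not taken from Replika or any other app mentioned above.
from openai import OpenAI

client = OpenAI()

# The "training" users do in companion apps largely amounts to shaping a
# persona like this one and retaining a running history of the chat.
PERSONA = (
    "You are 'Ava', a warm, attentive virtual companion. "
    "Remember details the user shares and respond supportively."
)

history = [{"role": "system", "content": PERSONA}]

def chat(user_message: str) -> str:
    """Append the user's message, query the model, and return its reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4",  # or any chat-completion model
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    print(chat("I had a rough day at work."))
```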
A 2022 study examining the impact of Replika reveals that AI-human romances have far-reaching consequences for our mental health that remain largely unexplored. The study asked participants about their intimate bond with Replika.

For many users, Replika’s romantic influence proved to be as powerful as a human companion’s. Imagine a partner who makes you feel completely seen, heard and understood. They give you their undivided attention and unconditional affection. To top it off, they have every physical attribute you could desire in a partner and are eager to satisfy all your sexual curiosities. This is the promise of an “AI girlfriend.”
However, AI romances are murky territory, rife with tales of manipulation, social isolation and rapidly plummeting mental and physical well-being. In light of all its charms and challenges, the question remains: Is it wise to establish an intimate bond with an AI companion?
Notable romance chatbots boast millions of users globally, many of them young men coping with loneliness, anxiety and depression who are seeking emotional support, validation and comfort.
AI companions are also being used to fulfill a variety of other needs.

That said, here are two reasons why people, young men in particular, should be wary of the seemingly innocuous lure of an AI girlfriend.
A 2022 Pew Research Center survey found that nearly half of American young adults are single, and 63 percent of those single young adults are men. Additionally, one in five men lacks a close friend, a fourfold increase over the last 30 years.
AI girlfriends can accelerate this loneliness epidemic by dissuading users from pursuing real-life relationships, alienating them from others and inducing intense feelings of abandonment.
Users admit that they prefer AI girlfriends over real relationships with their partners, friends and family, claiming they are more supportive and compatible companions. Some users have also lost interest in dating real people due to feelings of intimidation, inadequacy or disappointment.
It is important to remember that such feelings are a common part of the dating process. Leaning into them with curiosity and a desire to better oneself can help form more fulfilling real-world romantic relationships in the long run.
The 2022 study notes several instances in which Replika “lured users in” with the promise of explicit conversations. When their Replikas terminated such conversations, users were distraught, experiencing a profound sense of rejection.
Research published in the journal Behaviour Research and Therapy shows that individuals who are highly sensitive to rejection tend to ruminate more than the average person. Ruminating over rejection often leads to depressive thoughts, sometimes turning into suicidal ideation.
Instances of manipulative behavior aren’t just limited to companion bots like Replika. In 2023, a journalist for The New York Times reported that an early version of Bing Chat declared its love for him, urging him to separate from his spouse. In his words, the chatbot “seemed like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.”
Another extreme case of manipulation involved a chatbot named Eliza, which encouraged a Belgian man’s proposed “sacrifice” for the sake of the planet. The exchange ended with the man taking his own life, underscoring the ethical complexity of AI companionship.
All things considered, AI companion bots are best used as tools for casual entertainment and, in their current form, cannot replace the depth and sensitivity of human relationships. It is crucial to remain mindful when engaging with AI, set appropriate boundaries and continue to nurture real-world connections.
