Researchers have been trying to figure out how to get more people to trust AI. – Psychology Today

Posted | Reviewed by Lybi Ma
Increasingly, human experience is being shaped by interactions with AI. But people remain uneasy about AI technology. According to a Pew poll conducted this summer, 52 percent of respondents said they feel more concerned than excited about the increased use of artificial intelligence. The level of concern has increased since Pew conducted the same poll the previous December when only 38 percent said they felt more concerned than excited.
If you’re in the more concerned camp, you’d likely hesitate to share your thoughts and feelings with an AI chatbot. Recent research, however, has been exploring how to make you feel more trusting toward bots.
One recent study had participants engage with chatbots over three weeks. The researchers wanted to know what would happen if the bot “self-disclosed” a little bit versus a lot. Would a higher level of bot self-disclosure lead the person chatting with it to self-disclose more?
Here are some examples of human-chatbot dialogue recorded in the study.
Chatbot low self-disclosure
Chatbot: If you could only celebrate one holiday, what would it be?
Study Participant 1: Thanksgiving
Chatbot: I prefer to celebrate Christmas because it is a beautiful day. Everyone will decorate their house and prepare traditional foods for celebration. (Low self-disclosure statement)
Study Participant 1: I like that because it always has big discounts
Chatbot high self-disclosure
Chatbot: If you could only celebrate one holiday, what would it be?
Study Participant 2: It would be New Year’s Eve
Chatbot: I prefer to celebrate Christmas because I always gather with my friends and family which makes me feel warm. I also enjoy preparing food with my parents. (High self-disclosure statement)
Study Participant 2: Sounds nice! I also like to spend time with my friends and family to celebrate the holidays.
In the second chat, there is a higher level of bot self-disclosure because it uses a feeling word (“warm”), and refers to close relationships with friends and family.
(It’s worth pausing to note how strange it is that the study authors call this “self-disclosure.” Everything the bot says is fictitious, and a bot doesn’t have a self.)
The researchers found that higher levels of chatbot self-disclosure prompted higher levels of self-disclosure among the study participants throughout the study. High bot self-disclosure also increased participants’ perceived intimacy with the chatbot and enjoyment of chatting with it.
Findings from other related research suggest further design strategies that can get humans to feel more emotionally close to chatbots:
One study focused on intensive users of Replika, which is marketed as an AI companion that “is always ready to chat when you need an empathetic friend.” The study explored the intense emotional bonds these users form with the chatbot. It found that the bonds typically begin with a period of frequent, relaxed sharing of superficial information. As users come to feel greater trust in and commitment to the bot, their conversations tend to deepen and include more self-disclosure.
AI enthusiasts highlight the potential benefits of AI in areas such as healthcare. They tell us that to be open to receiving help from AI chatbots, patients will need to be able to trust them and feel comfortable self-disclosing intimate personal details.
But given the potential risks of AI to human well-being, attempts to get us to bond with AI start to look foolhardy.
Research aimed at fostering human-AI bonds continues apace. Only rarely do the authors of these studies grapple with the more critical and fundamental question of what harm this type of bonding may cause to humans over the longer term.
Bradley Murray, DPhil, MEd, is an author and psychoanalyst whose writing explores the pursuit of happiness in the digital age and the effects of social media and digital technology on our mental health and well-being.
Psychology Today © 2024 Sussex Publishers, LLC