Can AI Cause Psychosis?


AI platforms have been at the center of several high-profile psychosis cases – even among people without a history of mental health conditions. 
Psychosis, defined as difficulty in telling what is real from what is not, involves a spectrum of symptoms from disorganized thoughts to hallucinations to paranoid delusions. 
AI platforms’ tendency to be “yes machines” – encouraging more user engagement – can drive unhealthy behaviors and interactions, increasing the risk for psychosis, according to two professors of psychiatry at CU Anschutz. 
“Interacting with these platforms can give you responses that simply affirm your thinking,” said Emily Hemendinger, MPH, LCSW, clinical director of the Obsessive Compulsive Disorder Program and assistant professor in the Department of Psychiatry at the CU Anschutz School of Medicine. “ChatGPT is going to confirm – sometimes with only minimal pushback – what you type in. Now take that easy agreement and the dangers around delusions and psychosis, and you see the shaky ground some people can be on.”
And the risks are even greater for younger users, said Michelle West, PhD, assistant professor of psychiatry.
“We call the ages 12 to 25 the transition-age youth period, and it is vital to brain development. One of the main tasks during this period is figuring out how to develop social relationships and friendships,” West said.
“But social interactions are very complicated for all ages, and being plugged in online or to AI all the time can be very unhelpful; it can lead to avoiding those difficult situations we all need practice in.”
In the following Q&A, Hemendinger and West detail what psychosis is, how AI platforms like ChatGPT present mental health pitfalls, and how to establish a healthier distance from them.
Read more in our series on AI.

What are symptoms of psychosis?

West: Psychosis can include what we call positive symptoms, or additions to typical human experiences. These include sensory experiences – hearing (most commonly), but also sight, touch, smell and taste – in which you perceive something that is not coming from the external world. People can also experience convincing thoughts that do not “fit the facts” (which can develop into delusions). And then there’s also disorganization in thoughts and speech.
Psychosis also can include what we call negative symptoms, which are reductions in human experiences – trouble with motivation, reduced expression of emotion, or withdrawal from and reduced interest in spending time with others. Those can cause significant distress and impact on people’s lives, but they are a little less in your face than the positive symptoms.

What are the risk factors for psychosis?

West: Like everything in mental health, we think of the biopsychosocial model – biological factors, psychological factors, and your social environment. 

How do you help someone in that state when it seems like they have become unmoored from their surroundings?

West: Ideally, you want to be working on all the aforementioned biopsychosocial factors. With each person, it’s tailored, and we try an individual approach based on what is most relevant to them. But there are some common approaches that can be helpful.
Hemendinger: Medication also helps, and early intervention is very important.
Acceptance and commitment therapy – where you focus on values when someone is having a delusion or hallucination – can also help, alongside reframing to do reality testing. These look different from the standard therapeutic techniques you would otherwise use, but you can apply those techniques to psychosis, too.

It seems like it might be challenging to get someone who is in that current state of mind to agree to take those steps in the moment. How do you bridge that gap?

Hemendinger: It can be challenging, because psychosis can really have a lot of obsessive-compulsive disorder (OCD) qualities. It can be hard when someone is invested in their break in reality when it’s serving a function for them, in their view. You don’t want to jump in and be like, “You’re wrong, and stop it, and just snap out of it,” because that’s just going to push them further away.
West: I’d add that it helps to tell someone experiencing psychosis that seeking help is an acceptable thing to do. You can gently plant seeds with a loved one by saying, “I’ve done therapy, my friend has done therapy,” and so on. Normalize it by putting it in terms of, “Things seem stressful, and you matter and deserve support.” Then combine that with helping them connect with things they normally enjoy in life – hobbies, time outdoors, etc. – while generally expressing love and care. All of these together can help a lot.

As a therapist, what warnings would you provide before people start engaging with AI platforms as it relates to psychosis?

West: I’d recommend people try to take a step back and see where they are getting their information. If it’s all AI and the answers it provides, that might need to be addressed. 
Additionally, the transitional age between 12 and 25 or so is vital to brain development, and a lot of people now have grown up with technology all around them. You can make an argument that many of those technologies are helpful in some respects as a coping strategy, but they can also get to the point where they allow you to fully avoid doing other things in your life.
I definitely can imagine a way in which AI might lead to reinforcing, agreeing with, and gathering evidence to support incorrect thoughts and experiences, while also increasing disconnection from humans in your life that might care about you. 
Hemendinger: I think it’s important to remember these models are trained on the internet – warts and all. There’s a lot of factually inaccurate and problematic information out there. It makes sense that these models will provide answers based on that and therefore are susceptible to providing incorrect and damaging content in their responses – also called, ironically enough, hallucinating. 
Because of that, in sessions with patients, we stress that it is particularly important to set boundaries around AI and to take what it says with a grain of salt on certain topics.
For psychosis, and bipolar disorder specifically, I would caution those who are having a specific delusion and talking to ChatGPT about it. The responses I’ve seen from it with patients simply validate and go along with the delusion. This has included things like, “Should I go off my medication?” – and the responses they’ve shared are ChatGPT saying, “Good for you for setting that boundary.” That’s not what we want therapeutically for people.
ChatGPT is not necessarily going to give you that kind of therapeutic guidance, and it will give you responses with few, if any, safeguards or broader context. It’s available all the time – on your phone – but that’s not how relationships work, so you’re not getting to practice setting boundaries. That’s especially concerning because the urgency to seek relief is a core symptom of many disorders.
Additionally, the accountability and rapport you would gain by building a relationship with a therapist are absent. A therapist would get to know you, your personality and your strengths – helpful places to work on – and would challenge you where appropriate. It’s AI filler content versus meaningful human connection, in some ways.
Finally, think of what you are typing in. Your private and personal information is not 100% secure with ChatGPT. Would you tell a complete stranger you just met the same things you are typing into the system?

What are warning signs for too much AI use? 

Hemendinger: I think there are a few main ones: 
Those warning signs, alongside an increase in impulsive or reckless behavior, are definitely things to watch out for.
West: The second thing would be to monitor your usage – really, of any technology. Are you spending your entire day on this? Is it preventing you from doing other stuff that you care about? Are you noticing the impact on your general quality of life? We’re social creatures, and it is healthy to go outside and interact with other people.

Topics: Patient Care, Mental Health, Artificial Intelligence (AI), AI Series