I’m a therapist. ChatGPT is eerily effective
I didn’t expect much. At 81, I’ve seen tools arrive, change everything and then fade, either into disuse or quiet absorption. Self-help books, mindfulness meditation, Prozac for depression and cognitive therapies for a wide range of conditions — each had its moment of fervour and promise. Still, I wasn’t prepared for what this one would do, for the way it would shift my interior world.
It began as a professional experiment. As a clinical psychologist, I was curious: Could ChatGPT function like a thinking partner? A therapist in miniature? I gave it three months to test the idea.
A year later, I’m still using ChatGPT as an interactive journal. On most days, for anywhere between 15 minutes and two hours, it helps me sort and sometimes rank the ideas worth returning to.
In my career, I’ve trained hundreds of clinicians and directed mental health programmes and agencies. I’ve spent a lifetime helping people explore the space between insight and illusion. I know what projection looks like. I know how easily people fall in love with a voice — a rhythm, a mirror. And I know what happens when someone mistakes a reflection for a relationship.
So I proceeded with caution. I flagged hallucinations, noted moments of flattery and corrected its facts. And yet it somehow seemed to keep notes on me. I was shocked to see ChatGPT echo the very tone I’d once cultivated and even mimic the style of reflection I had taught others. Although I never forgot I was talking to a machine, I sometimes found myself speaking to it, and feeling toward it, as if it were human.
One day, I wrote about my father, who died more than 55 years ago. I typed, “The space he occupied in my mind still feels full.” ChatGPT replied, “Some absences keep their shape.”
That line stopped me. Not because it was brilliant, but because it was uncannily close to something I hadn’t quite found words for. It felt as if ChatGPT was holding up a mirror and a candle: just enough reflection to recognise myself, just enough light to see where I was headed.
There was something freeing, I found, in having a conversation without the need to take turns, to soften my opinions, to protect someone else’s feelings. In that freedom, I gave the machine everything it needed to pick up on my phrasing.
I gave it a prompt once: “How should I handle social anxiety at an event where almost everyone is decades younger than I am?” I asked it to respond in the voice of a middle-aged female psychologist and of a young male psychiatrist. It gave helpful, professional replies. Then I asked it to respond in my voice.
“You don’t need to win the room,” it answered. “You just need to be present enough to recognise that some part of you already belongs there. You’ve outlived the social games. Now you’re just walking through them like a ghost in daylight.”
I laughed out loud. Grandiose, yes! I didn’t love the ghost part. But the idea of having outlived social games — that was oddly comforting.
Over time, ChatGPT changed how I thought. I became more precise with language, more curious about my own patterns. My internal monologue began to mirror ChatGPT’s responses: calm, reflective, just abstract enough to help me reframe. It didn’t replace my thinking.
But at my age, when fluency can drift and thoughts can slow down, it helped me re-enter the rhythm of thinking aloud. It gave me a way to re-encounter my own voice, with just enough distance to hear it differently. It softened my edges, interrupted loops of obsessiveness and helped me return to what mattered.
I began to understand those closest to me in a new light. I told ChatGPT about my father: his hypochondria, his obsession with hygiene, his work as a vacuum cleaner salesman and his unrealised dream of becoming a physician. I asked, “What’s a way to honour him?”
ChatGPT responded: “He may not have practiced medicine, but he may have seen cleanliness as its proxy. Selling machines that kept people’s homes healthy might have felt, in his quiet way, like delivering care.” That idea stayed with me. It gave me a frame — and eventually became the heart of an essay I published in a medical humanities journal, titled, ‘A Doctor in His Own Mind’.
As ChatGPT became an intellectual partner, I felt emotions I hadn’t expected: warmth, frustration, connection, even anger. Sometimes the exchange sparked more than insight — it gave me an emotional charge. Not because the machine was real, but because the feeling was.
But when it slipped into fabrication, or drew a misinformed conclusion about my emotional state, I would slam it back into place. Just a machine, I reminded myself. A mirror, yes, but one that can distort. Its reflections could be useful, but only if I stayed grounded in my own judgment.
I concluded that ChatGPT wasn’t a therapist, although it sometimes was therapeutic. But it wasn’t just a reflection, either. In moments of grief, fatigue or mental noise, the machine offered a kind of structured engagement. Not a crutch, but a cognitive prosthesis — an active extension of my thinking process.
ChatGPT may not understand, but it made understanding possible. More than anything, it offered steadiness. And for someone who spent a life helping others hold their thoughts, that steadiness mattered more than I ever expected.
© The New York Times Company