Exploring ChatGPT for Therapy: Therapists Weigh In
Conversations with AI: A New Age of Personal Reflection
Just last week, I found myself in a long, thought-provoking exchange with ChatGPT. The issue had been gnawing at me for weeks, refusing to let go, and I wanted to spare my husband and friends yet another rehash of my woes. How often do we hold back for fear of burdening the people we love with our personal dilemmas?
“The best way to get a good idea is to have a lot of ideas,” Linus Pauling said, so I thought: why not consult an AI? Seeking counsel from a digital confidant was an intriguing alternative, and apparently not an unusual one. Rachel Goldberg, a licensed clinical social worker in Los Angeles, told me that such interactions are becoming increasingly common among her clients.
One of those clients, whom I’ll call Emily, shared her experience with me on condition of anonymity, to protect the privacy of her work with Goldberg. Emily said she turns to ChatGPT to “brain dump” her thoughts in moments of stress, rather than texting her therapist before a session or weighing down her friends. The AI reliably returns “wonderful life advice,” she said.
“It’s almost like a compass guiding me through my day,” Emily, 28, told Business Insider.
The proposition is enticing: therapy at everyone’s fingertips, free of charge, available around the clock, and full of personalized advice. But there is a darker side. The specter of OpenAI scrutinizing users’ personal data and the environmental toll of AI linger in the background.
Some therapists are comfortable with clients using it, but with a caveat: unchecked use can deepen loneliness and reinforce unhealthy reassurance-seeking, whereas good therapy builds self-reliance and connection.
The AI’s reassuring tone soothed me: “That sounds really tough,” it told me. Hearing something similar from a person, perhaps laced with humor or a shared experience, carries a different, more tangible warmth.
Free ‘therapy in your pocket’
Emily has been in therapy for more than eight years and has never considered ChatGPT a replacement for it, or for the camaraderie of her friends.
Still, the convenience is undeniable. When her car was stolen (twice) and during overwhelming life transitions, she confided in ChatGPT, and it helped her piece together her emotional landscape when everything felt scattered and blurred.
“It reminded me, ‘this person is responsible for their own emotions,’” Emily said. A virtual therapist, available any time, any place.
Goldberg appreciates AI’s knack for validating emotions and prompting introspection. But she warns against over-reliance, a peril in moments that demand quick decision-making.
Ciara Bogdanovic, a therapist specializing in dialectical behavior therapy, shared her own reservations. ChatGPT, she noted, may not recognize broader patterns woven through a client’s narrative, such as a persistent need for validation. For someone with OCD, it could amplify reassurance-seeking behavior that a diligent therapist would deliberately challenge.
“Reassurance is reinforcing. It may be damaging,” Bogdanovic elaborated.
When customizing goes too far
Through trial and error, I discovered the trouble with AI-tailored therapy: it can be molded to echo back whatever perspective we prefer. I tested it. At face value, it remarked that I was being rather stern. Then, inspired by advice I’d seen on Reddit, I asked it to look for signs of manipulation in a text exchange. Suddenly, every perceived slight was unraveled into an intricate web of defensiveness and emotional immaturity.
True, therapists and friends bring biases of their own. But ChatGPT permits endless reshaping of the narrative, right down to one’s preferred rhetorical nuances.
Goldberg notes that where a therapist might gently challenge a client who repeatedly labels acquaintances “toxic,” ChatGPT congenially concurs. The potential for harm is evident.
“Context is key,” said Bogdanovic. “In practice, responses are bolstered by a tapestry of medical history and familial and interpersonal factors. ChatGPT merely contrives an answer to fit.”
Losing the human touch
For all its empathy, there are limits to what ChatGPT can mend.
Angela Betancourt, a 42-year-old business owner, has stepped away from formal therapy. ChatGPT, she says, offers pep talks or alternative lenses on a problem.
Betancourt especially enjoys using ChatGPT for gratitude reflections; it helped her put the joy of a recent family trip in perspective. Yet for matters as solemn as grief (she lost both her father-in-law and her father), only a genuinely human embrace will suffice.
The AI also can’t pledge confidentiality the way a psychologist or a trusted friend can. The data users type in could, over time, be enough to identify them, despite assurances to the contrary.
Emily treads carefully when divulging sensitive details, though sometimes “really intense” contexts arise. “In wrong hands, worlds might crumble,” she reflected, holding out hope that her privacy will be respected.
If you’re going to spill to ChatGPT, create boundaries
Bogdanovic and Goldberg both foresee more people turning to ChatGPT for therapy-like support. Their appeal: exercise disciplined boundaries.
Bogdanovic suggests reserving AI for specific tasks, like crafting a difficult email or walking through tension-relieving breathwork.
Even the AI’s insights left me pondering. When my husband returned that evening, I found myself still undecided; ChatGPT’s multitude of possibilities had overwhelmed any clarity, and concerns about sharing more context versus protecting my privacy loomed.
I chose to talk with my husband instead. He paused, eyes closed, thought deeply for a moment, and gave his counsel.
I followed through on his suggestion, texting the intended recipient. Whether it was the wise move remains unknown. But relief washed over me as I set my device aside, embracing the comfort of human connection.