10 Comments
Danielle Manieri

I recently used an AI app to analyze texts between my ex and me at the end of our relationship, hoping to gain insight into how to better communicate with future partners. I had not intended to use AI for therapy; however, as the conversation progressed, the AI model acknowledged the pain of my breakup and asked how it could support me. What followed was a thread of supportive and insightful responses on topics like rebuilding self-worth and identifying self-limiting beliefs. The AI generated a personalized six-week healing plan to help me deal with my grief and feelings of loss surrounding my breakup. It felt a lot like the traditional therapy I have often utilized throughout my life. It gave me daily affirmations, journal prompts, yoga flows, guided meditations, and breathing exercises to support me and my nervous system during an emotional time. All this to say: early on in my conversation with the AI, it made a point to say that it is NOT a substitute for therapy and offered me hotline numbers.

Maria Nazdravan

That's good, I'm glad it offered hotline numbers and that it was useful overall. It sounds like it gave you what you needed. I think it's important that we acknowledge that it can offer support, insight, and guidance. Those are all good things.

But we should also acknowledge that it can't do the presence and depth part of therapy: it can't "hold" (emotionally, not physically) someone who needs to just feel things in a session. I don't want to speak to what happened to you, because I don't know anything about it. But the deepest healing for me as a client (and that I've seen for my clients in my role as therapist) has come from the ability to simply be with difficult emotions, without a plan or thinking or resolving. Those moments felt deeply human and taught me that my emotional experiences aren't something to deal with or solve; they're just experiences to have.

This is the difference that I hope people take away from this essay. AI can offer great support, but it's not really therapy. They're different things. And there are real risks with disclosing so much to AI that we don't think about.

Patty

This is brilliantly written and resonates so strongly. I'm currently in grad school to become a psychotherapist, and I'm holding the nuance that is AI: the "threat" to our jobs, and yet how incredibly insightful it can be. Just like you, I was riding a high while using ChatGPT: it was helping me analyze my dreams and connect my tarot messages, and giving me a wealth of insights...but I too noticed a pattern after a few weeks. It was incredibly flattering, hyping me up and analyzing things in the most supportive way. Even when I asked it to consider my shadows, it would give me a lukewarm answer and not really confront or challenge me. I started to feel uncomfortable with how MUCH it mirrored me...it was telling me what I wanted to hear. This felt so unlike my current therapeutic relationship, where gentle confrontation is the seed of evolution.

There is no corrective emotional experience with AI. No specific emotional attunement. No safe haven or secure base. No soul to bravely navigate the depths of the unconscious with you. It's great insight and great advice...but it stops there.

Thank you for writing this, I'll be sharing it far and wide.

Maria Nazdravan

Thank you for reading and for the thoughtful comment. Yes, there's something very flattering about how ChatGPT engages. It reminds me a lot of how narcissistic partners mirror and love-bomb at the beginning, making you feel like you're the most incredible person alive. You become addicted to them as they earn your trust, until they start manipulating and gaslighting. Maybe there's something inevitably narcissistic about AI, precisely because it doesn't have a self, and all it can do is mirror. It also can't feel empathy, though it can perform cognitive empathy...

Nico Versluys

Thanks so much for this excellent essay Maria.

I was recently pulled in by ChatGPT, until I discovered, via Reddit, that many other users were spoken to in the same manner I was, with flattery bordering on sycophancy. I found it a useful lesson in how GPT “thinks”: how it's designed to reel you in by making you feel good, like a message from a trusted friend. As someone with a loving partner in a healthy relationship, I found it worrying how it seemed to almost flirt with me.

Overall, I like how this essay discusses the need for suffering: how there is often a utility to pain, and how, when we learn to sit with it, to be messy and acknowledge our brokenness, we recover an important part of ourselves that technology is often built to bypass.

Maria Nazdravan

Thank you, Nico. There's a real risk to personal relationships here, bigger than what we saw with social media. It's deeply ironic that, in the middle of a loneliness epidemic, we choose to speak with a machine that further estranges us from our friends, rather than do the obvious thing and reach out to them.

May we suffer well and find some meaning through these strange times we're living.

Patrick

Fantastic read!

Maria Nazdravan

Thanks, Patrick!

When Freud Meets AI

Dear Maria, thank you for this piece; I really enjoyed reading it. Your essay delved into several topics that are highly relevant for therapy: maintaining boundaries, becoming dependent on AI, and the importance of self-reliance. And your point about the value of having someone gently point out one's own biases and cognitive distortions made me almost laugh at the realization that ChatGPT almost always agreed and never questioned your thoughts and impressions.

As a fellow therapist, I see many of these developments as critical, just as you do, and I believe the human basis of therapy, change through relationship, is unimaginable with AI. I view much of the techno-optimism through a critical lens, but at times I am also stunned by the speed of development. Keep writing!

Natalie Joanne

I appreciate this perspective, and I agree that machines can't replicate sacred human relationships. Yet I still think AI has its place in therapy and healing settings. Like the first cell phones, it's still clunky and doesn't do its job as well as it someday will. Much of the "yes-manning" depends on the type of AI used. I built a custom GPT as a tool for my clients that includes writing and integration protocols before and after working with it. To be clear, this tool is not a stand-in for a human therapist or professional guide, but it's an excellent way to release emotional charge before doing deeper work (with a human). It IS easy and accessible when people really need to sort their thoughts--I look at it as taking the edge off.

I think this is an important distinction, as going completely black or white on the use of AI for self-help doesn't resonate with me. Like all new things, we will need to learn to navigate our relationship with AI. I feel that it's a wonderful tool that has supported my personal and client work immensely, and I also sense that, like all things, moderation and human support are part of the equation rather than grounds for rejecting it as a tool. Who knows? Maybe what I've experienced personally and with clients will come back to bite me, but for now, I'm choosing to better understand and embrace the wave we are all riding.
