Support Between Sessions: A Safer Way to Use AI in Your Healing

“As a psychotherapist—and an ex-software engineer—I’ve noticed a growing trend: clients are using AI tools like ChatGPT between sessions to help process emotional and relational issues. My background in tech leads me to be naturally curious about emerging tools like AI and eager to understand how they can be used effectively and ethically.”

- Emma Curtis, LMFT


Sometimes a client tells me they used GPT to script a conversation after a conflict with their partner. Other times, someone turns to it in a moment of distress to find strategies for calming down. And more often than not, they come in saying:

"I asked ChatGPT..."

On the one hand, I’m struck by how proactive and resourceful this is. On the other, I worry. While AI can generate seemingly helpful advice, it isn’t a therapist. This article is about how to use ChatGPT thoughtfully and safely—especially if you’re navigating emotional challenges between therapy sessions. It’s not an endorsement of AI as therapy, but a guide to harm reduction.


What You Need to Know About How AI Works

ChatGPT doesn’t think or feel. It’s not conscious or wise. It predicts text based on patterns in language drawn from the internet and other training data, not from lived experience. It can sound insightful because it’s trained to mimic human responses, but its guidance isn’t grounded in real-world accountability or therapeutic context.

It doesn’t know you. It doesn’t track your progress. It can’t recognize nuance, body language, tone, or the deeper meanings under your words the way a therapist can.

Understanding this can protect you from overestimating its abilities or mistaking polished output for personalized care.


The Risks of Using AI for Therapeutic Advice

  • Misinformation – ChatGPT sometimes "hallucinates" facts. It may confidently suggest a coping tool or diagnostic label that isn't accurate or supported by research.

  • Lack of context – Without the depth and history of your story, GPT can offer generic or mismatched suggestions that don't align with your therapeutic goals.

  • False reassurance – When you're distressed, GPT can offer comfort, but it can't assess risk or safety. This is especially dangerous in crises.

  • Undermining the therapeutic process – If you start depending on AI to do emotional heavy-lifting, it can short-circuit the deeper work that therapy invites.


The Potential Benefits (When Used Thoughtfully)

Despite its flaws, AI can support emotional regulation, insight, and communication if you know how to use it. Think of it as a supplement—not a substitute.

  • Self-reflection: Prompting GPT with questions like "What are common triggers for anxiety?" can offer starting points for understanding yourself.

  • Script drafting: Clients working on assertive communication sometimes use it to draft messages. When guided by therapeutic goals, this can be helpful.

  • Skill reminders: You can ask GPT to remind you of CBT (cognitive behavioral therapy) or DBT (dialectical behavior therapy) tools—especially if you're already working with them in therapy.

  • Modality alignment: If your therapist uses a specific framework (like Emotion Focused Therapy), you can tell GPT to filter its suggestions through that lens.


Harm Reduction Tips

  • Never use GPT in a crisis. If you're in danger or thinking of harming yourself, contact a human support line or therapist immediately.

  • Sense-check responses. Ask: "Does this align with what my therapist and I have talked about?" or "Would I recommend this to a friend in the same situation?"

  • Don’t assume accuracy. GPT may use psychological terms incorrectly. Always cross-reference with reliable sources or bring it to your therapist.

  • Avoid over-reliance. If you find yourself turning to GPT daily for emotional validation or decision-making, that’s a cue to bring it up in therapy.


How to Get Better Responses

  • Be specific: The more detail you give, the more relevant the response. Example: “I’m in couples therapy and we use EFT. How can I express vulnerability to my partner after a fight?”

  • Name the modality: If you’re using a certain therapeutic style (CBT, DBT, EFT), tell GPT. It can tailor responses more accurately.

  • Ask for evidence-based tools: Use prompts like “What are some evidence-based strategies for managing panic?” to guide it toward stronger responses.

  • Clarify your intent: Let GPT know if you're looking for emotional support, communication help, or self-regulation techniques.


Conclusion: AI Can Support, But Not Replace

There’s something hopeful in watching people try to heal—using whatever tools they have access to. ChatGPT can offer language, frameworks, and a kind of mirror. But it’s still a tool, not a therapist.

Use it thoughtfully. Bring your findings into therapy. Let it support your growth—not redirect or dilute it. The real work still happens in relationships—with yourself, with others, and with your therapist. At New York Integrative Psychiatry, we’re here to meet you in that work.

Whether you’re curious about ketamine therapy, talk therapy, or integrative approaches to mental health, our team in Manhattan is ready to walk alongside you. Schedule a session today—and let’s keep moving toward what’s possible, together.
