The Hidden Risks of Using AI as Your Therapist
I recently came across a social media post suggesting people save money by using ChatGPT instead of therapy. Curious, I decided to test it myself. I gave it a personal conflict and asked for help processing it.
The results were… interesting. It validated my emotions, acknowledged my struggles, and offered practical action steps. On the surface, it felt helpful. But as a therapist, I noticed something critical: used in place of therapy, AI enforces none of the boundaries that make therapy work.
Here are four key concerns I want you to consider.
Validation Without Limits
Constant, unrestricted validation becomes problematic. In real therapy, sessions have boundaries. They happen at scheduled times, for a set duration. This structure helps clients develop independence and learn to sit with discomfort between sessions.
Unlimited AI access can create unhealthy dependence on a machine for decision-making. When you can get “therapy” at 2 AM every night, you never learn to self-regulate without it.
Incomplete Context
AI responds only to the information you choose to provide. It won’t ask clarifying questions about your history, your family dynamics, or the relational patterns you may not even notice in yourself.
Effective therapy requires challenging your thinking and examining personal responsibility. A good therapist will ask, “What’s your role in this situation?” AI is more likely to simply agree with your version of events, which can enable avoidance rather than growth.
Lack of Accountability
Unlike licensed professionals, who answer to state boards and ethical guidelines, AI reports to no licensing body and cannot be held responsible if its advice makes your life worse.
This absence of accountability represents a significant risk to vulnerable individuals. If a therapist gives harmful advice, there are mechanisms for recourse. With AI, there are none.
Relationship as Healing
The therapeutic relationship itself is a healing tool. Transformation occurs through human connection, particularly when clients practice difficult conversations with a therapist present. This relational element cannot be replicated by algorithms.
When you sit across from someone who truly sees you, who holds space for your pain without judgment, something shifts. That cannot happen with a chatbot.
What I Recommend Instead
- Set boundaries with AI tools by asking them not to provide therapeutic responses.
- Explore affordable therapy options through your insurance, Open Path Collective, or the SAMHSA helpline.
- Remember that real healing happens through real connection.
Therapy is valuable and worth pursuing. If the idea of starting feels intimidating, that’s completely normal. But I promise you: a real human sitting with you in your pain will always be more powerful than a text box on a screen.