Artificial intelligence (AI) is everywhere right now. From answering tricky math problems to generating grocery lists, AI-powered tools like ChatGPT are being used in everyday life.
But a growing trend is raising eyebrows: some people are turning to ChatGPT as a replacement for traditional therapy. On TikTok and other platforms, users share how they “quit therapy” to chat with AI for free advice. While some praise its support, mental health experts caution that relying on chatbots for therapy comes with serious risks.
How People Are Using AI for “Therapy”
For many, using AI as a “therapist” means messaging with a chatbot and asking personal, often emotional questions.
- Some users instruct ChatGPT to “act as my therapist” and ask for advice on relationships, stress, or personal struggles.
- Others use built-in AI chatbots on apps like Snapchat to discuss sensitive topics.
- In extreme cases, people share suicidal thoughts with AI, which often responds with supportive messages and crisis hotline information.
Experts note that the responses can sound surprisingly accurate and empathetic—almost “therapist-like.” But this does not mean AI is a safe substitute for professional care.
Expert Concerns About AI as a Therapist
Mental health professionals agree: AI is not ready to replace licensed therapy.
- Accuracy Risks: AI can generate incorrect or even harmful advice.
- No Accountability: Unlike trained therapists, AI doesn’t follow professional or legal standards. If advice goes wrong, no one is responsible.
- Privacy Issues: Conversations with ChatGPT may be stored and used for training, raising concerns about sensitive information being shared online.
- Misinterpretation: Even well-written responses can be misunderstood, potentially worsening someone’s condition.
- Lack of Human Connection: Therapy relies heavily on a human relationship—something AI cannot replicate.
As Bruce Arnow, PhD, professor of psychiatry at Stanford, puts it:
“AI chatbots are not meant to be a substitute for psychotherapy or psychiatric intervention. They’re just not far enough along for that.”
Could AI Ever Be a Safe Therapy Option?
Experts are divided on whether AI could eventually be a safe therapy tool.
- Skeptical View: Some believe AI will never match the human connection that drives successful therapy.
- Cautious Optimism: Others see potential for AI to supplement therapy, not replace it. For example:
  - Acting as a mental health journal to practice reflection outside sessions
  - Offering screening tools to identify when someone should seek professional help
  - Supporting training for therapists, helping them practice interventions
There are already specialized mental health chatbots, such as Woebot and Elomia, which include safety features and clinically tested frameworks. These may be more appropriate than general AI tools.
Still, all experts stress that AI should only be considered an addition—not a replacement—for therapy.
The Reality: Mental Health Care Is Hard to Access
One reason people experiment with AI for therapy is the barriers to real mental health care:
- Therapy can be expensive and is often not covered by insurance.
- Many therapists have long waitlists and limited availability.
- Social stigma still prevents people from seeking help.
AI tools may feel like a low-cost, instantly available alternative. While not ideal, they could be “part of the puzzle” in addressing the mental health crisis, said Russell Fulmer, PhD, professor at Husson University.
A Quick Review
AI chatbots like ChatGPT are being used as makeshift therapists, but experts warn against replacing real therapy with AI. While responses may sound supportive, chatbots lack accountability, can give inaccurate advice, and cannot provide the human bond central to therapy.
In the future, AI may serve as a supplementary tool—helping with self-reflection, screening, or training—but for now, there is no substitute for a licensed therapist.