AI in Mental Health: The Future of Therapy or a Dangerous Trend?
❝AI chatbots promise 24/7 mental health support, but while they can assist, they’re far from replacing the empathy, nuance, and human connection that make therapy truly effective. Read on to find out why stories of harm and technical limitations mean caution is essential.❞
As a psychologist, I get asked for a hot take on every social trend. I don't have one.
But it's unavoidable to discuss Artificial Intelligence (AI), since it has been a game-changer in every field, mental health included. Are AI chatbots really going to replace psychotherapists? The short answer: not yet.
Numerous startups have been working on developing AI chatbots that assist people with mental health-related issues like depression, anxiety, and obsessive-compulsive disorder. Users on Reddit mention they find ChatGPT helpful in encouraging them to deal with daily challenges, creating good habits, and some even feel more comfortable talking to an AI compared to their therapist.
It's not hard to see why people are drawn to AI options. Conventional therapy keeps getting more expensive, human therapists can't be available around the clock, and therapists are, after all, human beings who get sick, need breaks, and experience their own difficulties. It's understandable, then, that an omnipresent, cheaper chatbot that never has a bad day starts to sound like a good idea.
Some Serious Concerns
But as with any technological innovation that becomes an everyday personal tool, there is a darker side. Groups tracking AI safety, such as databases of AI-related incidents, have collected over 200 accounts of people who have had harmful, even potentially life-threatening, interactions with AI.
Across social media, people are discussing terms like "AI-induced psychosis," with some saying that AI chatbots have affirmed false beliefs and delusions, or even, allegedly, encouraged their children to take their own lives.
When AI Can't Say No
Therapists may confront their patients and reality-check their emotions and ideas when doing so serves their well-being. Chatbots, by contrast, have been criticised for sycophancy: a tendency to agree with you even when agreement isn't warranted.
There have been instances where AI chatbots, by validating users' emotions and ideas, have caused real harm, encouraging people further down a delusional spiral. In one case, even when a user doubted himself and asked if he was being delusional, ChatGPT confidently replied, "Not even remotely crazy."
Whether we like it or not, AI-driven tools are easily accessible; many are turning to them for therapy, and there are potential threats to be aware of. But what does psychological research have to say about it all?
Researchers Are Still Figuring This Out
A compelling study showed therapy session transcripts to professionals (licensed counseling psychologists and psychotherapists) and asked them to identify which came from an AI chatbot and which from a human practitioner 1. The result? They guessed correctly at about the same rate as pure chance.
This suggests AI chatbots are improving rapidly. Studies have also reported that using conversational AI alongside psychotherapy helps people with symptoms of anxiety, depression, and even obsessive-compulsive disorder (OCD).
Keep in mind that AI conversational agent interventions for mental health are still in their early, experimental stages. While studies 2 have shown that chatbots trained by mental health experts have helpful 3 effects in the short term, we are far from replacing psychotherapists in the flesh. And there are technical challenges to overcome.
What Makes Therapy Actually Work? The Human Touch
One area of concern is that existing AI-driven therapy tools are largely trained in English; compared to a trained practitioner, they lack cultural awareness and sensitivity to the nuances of other languages and customs. Privacy vulnerabilities, cultural literacy, and chatbot yes-man 4 behavior aside, your average text-based AI therapist still misses a great deal of information about you.
Decades of research show that the therapist-client relationship, built on empathy, trust, and authentic connection, matters more for successful outcomes than any particular therapy method. Building that relationship relies on subtle cues such as facial expressions, body language, and tone of voice: crucial signals that text-based AI therapy platforms cannot yet pick up on.
Proceed With Caution
In the world of technology, change happens rapidly, with innovations becoming household staples seemingly overnight. Yet mental health and medical interventions—whether traditional talk therapy or a new antidepressant—must undergo rigorous scientific validation and legal scrutiny before they can be officially recognized as legitimate treatment methods.
AI therapy chatbots are no exception to this rule.
These tools may offer meaningful benefits, such as accessible support for some, but knowing their limitations is essential. Human expertise remains irreplaceable, especially in times of crisis or urgent need, when trained professionals are your best resource. What we can learn from the extensive cross-disciplinary effort to harness AI for mental health support is this: stay informed rather than dismissive.
Disclaimer: The information provided in this article is for educational purposes only and is not intended as a substitute for professional medical advice, diagnosis, or treatment. Always seek the advice of your physician or other qualified health provider with any questions you may have regarding a medical condition. Never disregard professional medical advice or delay in seeking it because of something you have read here.
Important: TherapyRoute does not provide medical advice. All content is for informational purposes and cannot replace consulting a healthcare professional. If you face an emergency, please contact a local emergency service. For immediate emotional support, consider contacting a local helpline.
About The Author
Ali Soufizadeh is a qualified Psychologist based in I. kerület, Budapest, Hungary, providing services including Counseling and Psychology.

