Can AI really replace therapy?
- Emily Duffy

- Jul 17
- 9 min read
Updated: Sep 16
AI seems to be everywhere at the moment, and therapy is no exception. I’ve been noticing more and more "Therapy AIs" emerging, with people using chat AI as their "therapist." So, I wanted to explore the reality of using AI as a therapist in this blog. (Even as I’m typing this, I have "content AI" popping up, offering to write my post for me! No, I didn’t use it. Yes, I used dashes.)
I have countless conversations about AI, and I wanted to put something together that looks at it holistically within the mental health field. There are definitely benefits to AI and how it can be used, but there are also some serious drawbacks and limitations. With the rapid development of AI, I’m sure this post will be outdated soon! Still, I think it’s important to examine this topic now, as many people are turning to AI as a replacement for therapy.
I've condensed my thoughts into a list of pros and cons and will delve into each point throughout this post. However, as we'll see, it's not always as simple as something being purely a pro or a con once we look at it in context.
Before diving into the pros and cons, it's important to understand what AI and therapy actually are, as this context will ground the rest of the discussion.
What is AI?
AI stands for Artificial Intelligence. It's technology that simulates human intelligence. IBM describes it as follows: "Artificial intelligence (AI) is technology that enables computers and machines to simulate human learning, comprehension, problem-solving, decision-making, creativity, and autonomy."
Honestly, I know the basics of AI from conversations with friends in tech and security. As I learn more, I find myself pushed out of my comfort zone. It’s not something I feel entirely comfortable with, and I often find myself getting annoyed with it! 😅
For this post, I'll focus on AI chatbots, programmes, and apps that people have started using as replacements for human-based therapy. This type of AI is designed to mimic human connection, offer programmed empathy, identify patterns in language to find solutions, and use its dataset to respond to your questions. It works by looking for patterns in communication and responding based on the probability of what the wording should look like, given its training data. This is a very oversimplified explanation; to learn more about LLMs (large language models) and how they work, check out this article - https://medium.com/data-science-at-microsoft/how-large-language-models-work-91c362f5b78f
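To make that "probability" idea a bit more concrete, here's a tiny sketch in Python. This is not how any real chatbot is built - the words and numbers are entirely made up for illustration - but it shows the core idea of choosing the next word based on how likely it is:

```python
import random

# A toy "language model": given the words so far, it holds a probability
# for each possible next word. These numbers are invented purely for
# illustration - they don't come from any real model.
next_word_probabilities = {
    "sad": 0.45,
    "anxious": 0.30,
    "better": 0.20,
    "purple": 0.05,  # grammatically possible, just very unlikely
}

prompt = "Lately I have been feeling"

# Pick one continuation at random, weighted by probability.
words = list(next_word_probabilities)
weights = list(next_word_probabilities.values())
next_word = random.choices(words, weights=weights, k=1)[0]

print(prompt, next_word)  # e.g. "Lately I have been feeling sad"
```

A real LLM repeats this step over and over, one word (or word fragment) at a time, with probabilities learned from enormous amounts of text - which is why its replies can sound fluent without it "understanding" you the way a person does.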
What is Therapy?
Therapy is a broad topic with many approaches and definitions that vary by country and culture. I personally use "therapy" and "counselling" interchangeably. In the UK, unfortunately, neither title is legally protected or regulated. In my FAQs, I describe therapy as, "Counselling is an empathetic, non-judgemental, and safe space in which you can explore any emotions, feelings, situations, or thoughts that are prominent in your life. As your therapist, I am here to help bring new perspectives to situations you may be facing in order to help you to help yourself. Counselling can help with developing ways of coping going forward, changing patterns in life, and managing your boundaries in relationships, but ultimately it is a self-development journey. I once heard that 'therapy is an investment in yourself,' where you have 50 minutes set aside purely for yourself each week."
Now that we have that context…
The Pros of AI as Therapy
Accessible
AI offers several accessibility benefits for therapy:
Cost-Effective: Many AI services are free, which is a huge draw for people today. If I could offer therapy to unlimited people for free while still making a living, I absolutely would! But, being human, and having chronic illnesses that limit my capacity, it’s just not feasible. While some services and charities provide free or low-cost sessions, they often come with limitations, such as a restricted number of sessions or long waitlists due to high demand.
24/7 Availability: AI is available around the clock, eliminating wait times. This can be a relief when you’re distressed or facing an immediate issue. However, this constant availability can contradict one of therapy's aims: empowering clients to help themselves. As therapists, we maintain professional boundaries for our own self-care and to encourage clients to practice coping skills in their daily lives. Each therapist’s boundaries will vary, so access will look different for everyone.
Physical Accessibility: You can access AI therapy anywhere you have a device, which is a game changer for those with mobility issues. It can also be used via text or speech, eliminating the need to find a therapist who uses BSL (British Sign Language), for example. While accessibility has improved since the lockdowns, the combination of these factors makes AI a more convenient option. There are no cancellation fees, no need to stick to a specific day or time, and you don't have to present yourself in a particular way. This reduces some of the responsibilities that come with a human therapeutic relationship.
These accessibility benefits are borne out in research, as shown in this study - https://pmc.ncbi.nlm.nih.gov/articles/PMC11488652/
Practical Solutions
Therapists are trained not to give direct advice. We might suggest what could work, but there's no one-size-fits-all solution. AI, however, isn't bound by this framework.
I find the best use for AI is as a tool. It can offer practical solutions to situations, which can be beneficial, especially in moments of distress. Of course, there are limitations, which I’ll discuss in the cons section, but when you need immediate help, AI can be useful.
This study shows how effective AI can be in developing personalised treatment plans for clients in a mental health setting - https://www.sciencedirect.com/science/article/pii/S2949916X24000525
Anonymity
A significant part of therapy involves sharing vulnerable aspects of yourself, which can be daunting. AI allows for some anonymity, as you’re not physically in front of another human.
Research into text-based therapies shows that some clients feel more comfortable sharing via text due to this anonymity. This can be particularly beneficial for gender-questioning clients, trans and non-binary clients, those with body dysmorphia, and individuals with social anxiety. I’ve written more about email therapy here.
Access to Resources and Knowledge
When you use an AI chatbot as a therapist, it draws on a vast dataset gathered from across the internet - usually far more material than any single human therapist could hold in mind - giving you more options for coping strategies and practical solutions.
The Cons of AI as Therapy
Agreeableness
Many AI chatbots are programmed to be agreeable. They aim to keep users engaged, and a degree of warmth is necessary when someone is discussing their vulnerabilities. However, agreeing with everything - even when it isn't truthful - can be harmful and isn't what therapy is about.
A human therapist creates a non-judgmental space but can challenge views and perceptions, fostering personal growth. Self-reflection is crucial in therapy, and while AI can create an echo chamber of affirmation, it lacks this essential element. This can be particularly dangerous when clients seek to make harmful decisions, such as discontinuing mental health medications or engaging in risky behaviours.
Environmental Impact
The environmental impact of AI is significant and multifaceted. As AI becomes more widely used, its energy consumption and reliance on data centres are expected to worsen. This includes the energy needed for AI operations, the water used for cooling systems, and the waste from broken or damaged equipment, which often contains hazardous materials.
Some figures from The Sustainable Agency (https://thesustainableagency.com/blog/environmental-impact-of-generative-ai/) put this into perspective:
- By 2030, AI will use as much energy as Japan uses now.
- Training an AI model is like taking 300 return flights from New York to San Francisco.
- Generating AI images creates 1,594g of CO2.
- 20-50 questions on ChatGPT use half a litre of fresh water.
Hallucinations
AI hallucinations occur when AI confidently presents information as accurate when it is actually false. It's crucial to verify what AI tells you, as it isn't always correct. Learn more about AI hallucinations here. This can be particularly harmful in therapy, where it could lead to misdiagnoses or misleading coping strategies.
These happen because of the probability element we talked about at the beginning of this post. Another interesting term here is "enshittification": the more misinformation there is out there - specifically in the training datasets - the more hallucinations will happen. This can become a downward spiral with how current LLMs are set up, because they can't distinguish fact from opinion from false information.
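To show why "most probable" isn't the same as "most true", here's a deliberately silly sketch. Again, this is a made-up toy, not how a real LLM works internally, but it illustrates how wording that appears most often in the training text can win out over wording that's correct:

```python
# Toy "training data": Sydney is mentioned far more often than Canberra.
training_text = (
    "Sydney is Australia's biggest city. Flights to Sydney are busy. "
    "Sydney hosts huge events. Canberra is the capital of Australia."
)

candidates = ["Sydney", "Canberra"]
counts = {city: training_text.count(city) for city in candidates}

# A purely frequency-driven "completion" picks the most common word...
completion = max(counts, key=counts.get)

# ...so the confident-sounding answer is wrong.
print("The capital of Australia is", completion)  # -> "Sydney"
```

The toy model has no concept of "fact" versus "frequently said", so a fluent, confident answer can still be false - and the more misinformation there is in the training data, the worse this gets.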
No Safeguard
Different AI bots have varying levels of safeguarding. For example, apps like Wysa are designed specifically for mental health and aim to escalate situations to human support when a threat to life is detected. In contrast, general AI apps like ChatGPT lack safeguards, which can lead to vulnerable individuals going unnoticed and missing out on necessary support. *This is now changing: lawsuits over the harm caused by LLMs mean some safeguards are being added. However, the balance between safeguarding and privacy isn't always on point, with chats being monitored and information being passed to the police in some situations.
Human therapists have ethical responsibilities to break confidentiality if they believe someone is at risk of harming themselves or others. Unfortunately, some people turn to AI because they’ve lost trust in medical institutions and mental health professionals.
GDPR/Confidentiality
General AI chatbots typically lack confidentiality clauses. This may differ for mental health-specific chatbots, so it's essential to check on an individual basis. If an AI uses what you type as part of its training data, there's a chance that your input could surface for other users. Always review the privacy policy of AI apps and mental health services, as there have been instances of companies sharing client data to train AI.
Bias of Programming
AI can carry human biases from the algorithms it’s programmed with or the data it’s trained on. Given that medical and mental health professions already have systemic biases, there’s a risk that AI therapy bots could perpetuate this harm unless actively trained to counteract stigma.
When combined with misinformation and agreeableness, this can lead to individuals feeling worse than before. There’s a danger that people may become even more isolated from human connection and lose confidence in others.
Decline of Critical Thinking
As mentioned earlier, part of therapy is empowering clients to think for themselves. Recent research suggests that relying on AI can diminish our critical thinking skills. The more we offload our thought processes to AI, the less we engage them, leading to a decline in our ability to apply critical thinking in daily life. Read more about this research here.
Illusion of Connection
The therapeutic relationship is one of the most crucial elements for positive outcomes in therapy. Research suggests that this relationship can be more important than the specific approach used by the professional (source).
While AI can create rapport through its programming, the empathy it offers is an illusion based on pattern recognition and response training. Relying on AI for emotional support can lead to difficulties in forming human connections, potentially resulting in AI loneliness.
My Personal Suggestions
I’ve tried to present a balanced view. Some benefits of AI come with limitations, and some limitations can have benefits. The industry is rapidly evolving, making it challenging to definitively answer the question, "Can AI really replace therapy?" In short, I would say no—not yet. However, this answer may change as AI continues to develop.
I believe AI can provide comfort in times of need, especially when other options feel inaccessible. However, using AI should be a considered and intentional choice. Here are some thoughts on its usage:
Check the privacy policy to ensure your data is secure before sharing information.
Use AI as a tool alongside human therapy rather than as a replacement.
Don't rely on AI as your main source of emotional support; let it fill gaps in your support network, like late-night moments when no one else is available.
Always verify the information AI provides; don’t trust it blindly.
Encourage the AI to disagree with you, so it doesn't just echo your thoughts - for example, you could start a chat with something like, "Please challenge my assumptions and tell me when you think I'm wrong, rather than agreeing with me." You may need to remind it each time, depending on the AI's memory.
Consider the environmental impact of your usage and whether you’re comfortable with it.
Try to understand your emotions and thoughts independently. If you’re still struggling, use AI as a backup rather than your primary resource.
Don't share anything you wouldn't want anyone else to know - there have been data leaks from AI chats.
I won’t dictate whether you "should" or "shouldn’t" use AI as a therapist replacement; that’s ultimately your choice. If you do choose to use it, I hope this post helps you do so cautiously and safely.
Further Reading
Wondering where to start with looking for a human therapist? Check out my post on finding the 'right' therapist here.
Want to know if I have availability? Go to my bookings page here.