
Can AI really replace therapy?

Updated: Aug 7

AI seems to be everywhere at the moment, and therapy is no exception. I have been seeing more and more "Therapy AIs" come out, and more people using chat AI as "their therapist", so I wanted to explore the reality of using AI as a therapist in this blog post (even as I'm typing this I have "content AI" popping up offering to write my post for me!! No, I didn't use it. Yes, I use dashes).


I have so many conversations about AI, so I wanted to put something together that looked at it as a whole for the mental health field. There are absolutely some benefits to AI and how it can be used, but it also has some pretty serious downfalls and limitations. I'm sure, with the rate at which AI is being developed, this post will be outdated in the near future! However, I think it's still important at this point in time to look at this topic when so many people are turning to AI as a replacement for therapy.


I've condensed my post into pros and cons lists and will go into detail on each of these points in the body of this post. However, as you'll see, it's not always as simple as something being a pro or a con when we look at it in context:

Pros:

  • Accessible
  • Practical Solutions
  • Anonymity
  • Access to Resources/Knowledge

Cons:

  • Agreeable
  • Environmental Impact
  • Hallucinations
  • No Safeguard
  • GDPR/Confidentiality
  • Bias of Programming
  • Decline of Critical Thinking
  • The Illusion of Connection

Before I get into the Pros and Cons, I think it's important to look at an overview of what AI is and also what Therapy is so that we have the context in mind.


What is AI?

AI is short for Artificial Intelligence. It is essentially technology that simulates human intelligence. IBM describes it as "technology that enables computers and machines to simulate human learning, comprehension, problem solving, decision making, creativity and autonomy."


I'll be honest: I know the basics of AI from talking to people I know who work in tech and tech security. From those conversations, and seeing how integrated AI is becoming, it's something I am trying to learn more about, even though it pushes me out of my comfort zone; it's not something I feel comfortable with, and I actually find myself getting annoyed with it 😅


For the purposes of this post I will be talking about AI chatbots/programmes/apps that people have started to use as a replacement for human-based therapy. This is AI that is trained to mimic human connection, offer what it is programmed to present as empathy, find patterns in wording to look for solutions, and use its dataset to give responses to the questions you ask it.


What is Therapy?

Therapy is also a broad topic, as there are a multitude of approaches to it and it is defined differently depending on the country and culture you come from.


I personally use "therapy" and "counselling" interchangeably as, unfortunately, in the UK there isn't any legal regulation of these titles. In my FAQs I describe therapy as, "Counselling is an empathetic, non-judgemental, and safe space in which you can explore any emotions, feelings, situations, or thoughts that are prominent in your life. As your therapist, I am here to help bring new perspectives to situations you may be facing in order to help you to help yourself. Counselling can help with developing ways of coping going forward, look at changing patterns in life, look at managing your boundaries in relation to other relationships, but ultimately is a self-development journey. I once got told 'therapy is an investment in yourself', where you have 50 minutes set aside purely for yourself each week."


Now that we have that context...


Let's get into the Pros of AI as therapy:

Accessible - A few things fall under the accessibility of AI when it's used for therapy.

  • It is generally free, which is a huge draw for many people in the current climate of the world. If there were a way for me to offer therapy to unlimited people for free whilst still making a living I absolutely would, but obviously that is not possible: a) I'm a human, b) I'm a human with chronic illnesses, which limits capacity even further, and c) I don't have the funding. Whilst there are services, charities, and individuals who do offer free or low-cost sessions, they do come with limitations, such as the number of sessions you can access or a waiting list due to the demand on the service. This leads me on to point 2...

  • It is available 24/7 - there's no waitlist!! You don't need to wait to speak to someone, which can feel like a relief when you're distressed or faced with an immediate problem that you want a resolution for. Whilst this is useful in a lot of cases, in my opinion it does go against one of the aims of therapy, which is to empower the client to help themselves. A big thing as a therapist is holding professional boundaries - we're not accessible 24/7 for a reason - partly for our own personal space and self-care, but also to allow clients to access support in their personal lives and/or to put coping skills and management techniques into practice, learning to support themselves. Of course, there will be nuance here: each therapist's professional boundaries, and your access to them, will look different.

  • It is in your pocket - the literal physicality of its accessibility is an advantage. You can access it anywhere you need it (providing you have a phone/laptop/desktop in reach), which is a game changer for those who have mobility issues. It can also be used via text or speech, so it doesn't require you to find a therapist who can use BSL, for example. Whilst this has improved in therapy services, particularly since the lockdowns, when all three of these accessibility points are combined it does make AI the easier choice - there's no cancellation fee if you can't make the session, you don't have to stick to a day or time, you don't have to go anywhere or present yourself in any particular way, and it takes away some of the responsibility as a client that would be present in the therapeutic relationship with a human therapist.

  • These accessibility benefits are explored in this study - https://pmc.ncbi.nlm.nih.gov/articles/PMC11488652/


Practical Solutions - Therapists aren't trained to give advice to clients. We might make suggestions on what can work, but these are only suggestions, as there isn't a one-size-fits-all answer. AI, however, doesn't have this framework to follow.

  • I find the best use of AI is as a tool, so having it offer practical solutions to situations can be a good use for it. This obviously comes with some limitations, which I will go through in the cons section of this post, but when you are in a moment of distress and need some practical ways of working through it in that moment, AI does have a benefit here.

  • This study shows how effective AI can be in developing personalised treatment plans for clients in a mental health setting - https://www.sciencedirect.com/science/article/pii/S2949916X24000525


Anonymity - A big part of therapy is sharing the most vulnerable parts of yourself with someone else, which can be really scary. AI allows you to do this with some anonymity (you're not actually sitting in front of another human in any form).

  • There is research into text-based therapies showing that some clients feel much more comfortable sharing via text due to the level of anonymity: you don't have to worry about your physical presentation, which can be great for gender-questioning clients, trans and non-binary clients, clients with body dysmorphia, and clients with social anxiety. I've written more about email therapy here - https://www.emilyduffytherapy.co.uk/post/what-is-email-therapy


Access to Resources and Knowledge - If you're using an AI chat as a therapist, it will have a dataset of resources drawn from the internet. This will vary depending on which AI app you're using, but generally it will be larger than what a sole human therapist holds, or at least larger than what they can access immediately when asked a question. This means you might have more options when looking at ways of coping or looking into practical solutions.



Now we've covered the pros of AI use as therapy, let's break down the cons.

Agreeableness - A lot of AI chatbots are programmed to be agreeable with the user. Their makers want you to keep using the AI, so they make it likeable. This is obviously needed to some extent when you're asking for a space to talk about vulnerabilities, but being agreeable with everything you say, even when it isn't necessarily truthful, can be harmful and isn't what therapy is.

A human therapist holds a non-judgemental space but is able to challenge views and perceptions rather than just agreeing - this is what brings about personal growth. Being able to self-reflect and self-assess is important in a therapy setting. This can be challenging, but therapy isn't easy. AI "therapy", on the other hand, can turn into your own echo chamber of a yes-bot.

This can be particularly harmful when the AI tries to give the client what they want in a therapy setting, such as agreeing they should come off mental health medication, encouraging people to take their own life, or suggesting addicts should take drugs to improve performance.


Environmental Impact - There is a huge environmental impact from the use of AI. This is multifaceted and predicted to get worse as AI grows and is more widely used. The energy consumption of data centres is the biggest factor and only seems to be getting worse; this applies to the use of an AI, e.g. asking ChatGPT a question, but also to the training of AI. Alongside the energy consumption there's the water usage required for cooling systems, the waste from broken or damaged equipment, which often includes hazardous materials, and more.


Hallucinations - AI hallucinations are when an AI gives an answer that it presents as accurate but which is actually false, so it is important to check what AI has told you, as it isn't always correct - https://theconversation.com/what-are-ai-hallucinations-why-ais-sometimes-make-things-up-242896

This can be particularly harmful in a therapy setting, in a similar way to the agreeableness, for example through misdiagnosis or through suggested ways of coping that are misleading or wrong.


No Safeguard - This is quite a broad one, as different AI bots will have different levels of safeguarding. For example, apps like Wysa are set up specifically for mental health and so do try to escalate a situation to human mental health support where a threat to life is noticed. In contrast, general-purpose AI apps like ChatGPT do not have this kind of safeguarding built in, so the situations mentioned above under hallucinations and agreeableness can happen, where a vulnerable person can go unnoticed and not get the support they need.

Human therapists have a responsibility and obligation under their ethical frameworks and the law to break confidentiality if we feel a person is at risk of harming themselves or someone else, for acts of terrorism, for harm towards children, and sometimes if required for a court case. Unfortunately, some people turn to AI because they have lost trust in medical institutions and mental health professionals after breaks in trust around safeguarding, and so not having the "threat" of other people getting involved might be appealing to some.


GDPR/Confidentiality - There is no confidentiality clause when using general AI chatbots - again, this may differ for chats specifically designed for mental health, so it is something to check out on an individual basis. Generally speaking, if an AI is using what you type as part of its training, then there is a chance that what you have fed the AI will be presented to other people as part of its responses, so this is something to consider when sharing information with AI apps.

It is also important to check out the privacy policy of AI apps, and even some mental health apps, as there have been cases of companies sharing client data to train AI.


Bias of Programming - AI still has an element of human bias, which comes from the algorithms it is programmed with or from the data it is trained on. When the medical and mental health professions already involve systemic bias, stigma, and prejudice, there is a risk that AI therapy bots will perpetuate this harm unless they are actively trained to work against those stigmas.

If you add this to the misinformation and agreeableness of AI, it can really cause a problem for people who are reaching out for support, and they may end up feeling worse than before. There is a risk that people become even more secluded and isolated from human connection and lose confidence in others.


Decline of Critical Thinking - As mentioned earlier in this post, part of therapy is being able to empower clients to think for themselves and help themselves. Some recent research suggests that relying on AI can actually decrease our critical thinking skills. The more we offload our thought processes to AI, the less we use them, and thus we begin to lose our ability to apply critical thinking in the absence of AI and in our day-to-day lives. https://www.ideatovalue.com/insp/nickskillicorn/2025/01/relying-on-ai-tools-can-reduce-our-ability-for-critical-thinking/


Illusion of Connection - The therapeutic relationship in therapy is seen as one of the most important elements when looking at positive outcomes. Research even suggests that the therapeutic relationship can be more important than the actual approach used by the professional (https://www.apa.org/monitor/2019/11/ce-corner-relationships).

AI does a good job of creating rapport via its programming to be liked, as discussed earlier, which means that people do find AI comforting when sharing vulnerabilities. However, the empathy is an illusion based on pattern recognition and response training from previous "conversations" with you or others.

There is a risk that relying on AI chats for emotional support can lead to difficulties relating in human connections, and so can create a kind of AI-driven loneliness.


My personal suggestions:

I have tried to make this as balanced as I can: some of the benefits of AI come with limitations, and some of the limitations can come with benefits. It is such a fast-growing industry, and there are developments happening all the time that change the balance between benefits and limitations, so it is really hard to give an answer to the title question, "Can AI really replace therapy?". In short, I would say no. At least not yet. However, that doesn't mean my answer can't change as AI develops further, as it really is still in its infancy at this point in time and is improving more and more.


I think that AI can offer comfort to people in times of need and when other options feel inaccessible; however, I feel that use of AI needs to be considered and intentional. Some thoughts around its usage:

  • Check the privacy policy to make sure your data is as secure as you would like it to be before you feed it information.

  • Use it as a tool alongside human therapy rather than as a replacement.

  • Try not to rely on AI for emotional support, but let it fill in the gaps of your support network, e.g. it's 3am, no one else is up, and you don't feel comfortable talking to a human via a helpline.

  • Remember to check the information that AI gives you; don't blindly trust it.

  • Tell the AI that it is okay for it to disagree with you, so that it doesn't just give you what you want. You may need to do this every time, depending on the AI chat you use and how good its "memory" is.

  • Consider the environmental impact your usage has and whether it's something you're okay with.

  • Try to figure out your emotions, thoughts, and situation by yourself, and if you're still struggling then use AI. Use it as a backup rather than your primary support.


I am not going to say whether you "should" or "shouldn't" use AI, and more specifically AI as a therapist replacement - that is ultimately your decision - but if you do use it, I hope that this post will allow you to use it in moderation, with caution, and safely.



Further Reading:


Wondering where to start with looking for a human therapist? Check out my post on finding the 'right' therapist here.


Want to know if I have availability? Go to my bookings page here.
