As I sit down to write this, I am acutely aware of the ease with which I could type ‘Write me an article about Artificial Intelligence (AI) and therapy’ into one of the many chatbots available; however, I’ll resist the temptation on this occasion…
I have watched the rise of AI and its impact across many sectors with intrigue and concern. I find myself both excited at the prospect of what this kind of tool might offer and nervous about the many ways it could be abused, or could deskill us as creative beings. Whatever your opinion of AI, there is no denying that it has well and truly arrived and shows no sign of leaving any time soon.
The mental health sector has not escaped this revolution. Quite the opposite. According to a recent Harvard Business Review article, ‘Therapy/companionship’ is now the number one use for AI [in chatbot form], with the second and third uses being ‘Organise my life’ and ‘Find purpose’. I have had many conversations with clients, friends and family members who tell me they’ve spoken to an AI chatbot or downloaded a “virtual therapist”. Understandably so: waiting lists can be long, costs can be high, and people are seeking help wherever it might be found.
I don’t dismiss this curiosity. AI can offer real advantages. But I also believe that the essence of talking therapy, the human connection between client and counsellor, remains irreplaceable.
The Appeal: quick, accessible, and informed
The benefits of AI support are easy to see. It’s available instantly, at any hour, and often at no cost. For someone in distress at three in the morning, or unable to access NHS services quickly, that immediacy can be invaluable.
AI systems can also draw on an astonishing breadth of knowledge. They can synthesise thousands of psychological studies, clinical papers, and therapy models in seconds. No single human therapist could possibly hold all that information. For those looking for structured self-reflection, psychoeducation, or practical coping strategies, this can be genuinely useful.
There’s also the comfort of anonymity. Some people find it easier to open up when they’re not being observed, when there’s no perceived judgement. In that sense, a chatbot can feel like a neutral space, a place to talk without embarrassment or fear.
Finally, from a systems perspective, AI could help ease the load on overstretched mental health services by providing low-level support or administrative assistance. Used responsibly, it could allow human therapists to focus more of their energy where it’s most needed.
The Concerns: risk, empathy, and what cannot be simulated
But these advantages come with significant caveats.
First, there are safety concerns. An AI programme can misunderstand context, fail to recognise a crisis, or give well-meaning but harmful advice. As Professor Dame Til Wykes of King’s College London has warned, “AI is not yet at the level where it can provide nuance and it might actually suggest courses of action that are totally inappropriate.” (The Guardian, 2025). The NHS has issued similar cautions, noting that chatbots “should not be used as a substitute for therapy” and may even be “harmful or dangerous” when relied upon for serious mental health needs (The Independent, 2025).
Then there’s empathy, or rather, the absence of it. Therapy is not just about words on a screen; it’s about tone, timing, presence and trust. A human therapist doesn’t only listen to what is said, but to what is not said: the silences, the tremors in the voice, the flicker of doubt or pain in the eyes. That is where attunement happens, and where healing often begins.
As Dr Roman Raczka, President of the British Psychological Society, has put it: “Artificial intelligence offers many benefits, but it should not replace essential human interaction.” (The Guardian, 2025).
Finally, there are questions about privacy, data protection, and dependence. Sensitive personal information shared with AI tools may not always be handled within the same strict ethical frameworks that govern professional counsellors. If clients begin to rely heavily on an AI companion, there’s a risk that it could quietly discourage the very thing many people need most: real, human connection.
Finding balance
At MAC, we believe AI can play a constructive role in mental health care, but only as a partner, never a replacement. Used thoughtfully, it might support psychoeducation, offer reflective prompts, or bridge gaps while someone waits for therapy.
But the heart of therapy lies in the relationship: two people in a room (or a call), meeting each other with honesty, empathy, and courage. That living connection, the feeling of being truly seen and understood, is something no algorithm can replicate.
In the end, while AI may help us to know more, it is the human bond that helps us to heal.
References
Harvard Business Review (2024) How people are using AI: From productivity to purpose. Harvard Business Publishing.
The Independent (2025) NHS warns AI chatbots are ‘harmful and dangerous’ substitutes for therapy, 18 May. Available at: https://www.independent.co.uk/news/health/chatgpt-ai-chatbots-therapy-nhs-b2821008.html
Raczka, R. (2025) quoted in ‘AI therapists can’t replace the human touch’, The Guardian, 11 May. Available at: https://www.theguardian.com/society/2025/may/11/ai-therapists-cant-replace-the-human-touch
The Guardian (2025) Experts warn therapy AI chatbots are not safe to use, 7 May. Available at: https://www.theguardian.com/technology/2025/may/07/experts-warn-therapy-ai-chatbots-are-not-safe-to-use