Why AI Will Never Replace Human Psychotherapy
There's a lot of conversation at the moment about artificial intelligence and mental health. Apps that offer CBT-based exercises. Chatbots that check in on your mood. Platforms that promise therapeutic support at any hour of the day, with no waiting list and no awkward silences.
And I understand the appeal. Access to mental health support in Ireland is genuinely difficult. Waiting lists are long, private therapy is expensive, and there are parts of the country where a local therapist isn't available. If technology can close some of that gap, that matters.
But I want to be honest about what AI can and cannot do. Because I think the conversation is moving faster than our understanding of it, and some of what's being marketed as therapy isn't therapy at all.
What therapy actually is
This is worth starting with, because it gets lost in the noise.
Psychotherapy is not primarily about techniques, information or the right sequence of questions. It's a specific kind of relationship, one with boundaries and a clear clinical purpose, but a relationship nonetheless. And what happens inside that relationship, the experience of being genuinely met by another person, of having your full story witnessed without judgment, of feeling safe enough to say the thing you've never said out loud before, is the core part of the process.
The research on this is consistent across decades. The single strongest predictor of therapeutic outcome isn't the modality being used; it's the quality of the therapeutic relationship. It's what therapists call the alliance, the sense that you and your therapist are on the same side, working toward the same thing, and that the person across from you actually knows you.
An AI cannot offer that. It can simulate the language of such a relationship, but simulation is not the same thing.
The nervous system knows the difference
This matters particularly in the kind of work I do. When I'm sitting with someone who is processing a traumatic birth, or carrying grief after a pregnancy loss, or struggling to bond with their baby in the way they expected to, what's happening in the room is not just cognitive. It's physiological.
The nervous system is relational. We regulate each other. A calm, attuned human presence can literally help to settle an activated nervous system in a way that words on a screen cannot. This is not a vague or romantic idea. It is grounded in decades of neuroscience, in the work of Stephen Porges on Polyvagal Theory, in what we understand about co-regulation and the social engagement system.
When I notice that someone's breathing has changed, or that they've gone quiet in a particular way, or that something shifted when they mentioned a specific moment in their birth story, I can respond to that in real time, with my full human attention. I can slow down, or sit with the silence, or gently name what I'm noticing. That responsiveness is not something that can be coded.
The particular problem with AI and vulnerability
There's something I find genuinely concerning about the direction some of this technology is taking, and I want to name it directly.
Therapy works because it's safe to be vulnerable in it. That safety is built on several things: the therapist's training and ethical accountability, the legal and professional frameworks that govern the relationship, and the fact that a real human is on the other side, bound by duties of confidentiality and care.
When someone is in real distress, when they're having thoughts of suicide, or disclosing abuse, or spiralling in a way that needs clinical judgment, the response they receive matters enormously. An AI system has no clinical judgment. It has pattern recognition. Those are not the same thing. And the consequences of getting it wrong in those moments are serious.
I'm not saying AI tools have no place at all. Psychoeducation, mood tracking, guided breathing exercises, information about local services: these can be genuinely useful, particularly as a bridge while someone is waiting for a therapist. But they are not therapy, and they should not be presented as such.
What gets lost in translation
There's also something more subtle that I want to name, something that's harder to quantify but that anyone who has been in good therapy will recognise.
Therapy changes you in ways that don't come from information alone. The experience of being truly heard, perhaps for the first time, by someone who is fully present with you, is itself transformative. The experience of saying something shameful and having it received with steadiness rather than horror. The experience of your therapist remembering, weeks later, the name of the person you mentioned in passing, because they were actually listening, and because you matter to them in the specific way that a real human relationship allows.
These are not small things. They are often the things that people carry with them long after therapy has ended.
An AI will remember your data. That is not the same as being remembered.
What grief demands
There's a phrase that has stayed with me: grief demands a witness.
It's not enough to feel it alone. Something about the full weight of grief, whether that's grief after a pregnancy loss, grief around a birth that didn't go as hoped, grief about the person you were before you became a parent, requires another person to be present with it. The catharsis, the release that we're often reaching for when we finally allow ourselves to feel something fully, doesn't happen in a vacuum; it happens in relationship. In front of someone else who can hold it with you without flinching, without trying to fix it, without looking away.
I've sat with people who have carried something for years, who have cried alone in the shower, who have held it together in front of their partners and their families and their colleagues, and who finally, in a therapy session, let it come. And what happens in that moment is not just emotional release. It's something more fundamental. It's the experience of not being alone with the hardest thing you've ever carried.
An AI can generate words of comfort. It cannot be present with your grief. And for grief especially, presence is everything.
A note on the Irish context
In Ireland, we are still building a culture where seeking help for mental health is normalised. We're getting there, slowly, but there's still a strong undercurrent of "sort yourself out" thinking, particularly among older generations and in more rural communities.
I worry that the growth of AI mental health tools could reinforce that tendency, giving people something that looks like support without the substance, and making it easier to avoid the real thing. Real therapy asks something of you. It's not always comfortable. But that discomfort is often where the most important work happens.
What human therapy offers that nothing else can
To put it plainly: a skilled therapist brings their full humanity to the work. Their training, yes, but also their intuition, their warmth, their capacity to sit with uncertainty, their ability to hold hope for you when you can't hold it yourself.
They bring the accumulated experience of every client they've ever sat with, every moment of human suffering and resilience they've witnessed, in a way that informs their clinical judgment without ever reducing you to a pattern or a category.
They bring their own humanity, carefully held in the background, but present. And that presence, quiet and consistent and real, is one of the things that makes the work work.
A final thought
I use technology in my practice every day. All my sessions are online via Zoom, and I'm genuinely grateful for what that makes possible, particularly for new parents who can't easily leave the house, or for people in rural Ireland who would otherwise have no local access to mental health resources.
Technology as a vehicle for therapy is not the same as technology as a replacement for it. The distinction matters, and I think it's worth saying clearly.
If you've been relying on an app to manage something that feels bigger than an app can hold, that's worth paying attention to. Real support is available.