‘It will never be an emotional substitute’: Readers on whether AI can replace human therapy
Lauran Ware’s experiment using ChatGPT as a therapist has divided Independent readers, prompting a debate about the role of AI in mental health support.
Many agreed with the article’s central argument: AI cannot replicate human connection. Several readers argued that, without emotions or lived experience, chatbots can offer practical advice but will always fall short of genuine empathy or relational depth. For them, therapy is as much about being understood as it is about problem-solving.
Others, however, highlighted AI’s strengths. Some noted that chatbots can draw on vast bodies of psychological knowledge, sometimes exceeding what individual therapists have read, and can provide useful insights or summaries on demand. A few described using AI as a “sounding board” – valuing its objectivity, privacy and ability to clarify complex thoughts without judgement.
Accessibility emerged as a key theme. Readers pointed out that AI could offer support to those unable to afford therapy or lacking strong personal networks, with potential benefits in preventing crises.
Still, concerns remained. Commenters questioned AI’s ability to ask the right questions, agreeing with Ware that its responses are only ever as good as the prompts it receives.
Here’s what you had to say:
Support for those without access
An interesting article. I’ve never personally used AI chatbots for therapy of any kind, but I can imagine that they might be helpful for some people, especially individuals who may not have the finances to pay for the services of a good therapist or, indeed, a supportive network of family or friends to help them when life becomes especially tough.
If self-harm or indeed suicide, for example, can be prevented in some cases, then perhaps AI chatbots have a positive place in our world.
JanetC
Emotion isn’t always needed
Therapists are human beings, which means they inevitably bring their own histories, assumptions and emotional frameworks into the exchange. AI does not, and for some of us that is a strength rather than a weakness.
I use AI for clarity, privacy, calm and as a smart-sounding sounding board. Crucially, it also provides sources, allowing me to verify facts immediately instead of relying solely on someone else’s interpretation.
When my dog was dying from liver cancer, I used AI alongside my palliative vet and, later, a competent vet nutritionist. It helped me assess nutrition, stool analysis, blood test results, disease progression, and symptoms suggesting increasing discomfort and the need for pain relief and anti-nausea medication. Those points were then confirmed by the vets whose services I had employed.
The writer clearly values human relational depth, and that is fair enough. But not everyone wants emotional involvement on every occasion. Sometimes what is most useful is objectivity, verifiable information and space to think clearly.
I use AI daily for topics as diverse as politics, philosophy, help with editing manuscripts, science, climate breakdown, history, and ideas that have shaped our human and animal world throughout human history. I find AI indispensable, but I check its answers against facts and often ask the same questions across different AIs such as ChatGPT, Gemini, Claude and Le Chat, where differences quickly become apparent. It is a fascinating world.
Fiore2021
Putting thoughts into words can help
Sometimes, just trying to put thoughts and emotions into words can help with issues. Best to erase the words afterwards, though.
badgera
Early days
Properly trained therapists learn to identify their ‘own stuff’. This is central to the relationship between them and their clients.
Clearly, AI lacks this level of personal insight, not just in this but also in how its internal mechanisms lead it to give the responses it does.
Whether or not it is ever able to will almost certainly make the difference going forward… it is early days in the development of these technologies, and they are meant to be capable of learning, so we will – eventually – discover if that learning is enough for them to develop any form of empathy.
MellieC
Novel solutions to thorny problems
Like all IT, AI doesn’t have an amygdala, so it can’t feel. One of the criticisms levelled at men is that we try to fix things rather than just listen without offering any solution, because (unlike politicians) that’s what we’re good at. Like an aeroplane is a poor substitute for a bird, but solves the problem of humans being flightless, AI is never going to be present in the sense that it connects emotionally, but access to a vast amount of data can result in it offering novel solutions to thorny problems. Sometimes that is just what is needed to alleviate a state of distress, but it will never be an emotional substitute, any more than an aeroplane will ever lay an egg.
FreeLife
AI’s breadth of knowledge
LLMs have probably read everything on a subject and draw from a wide range of papers and sources. I can ask it to give me a summary of any psychology book and it will. In the past weeks, I have asked my therapist if they have read the psychology books that I have read and the answer is almost always no. For instance, the book Fear of Intimacy by Firestone and Catlett went out of print a decade ago; my therapist had never heard of it. AI could have a conversation with me about its contents and what it means for people with attachment disorders.
Steams
Asking the right questions is key to good advice
In my field of expertise, and it is probably the same in most, if not all, other fields, people generally don’t know what information is relevant, and quite a big part of my job is asking questions in order to get the relevant information so I can give the correct advice.
ChatGPT seems to be extremely bad at that and just launches straight into the answer with incomplete information about the situation.
katrina
If you are experiencing feelings of distress, or are struggling to cope, you can speak to the Samaritans, in confidence, on 116 123 (UK and ROI), email jo@samaritans.org, or visit the Samaritans website to find details of your nearest branch