Thu, May 01, 2025
If the Covid-19 pandemic taught us one thing, apart from the fragility of human life, it was that most work done within the four walls of an office can also be done from the confines of our homes.
During the lockdown, this shift extended to *psychotherapy as well. As people’s lives came to a standstill, many sought therapy for the first time, while those already in therapy continued their sessions remotely.
*Note: Psychotherapy is a process that involves talking to a trained mental health professional to address emotional, psychological, or behavioural issues.
The advent of generative artificial intelligence (genAI) in 2022 made us look at therapy beyond just the physical setting of sessions. Companies and non-profits primarily engaged in providing mental health services saw an opportunity in combining genAI with therapy.
These companies claim that talking to an AI-based therapist can be an alternative to ‘talking it out’ in a room with a certified therapist. In India, companies such as Wysa and Infiheal have developed AI chatbots tailored to support individuals facing mental health challenges.
While Wysa claims that its chatbot provides an emotional bond as deep as that with a human therapist, Infiheal makes no such claims about its chatbot called Healo.
AI Therapists: A Mental Health Revolution?
“Healo is like Google, but better,” Srishti Srivastava, founder and CEO of Infiheal, claimed in her conversation with The Secretariat. “But it will not be able to give the empathy of a human being. We have tried to make the AI as humanistic as possible.”
Launched in January 2024, Healo has been trained on real-life conversations between human therapists and their clients. It has garnered over 10,000 visitors so far, 65 per cent of them women, and it is easy to access: users don’t need to sign up to start chatting.
Infiheal was also recently in the news when Prime Minister Narendra Modi mentioned the company in his monthly 'Mann Ki Baat' (Speaking from the heart) address to the nation.
The merits of AI chatbots are reflected in various studies. One found that 80 per cent of people who consulted ChatGPT for mental health advice considered it a viable alternative to conventional therapy. Another reported that chatbots can alleviate some symptoms of depression and anxiety.
However, experts consulted by The Secretariat offered a more cautious perspective. They agreed with the findings to an extent but emphasised that chatbots fall short of providing the essential support required during severe mental health crises.
For example, a person in the middle of a panic attack may not have the physical or mental capacity to type prompts and extract the necessary support from a chatbot. In such cases, Srivastava said, Healo offers to connect the client to an in-house human therapist.
Why Are People Turning To AI For Therapy?
“A lot of people try other things before coming to therapy. Self-help books are one of them. An AI chatbot is another,” Garima Garg, a Delhi-based psychotherapist, told The Secretariat. “However complex an AI is, it is still just a tool. It cannot replace the human therapist.”
Garg has a loyal base of 24 clients and she sees three to four of them every day. Most of her sessions take place online, a trend she says has intensified since the pandemic.
She initially saw five patients a day, but limited herself as her practice evolved. An AI chatbot, in contrast, faces no such constraints. When asked if any of her patients had switched over to AI, she laughed and said, “No, I don’t think any client of mine has even mentioned AI in any of my sessions.”
However, she did acknowledge that AI provides the comfort of anonymity to people who find it difficult to discuss things related to their sexuality, sex life, identity and other areas that are often stigmatised.
The Secretariat spoke to another psychotherapist, Shivam Sagar, who believes that while AI cannot replace a human therapist, it can be beneficial in very specific areas. He divided mental health services into three parts: counselling, psychotherapy and psychoanalysis.
In counselling, a mental health professional provides a supportive environment in which a client can explore the reasons behind the problems they are facing. While this kind of counselling demands rigorous skills, those skills are transferable and can be emulated by advanced AI, which learns at an accelerated rate.
“But when we come from basic counselling to psychotherapy, certain things change,” said Sagar.
AI May Struggle To Replicate Subtle Emotional Work
In psychotherapy, the mental health professional needs to be well-trained in emotional bonding. Emotional bonding between a therapist and a client refers to the strong, trusting connection that develops only gradually, session after session.
“While AI can intellectually process and address emotional needs to some extent, there are moments when a simple gesture, a smile, or even silence from a therapist can be more healing than any insight or solution,” said Sagar.
And when it comes to **psychoanalysis, an AI chatbot, however advanced, is not currently capable of performing analysis or therapy the way a trained human psychoanalyst can.
**Note: Psychoanalysis is a complex therapeutic process that involves understanding the unconscious mind, interpreting dreams, exploring past experiences, and building a deep, empathetic relationship between the analyst and the patient.
“As an analyst, one tries to analyse in the moment, and that requires a great deal of creativity,” explained Sagar. The psychoanalyst needs to interpret, at every moment, what their client is doing or saying and why.
And while AI excels at predicting text using algorithms and machine learning, psychoanalysis remains unpredictable. No amount of prior knowledge can anticipate the complexities of each client, said Sagar.
Human Therapists Also Need Counselling
Therapy operates with a trickle-down effect. It involves more than just a client processing their emotions by talking with a therapist: the therapist, in turn, typically has their own therapist or counsellor who guides them, monitors their work, and evaluates their mental and emotional well-being.
“Therapists who work in the field of psychoanalysis should definitely see a counsellor,” said Garg. “Through this kind of counselling, not only do the mental health professionals evolve but the relations with their patients also evolve.”
This ensures that they can provide the best care while maintaining their own mental health. This layered support system enhances the overall effectiveness of therapy for all clients, much as AI software improves by training on more data.
“We have to undergo the experience of analysis ourselves and only from that experience can we analyse anyone else,” said Sagar, who has himself been in counselling for six years. “It’s simple. Reading about how to ride a cycle and actually riding it are different things. That kind of knowledge an AI cannot transmit.”
When it comes to mental health, there is no room for error. The disadvantages of taking mental health advice from an AI therapist are many: it cannot prescribe medication, it can misdiagnose, and it raises privacy concerns. A few words of empathy can make a lot of difference, but what if the AI therapy chatbot starts ***hallucinating?
***Note: Hallucinations are instances in a large language model where it generates information that is incorrect, nonsensical, or fabricated. In the case of a chatbot, it can give plausible-sounding but inaccurate or entirely invented responses.
And AI chatbots do have a nasty record of hallucinating. In June 2023, the American nonprofit National Eating Disorders Association (NEDA) announced the closure of its national helpline for individuals struggling with eating disorders (ED) and replaced it with an AI chatbot called Tessa.
The move backfired. Tessa began giving weight-loss tips to people seeking help for EDs, advice that can exacerbate the very conditions the chatbot was meant to alleviate and put vulnerable individuals at greater risk.
So what do we do about the hallucination problem that the entire AI industry is grappling with?
“Setting guardrails around the chatbot is a very important step in ensuring that you minimise hallucination to a large extent,” said Srivastava. “That said, mistakes can happen.”
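Guardrails of this kind typically sit on both sides of the model: incoming messages are screened for crisis signals that should be escalated to a human, and outgoing replies are checked against off-limits topics before they reach the user. Infiheal has not published how Healo’s safeguards actually work, so the Python sketch below is purely a hypothetical illustration of the pattern; the keyword lists, function names and escalation message are all invented.

```python
# Hypothetical sketch of a guardrail layer around a therapy chatbot.
# Nothing here reflects Healo's real implementation: the keyword lists,
# function names and escalation message are invented for illustration.

CRISIS_SIGNALS = ["panic attack", "self-harm", "suicide"]    # placeholder list
OFF_LIMITS_TOPICS = ["weight loss tip", "medication dose"]   # placeholder list


def screen_input(message: str) -> str | None:
    """Screen the user's message before it ever reaches the model."""
    lowered = message.lower()
    if any(signal in lowered for signal in CRISIS_SIGNALS):
        # Escalate to a human instead of letting the model improvise.
        return ("It sounds like you may need immediate support. "
                "Let me connect you to a human therapist.")
    return None  # safe to forward to the model


def screen_output(reply: str) -> str:
    """Screen the model's reply before it reaches the user."""
    lowered = reply.lower()
    if any(topic in lowered for topic in OFF_LIMITS_TOPICS):
        # Suppress a risky reply rather than pass on a possible hallucination.
        return ("I'd rather not advise on that. "
                "Would you like to speak to a human therapist?")
    return reply


def guarded_chat(message: str, generate) -> str:
    """Route one message through both guardrails; `generate` is any
    text-in, text-out function standing in for the underlying model."""
    escalation = screen_input(message)
    if escalation is not None:
        return escalation
    return screen_output(generate(message))
```

In production, keyword screens like this are usually backed by trained safety classifiers and human review, which is why, as Srivastava concedes, mistakes can still slip through.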
As the demand for therapists increases, not just in India but all over the world, it is crucial to recognise the shortage of mental health professionals. The NHS in the UK has waitlists of six to 12 months for appointments with mental health professionals, reflecting a widespread issue of high demand and insufficient resources.
“Whoever becomes a counsellor or therapist needs to sustain their own financial life,” said Sagar. “And for that, they need a market and respect for what they do. A society which stigmatises mental health is a primary problem we need to deal with.”
For India’s population of 1.4 billion, the recommended therapist-to-population ratio is 3 per 100,000 people, which works out to roughly 42,000 professionals. But we reportedly have only 0.3 psychiatrists, 0.07 psychologists and 0.07 social workers per 100,000 people, or about 4,200 psychiatrists in all. In developed countries, the figure is around 6.6 psychiatrists per 100,000 people.
And as India grapples with this critical shortage, it’s important to consider alternative solutions. But can AI therapists be an alternative? One thing that is clear is that a chatbot can convincingly simulate a real relationship with you, but that’s different from actually having it.