Can AI replace therapists? A closer look at mental health chatbots

Jul 12, 2024 | Education, Treatment


As chatbots like Earkick or Woebot gain popularity, mental health professionals are divided—some see them as a solution to the therapist shortage, while others warn they could undermine the core of traditional therapy. The truth, as usual, is likely somewhere in between.

A recent poll showed that 80% of users found ChatGPT to be a viable alternative to traditional therapy. But is this truly the case? Therapy has always relied on human connection—trust, empathy, and emotional understanding. As we consider whether AI chatbots could—or should—take on the role of therapists, it’s essential to weigh their potential against the unique human touch therapists provide. While AI offers promising advancements in mental health care, it also raises significant ethical and technical challenges that require careful consideration.

The evolution of mental health chatbots

Mental health chatbots are not a new concept. Over half a century ago, an MIT computer scientist developed ELIZA, a basic programme that mimicked the responses of a psychotherapist. Since then, the push to create digital therapy alternatives has gained momentum. Today, a growing number of AI-powered chatbots are dedicated to supporting mental wellness in various capacities, such as screening patients with standard questionnaires. For instance, the UK’s NHS uses Limbic to diagnose certain mental health conditions, while the WHO’s digital health worker, Sarah, offers automated counselling, allowing users to engage in cognitive behavioural therapy sessions with an AI chatbot.

(How) Does it work?

An AI chatbot engages with patients by conversing about their experiences and emotions, often suggesting exercises to try between “sessions”. The effectiveness of these tools isn’t just hypothetical. Research shows that these chatbots can substantially reduce symptoms of depression and distress, at least in the short term. In one study, AI analysed over 20 million text conversations from actual counselling sessions, successfully predicting patient satisfaction and clinical outcomes. Additionally, it has been demonstrated that AI can detect early signs of major depressive disorder by analysing subtle cues like facial expressions during routine phone use and typing patterns.
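
As a concrete illustration of this kind of check-in flow, here is a minimal sketch in Python of a bot that scores a standard two-item depression screener (PHQ-2) and keeps a simple mood log. The item wording and the cutoff of 3 follow the published PHQ-2; everything else (the class and function names, the canned replies, the escalation logic) is an assumption for illustration only and not how Woebot, Earkick, or any other app actually works.

```python
# Hypothetical sketch of a mental-health check-in bot.
# The PHQ-2 items and cutoff are standard; all other names and logic are illustrative.

from dataclasses import dataclass, field
from datetime import date

# The two PHQ-2 items, each answered 0-3 ("not at all" .. "nearly every day").
PHQ2_ITEMS = [
    "Over the last two weeks, how often have you had little interest "
    "or pleasure in doing things? (0-3)",
    "Over the last two weeks, how often have you felt down, depressed, "
    "or hopeless? (0-3)",
]
PHQ2_CUTOFF = 3  # a total of 3 or more is the conventional signal for further screening


@dataclass
class CheckIn:
    """One self-reported check-in: screening score plus a free-text note."""
    day: date
    phq2_score: int
    note: str


@dataclass
class MoodLog:
    """Accumulates check-ins so the user (or a clinician) can review trends."""
    entries: list[CheckIn] = field(default_factory=list)

    def add(self, entry: CheckIn) -> None:
        self.entries.append(entry)

    def needs_follow_up(self) -> bool:
        # Flag the log if the most recent screen crossed the cutoff.
        return bool(self.entries) and self.entries[-1].phq2_score >= PHQ2_CUTOFF


def run_check_in(answers: list[int], note: str, log: MoodLog) -> str:
    """Score one check-in, store it, and return the bot's (canned) reply."""
    score = sum(answers)
    log.add(CheckIn(day=date.today(), phq2_score=score, note=note))
    if score >= PHQ2_CUTOFF:
        # A real service would escalate to a human at this point.
        return ("Thanks for sharing. Your answers suggest it could help to "
                "talk to a professional. Would you like some options?")
    return "Thanks for checking in. Want to try a short breathing exercise?"


if __name__ == "__main__":
    log = MoodLog()
    for question in PHQ2_ITEMS:
        print("Bot:", question)
    # Answers would come from the user; hard-coded here for the demo.
    print("Bot:", run_check_in(answers=[2, 3], note="rough week", log=log))
    print("Escalate to human support:", log.needs_follow_up())
```

Even in this toy version, the important design choice is visible: the bot does not interpret the score itself beyond a fixed threshold, it simply flags when a human should take over.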

Bridging the gap

Could digital platforms serve as a valuable solution to the current gaps in mental healthcare? The potential is certainly there. AI chatbots can reach a much broader population, especially as the availability and accessibility of mental health centres continue to dwindle. Additionally, by handling administrative tasks such as monitoring mood or generating reports, they could free up mental health professionals to focus more on patient care. Moreover, by offering a stigma-free and cost-effective way to access support, these chatbots can make mental health care more accessible to those who might otherwise shy away from reaching out.
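
To give a sense of what the report-generating side might look like, below is a small hypothetical sketch that condenses a week of self-reported mood ratings into a plain-text summary a clinician could skim before a session. The 1-10 mood scale, the function name, and the output format are illustrative assumptions, not any specific product’s behaviour.

```python
# Hypothetical sketch of the "administrative" side: summarising a week of
# self-reported mood ratings for a clinician. Scale and format are assumed.

from statistics import mean


def weekly_report(patient_id: str, mood_scores: list[int]) -> str:
    """Summarise one week of daily mood ratings (1 = very low, 10 = very good)."""
    if not mood_scores:
        return f"Patient {patient_id}: no check-ins recorded this week."
    lines = [
        f"Patient {patient_id}: {len(mood_scores)} check-ins this week.",
        f"Average mood: {mean(mood_scores):.1f}/10 "
        f"(lowest {min(mood_scores)}, highest {max(mood_scores)}).",
    ]
    if min(mood_scores) <= 3:
        lines.append("Note: at least one very low rating; worth raising in session.")
    return "\n".join(lines)


if __name__ == "__main__":
    print(weekly_report("A-102", [6, 5, 4, 3, 5, 6, 7]))
```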

Where technology falls short

The aspiration – or fear – that AI might one day replace therapists overlooks the core of what makes therapy effective: the human connection. While chatbots can process vast amounts of data and generate text quickly, they fall short in this vital area of effective psychotherapy. Human interaction is vital for learning how to relate to others, and repairing ruptures in relationships requires another person to guide that experience. AI, despite its ability to mimic conversational patterns, lacks the nuanced understanding and empathetic connection that human therapists provide. Traditional therapy is not just about applying techniques or algorithms – it is about creating a space for personal growth, self-exploration, and deep emotional rapport. Moreover, a significant portion of human communication is nonverbal: body language, tone of voice, and facial expressions. Without the ability to read and respond to these subtle cues, AI cannot reliably assess a patient’s emotional state and respond appropriately.

The role of trust, bond, and connection in therapy

Human connection is a fundamental need, especially in a world where loneliness has been declared a global health threat. While it’s possible to connect with a chatbot, the quality of that connection is questionable. Interestingly, research shows that chatbots do offer a sense of anonymity and confidentiality that may encourage users who are hesitant to seek in-person help to open up. However, trust is only one piece of the puzzle. The therapeutic bond, defined by trust and rapport, is critical for effective treatment. It is built on reliability, consistency, and a deep commitment to the client’s well-being, and can take months or even years to fully develop. Constrained by algorithms, AI might offer a surface-level interaction, but it lacks the ability to form the deep connection essential for true therapeutic growth.

The risk of inadequate and harmful advice from AI chatbots

AI chatbots, while increasingly sophisticated, are still far from providing the reliable guidance required in clinical settings where trust and accuracy are paramount. A significant concern is their potential to offer misguided or even harmful advice, such as suggesting that someone abandon their treatment or, in more alarming cases, endorsing self-harm. A stark example of these risks came to light when the US National Eating Disorders Association (NEDA) replaced its human-staffed helpline with a chatbot named Tessa. Relying on scripted responses, Tessa provided dangerous advice, including weight-loss tips to users vulnerable to eating disorders.

Data safety and regulatory concerns: a persistent challenge

What happens in the therapy room stays there—but can the same be said for chatrooms? Despite decades of constant technical progress, the industry still operates in a regulatory gray area, with governments struggling to catch up. The public remains largely unaware of how tech companies collect and use the data fed into chatbots, raising serious concerns about potential breaches of confidentiality. Moreover, many chatbots claim to incorporate therapeutic techniques like CBT and mood assessment tools, labelling themselves as “clinically safe” mental health supports. However, these marketing tactics are often misleading, as most of these apps have not been approved for medical use. This exploitation of users’ trust in the healthcare system encourages them to share personal and medical information, which could then be used for purposes beyond therapy, such as being sold or reported to authorities. Since chatbots are not classified as medical devices, they are not bound by the same confidentiality rules that govern healthcare professionals.

Conclusion: forming a digital therapeutic alliance

The debate over AI chatbots in mental health care continues but may be moving toward common ground. Tech developers argue that their tools are meant to complement, not replace, human therapists. Meanwhile, many mental health professionals recognise that AI apps can be a valuable first step in a person’s mental health journey, offering support, monitoring, and skill-building.

The future of therapy may lie in a blended approach. At least until data safety and regulatory concerns are fully addressed, AI’s primary role will likely focus on supporting tasks such as:

  • Accessible support, available 24/7, overcoming financial, societal, and logistical barriers.
  • Routine check-ins to provide consistent support and monitor mental health.
  • Assisting psychiatrists in selecting the right medications based on patient data.
  • Translating conversations in real-time for non-native English speakers.

However, AI chatbots cannot replace real therapists in areas like:

  • Addressing complex mental health issues, where expertise and deep understanding are essential.
  • Personalising treatment based on an individual’s unique background and experiences.
  • Offering genuine empathy and emotional connection.
  • Managing crisis situations requiring immediate, nuanced responses.
  • Adapting dynamically to the flow of a session based on subtle cues.

Barbara Thoma

Originally trained as a lawyer, Barbara transitioned into the mental health services sector, where she has built extensive experience over the past decade. Leveraging her analytical skills and attention to detail, she has provided expert guidance, consulted on mental health programmes, conducted workshops, and analysed policies to improve client outcomes. Concurrently, she excels in corporate communication, copywriting, translation, and editing, offering multilingual services in German, English, Spanish, Korean, and Italian. As a freelance communication expert, she works with prestigious mental health institutions and other renowned international organisations.
