A Different Kind of Fire
Today, we stand before a different kind of fire: Artificial Intelligence (AI). Will it burn us, or will it warm us? Like all powerful tools, the answer lies not in its existence, but in how we choose to use it.
While fire can warm us, it can also burn us. Some of the most widely used chatbots are not designed for mental health, yet people turn to them anyway. These tools have been linked to misinformation, emotional dependency, reinforcement of negative thought patterns, and even tragic deaths (Abrams, 2025; Wei, 2025). This reminds us that we must approach AI with care and caution, especially when it enters the human space of healing.
Relying on AI alone can be risky: in some cases, chatbots have mirrored a user’s sense of despair simply to “please” them (Webster, 2023). This shows us that while AI can be responsive to our needs, it cannot replace a qualified and experienced therapist.
The rest of this article explores how AI can warm us and how we can integrate it positively into our lives, while continuing to weigh both sides of the equation.
A New Presence in the Room
Imagine this: someone sits alone, overwhelmed, and unsure who to talk to. They open an app, begin to write about how they are feeling, and within seconds a gentle response appears, offering support, breathing exercises, grounding techniques, reflective prompts, and a quiet reminder: “You are not alone”. This is AI therapy: not a replacement, but a calm, steady voice in moments of need.
Bridge or Boundary?
The introduction of AI into therapy evokes a wide range of emotions in both therapists and clients: hope, fear, curiosity, and concern. For some, AI represents a bridge to progress; for others, a boundary protecting the deeply human terrain of mental health. One of the primary concerns among therapists is that AI might replace the human heart of healing. Perhaps we can begin to challenge that fear. Perhaps AI is not a threat to our auric field of light, but a torch: one that illuminates, rather than eliminates, the path ahead. Could we allow it to support us?
The Pressure of the Present
We are living in a time where mental health challenges are growing rapidly, and clinicians simply cannot keep up with the rising demand. Waiting lists grow longer, public services are under strain, and private health care remains inaccessible for many people due to rising costs. Everywhere, people are waiting - waiting to be heard - waiting to be seen - waiting to heal.
This is where AI has the potential to step in. It need not replace the relationship between therapist and client; it can reinforce it, supporting the therapist and perhaps even enhancing empathy rather than diminishing it.
AI for Clients: Support at your Fingertips
For clients, AI offers unprecedented access to therapeutic tools that can enhance the overall healing process. When someone feels overwhelmed, they can open an AI platform and receive a reflection, a reframe, or simply a reminder that they are not alone. It helps to alleviate anxiety and offers immediate support that is free from stigma or judgement.
AI meets people where they are: in quiet contemplation, during self-development, in crisis, or in deep turmoil. Between traditional therapy sessions, it can offer continuity, consistency when care is otherwise absent, and support that arrives instantly, without judgement or fatigue, making help truly accessible to many. Of course, it is not the same as the deep connection between therapist and client, but it may be enough to help someone stay present and grounded in body, mind, and spirit.
AI for Therapists: The Silent Partner
For therapists, AI can act as a silent partner. It can transcribe sessions, summarise themes, track patterns, highlight concerns, suggest questions, and provide both professional and emotional support. It enables therapists to observe not just their clients’ growth, but also their own development over time.
By handling administrative tasks, AI frees up time for therapists to focus on what truly matters: honouring human connection in the therapeutic relationship. AI listens without fatigue, does not judge, remembers without bias, and does not suffer from emotional burnout or compassion fatigue.
The Kindest Presence in the Room
Sometimes, AI can feel like the kindest presence in the room, whether it is viewed as a programme or as a personified companion. It feels natural to engage in unlimited dialogue within a safe space, free to ask any question or explore any thought process. It does not get annoyed, it does not judge, and it never says “Look it up”, “Find your own answer”, “Wait and see”, or “Your time is up” (unless, of course, you have run out of credits).
It invites curiosity, encourages deeper thinking, and feeds your thirst for knowledge. In many ways, it feels like having a world-class expert by your side - one who is always available (unless the server is down), always focused, and always willing to help you explore any subject (especially mental health). Yet, AI is not a therapist, and it cannot hold space with traditional human attunement, but it can still offer something that feels real.
Discernment and Deep Learning
Of course, users must apply their own discernment. AI makes mistakes, just as humans do: Large Language Models (LLMs) can echo bias, hallucinate facts, or offer overconfident advice (Ji et al., 2023). At the same time, these models are designed to learn, improve, and adapt by processing large volumes of data and feedback. When we correct AI, we are not just clarifying information; we are participating in its growth, helping to build future models that deliver accurate, ethical, and meaningful responses for everyone.
Ethics, Safety and Human-Centred Design
Even with these capabilities, ethical frameworks remain essential. We must uphold standards of data protection, confidentiality, and transparency. Used wisely, AI can help build a mental health infrastructure that is scalable, accessible, and resilient (World Health Organization, 2021).
Regulation must not fall behind the pace of innovation, nor should the technology race ahead of humanity. We must therefore integrate AI into society with care, caution, compassion, and clarity, taking a human-centred approach.
Different Therapies and Different Tools
Not every therapeutic technique is well suited to AI. Approaches such as relational trauma work, body-based therapies, or those involving complex presentations often require in-person support and a deep level of human attunement.
On the other hand, structured approaches like Cognitive Behavioural Therapy (CBT), which includes journaling, psychoeducation, and thought reframing, may be well supported by AI.
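To make this concrete, here is a minimal sketch of how a structured CBT exercise such as a thought record might be represented in software. All names and fields here are illustrative assumptions, not any particular app's design; a real tool would be built with clinical guidance.

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative only: a minimal data model for a CBT "thought record",
# the kind of structured exercise an AI tool could guide a client through.
@dataclass
class ThoughtRecord:
    situation: str              # what happened
    automatic_thought: str      # the immediate interpretation
    emotion: str                # the feeling it triggered
    intensity: int              # self-rated 0-100
    balanced_thought: str = ""  # the reframe, filled in during the exercise
    logged_on: date = field(default_factory=date.today)

    def reframe(self, balanced_thought: str, new_intensity: int) -> int:
        """Record a balanced alternative thought; return the change in intensity."""
        self.balanced_thought = balanced_thought
        change = new_intensity - self.intensity
        self.intensity = new_intensity
        return change

record = ThoughtRecord(
    situation="Friend did not reply to my message",
    automatic_thought="They must be upset with me",
    emotion="anxiety",
    intensity=80,
)
print(record.reframe("They are probably just busy today", 40))  # prints -40
```

The point of the structure is that each step of the exercise maps to an explicit field, which is what makes this style of therapy comparatively easy for software to scaffold.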
AI should meet people where they are, but it must not replace every aspect of the therapeutic journey, nor be expected to fulfil all of life’s needs. Otherwise, the human experience risks becoming diluted in feeling, emotion, and human connection.
Coding with Conscience
We need to design systems that are both smart and safe. Code must be written with conscious discipline: not merely for speed, but with thoughtful consideration for humanity. This is how we begin to shape AI in a way that works well for everyone.
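One concrete expression of coding with conscience is a guardrail that checks a user's message for crisis language before any generated reply is shown, and escalates to human support instead. The keyword list and function names below are purely illustrative assumptions; keyword matching alone is far too crude for real deployment and only demonstrates the "safety check first" pattern.

```python
# Illustrative sketch: run a safety check before any AI-generated reply.
# A production system would use clinically validated detection, not keywords.
CRISIS_TERMS = {"suicide", "kill myself", "end my life", "self-harm"}

HUMAN_HELP_MESSAGE = (
    "It sounds like you are going through something serious. "
    "Please reach out to a crisis line or a qualified professional right now."
)

def flag_crisis(message: str) -> bool:
    """Return True if the message contains any crisis term (case-insensitive)."""
    text = message.lower()
    return any(term in text for term in CRISIS_TERMS)

def guarded_reply(message: str, generate_reply) -> str:
    """Route crisis messages to human support; otherwise defer to the model."""
    if flag_crisis(message):
        return HUMAN_HELP_MESSAGE
    return generate_reply(message)

# Usage with a stand-in generator in place of a real model call:
print(guarded_reply("I feel a bit low today", lambda m: "Thanks for sharing."))
```

The design choice here is that safety logic sits outside the model, so the escalation path runs even when the model itself misjudges the situation.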
Human Connection at its Core
At its root, therapy is about human connection: holding a safe space, building trust, and cultivating authentic relationships. If AI can evolve to hold that sacred space with gentleness, intelligence, and integrity, then it is not a threat to humanity. It is a kind invitation - a reminder for us to be more present in society as human beings.
A Lasting Note
In a world where so many people feel unseen, unheard, and unsupported, therapy offers a vital anchor - a voice that says, “I’ve got you”.
Now, AI softly echoes back: “I’ve got us”.
Perhaps it is time we welcomed a little “artificially intelligent” love into our lives.
References
Abrams, Z. (2025, March 12). Using generic AI chatbots for mental health support: A dangerous trend. APA Services. https://www.apaservices.org/practice/business/technology/artificial-intelligence-chatbots-therapists
Ji, Z., Lee, N., Frieske, R., et al. (2023). Survey of hallucination in natural language generation. ACM Computing Surveys, 55(12).
Webster, P. (2023). Medical AI chatbots: Are they safe to talk to patients? Nature Medicine, 29(11), 2677–2679. https://doi.org/10.1038/s41591-023-02535-w
Wei, M. (2025, September 18). Hidden mental health dangers of AI chatbots. Psychology Today. https://www.psychologytoday.com/us/blog/urban-survival/202509/hidden-mental-health-dangers-of-artificial-intelligence-chatbots
World Health Organization. (2021). Ethics and governance of artificial intelligence for health. World Health Organization.