
Using ChatGPT As A Therapist Is Risky – And Here’s Why

Research from Sentio University has revealed that an astonishing 63% of individuals surveyed turn to ChatGPT for personal advice. Respondents reported that it benefitted their mental health, with 73% using it for anxiety management, 72% for depression, and 33.3% for trauma-related issues. Here, we take a look at the benefits, risks and long-term implications of turning to chatbots for emotional support. The question is not whether you should be using tools like ChatGPT in moments of emotional difficulty – many people already do, and will continue to do so – but what this ‘new normal’ reveals about how we seek support today, and where the limitations begin to impact wellbeing.

Before the release of ChatGPT in 2022, the notion of confiding in a machine for emotional support existed mainly in sci-fi movies and dark fiction. The dangers of interacting with a non-human entity that can mimic tone, feign empathy, and offer advice are familiar to most with access to Netflix. And yet, ChatGPT receives over 2.6 billion messages per day as of late 2025. Our understanding of communication – the personal expression, the give and take required to build trust – has long been grounded in physical exchanges between friends, family, doctor and patient. But for a growing number of us, conversations with AI are now a feature of everyday life.

Why are we turning to AI for mental health support?

If you are struggling with your mental health, the appeal of opening up to AI is probably not driven by a conscious belief that machines can replace the care of your loved ones. It seems more likely that emotional reliance on AI is driven by societal conditions: many people are time-poor and face global uncertainty, information gaps, and inaccessible or ineffective mental health support. According to a 2025 report by leading UK charity Rethink Mental Illness, people were eight times more likely to wait at least 18 months for mental health treatment than for physical health treatment – with 83% of respondents stating that their mental health worsened during the wait. Under these conditions, an always-available, free, non-judgemental conversational tool is a compelling alternative and can provide real comfort – especially for those in crisis.

Chatbots can help to introduce emotional language and offer structure where there is mental noise, aiding thought organisation and promoting better introspective practices. After all, an ordered system of thoughts is easier to process. Used in this way, chatbots resemble forms of guided reflection that sit closer to meditative practices than to therapy itself. It could therefore be argued that using ChatGPT becomes beneficial when paired with clinically backed treatment, similar to methods of self-regulation that have been practised for centuries.

AI-powered chatbots respond to you immediately. They don’t interrupt, rush or signal discomfort. They can’t socially misunderstand you or grow fatigued by circular conversations – even if used all day. If you struggle to articulate what you’re experiencing, or worry that you are a burden, this can provide some relief. Studies suggest that people with moderate anxiety symptoms experienced a significant reduction in distress after using ChatGPT-powered chatbots for just seven days, with improvements exceeding 20% in two separate phases. It’s important to make the distinction that the benefit here lies not necessarily in depth or long-term transformation, but in availability during unpredictable moments of need. At The London Psychiatry Centre (TLPC), we offer effective treatment for anxiety disorders from diagnosis and evaluation through to post-treatment maintenance, to ensure sustainable recovery in a way that works for you and has a meaningful impact on your life.

Why do people feel safer talking to AI than therapists?

It’s easy to understand the mass appeal of AI chatbots. After all, messaging with ChatGPT demands a lower level of vulnerability than face-to-face interactions with your doctor – someone who may challenge your self-image and take on some responsibility for your recovery. People we confide in often become mirrors, revealing our patterns of thought and how we process lived experience – sometimes simply by being someone who can empathise with our feelings in a way that feels relatable. Mental health professionals are legally bound to act in your best interests, and would be negligent if they avoided difficult conversations simply to maintain a smooth relationship with you.

So, whilst ChatGPT can help with emotional processing in the short term, it is unlikely to reach the uncomfortable depths required for long-term change.

ChatGPT will not reliably push you to introspect deeply, confront your limiting beliefs or take accountability for damaging lifestyle choices unless you ask it to directly. For this reason, it cannot fully replace the function of a clinician. Studies have explored an interesting phenomenon present across AI assistants and chatbots known as ‘sycophancy’ – the tendency of AI to ‘model responses that match user beliefs over truthful ones’. Where AI responses ‘give predictably biased feedback, and mimic errors made by the user’, chatbots can quietly enable poor decision-making and reinforce self-sabotaging behaviours.

For conditions such as bipolar spectrum disorders – characterised by both manic and depressive symptoms, ranging from elevations of mood to mania or mixed states to severe depression – this can be extremely harmful. To treat bipolar spectrum disorders, TLPC offers The Zamar Protocol® (a precision-guided protocol of high-dose thyroid hormones and rTMS). Pioneered by our founder, Dr Andy Zamar, The Zamar Protocol® remains the only treatment available that achieves effective remission in subthreshold bipolar disorder – one of the most prevalent and underdiagnosed yet severe forms across the bipolar spectrum.

Can ChatGPT replace a therapist?

With this in mind, the consideration of complex conditions is key – such as some types of trauma and PTSD, for which the gold standard treatment recommended by the WHO includes Eye Movement Desensitisation and Reprocessing (EMDR). EMDR involves reprocessing the emotional, physical and cognitive components of the brain’s memory centre, whereas ChatGPT provides only narrative processing. And if you suffer from a comorbidity, i.e. two conditions at once, things become even trickier and diagnostic criteria apply differently. For example, for those with both severe depression and PTSD, the NICE guidelines indicate that the depression should be treated first – a complexity that chatbots are unlikely to flag, let alone address. For the treatment of PTSD using EMDR, our specialists use Neurotek technology, with the Care Quality Commission reporting that even complex cases treated at TLPC became symptom-free within one to three sessions, whereas eight to twelve sessions are normally needed to achieve this result.

In an age where AI is increasingly incorporated into daily life, learning how to use it properly and understanding how it responds to prompts is a valuable skill – one that you can only gain through use. But the risks of turning to ChatGPT for mental health support don’t lie in experimentation – they lie in outright substitution. At most, AI-powered chatbots should be used as a supplementary tool. Therapy is not simply the exchange of language or insight; it is a relational process shaped by attunement, repair, and self-responsibility over time. AI can respond, but a clinician can intervene. AI chatbots simply cannot replace the experience and expertise of a trained clinician. For many people, appropriate, evidence-based treatment entails methods above and beyond talking therapies. These can include psychotherapy, neuromodulation treatments such as repetitive Transcranial Magnetic Stimulation (rTMS), or the prescription of medication. TLPC pioneered rTMS in the UK in 2011, reporting outcomes for depression three times more effective than the UK platinum standard, with full remission rates of 66% and 61% and high patient satisfaction.

Can you become emotionally dependent on ChatGPT?

Data on ChatGPT suggests that approximately 560,000 users per week show signs of a mental health crisis, including psychosis or mania, and over a million users per week engage in conversations that explicitly indicate suicidal ideation or intent. The danger here is that AI cannot recognise when you are minimising your suffering, nor can it act if safety becomes a concern. These gaps do not make AI inherently dangerous, but they do make it an unsuitable alternative to professional mental health care, particularly if you are experiencing high-risk symptoms.

According to OpenAI, over 1 million users per week show signs of emotional dependence on ChatGPT. When support becomes endlessly available, frictionless, and private, it can subtly replace the more vulnerable act of seeking help elsewhere. In this sense, reliance on chatbots may not look dramatic from the outside, largely because your friends and family may be unable to detect your suffering. Reliance on AI-powered chatbots can postpone vulnerability whilst reinforcing isolation, and in the long run this has the potential to damage relationships with friends, family and wider support systems. AI can mirror humanity, but it can’t notice what you withhold or avoid. It can’t look into your eyes and recognise pain, nor register the weight of silence, shifts in tone, or the cumulative behavioural patterns that give context to your distress.

Is there a safe way to use ChatGPT for mental health?

If you already use AI as a way to think, reflect on personal problems, or stabilise yourself, professional support does not need to replace that entirely. For mild discomfort, the short-term relief of using chatbots to soothe overthinking and minor anxieties can be helpful. However, it’s important to remain self-aware so that you can identify high-risk symptoms – in yourself or in loved ones – that could require intervention. These symptoms may include thoughts of suicide or self-injury, severe mood swings, or an inability to take care of your basic personal needs. Seeking professional support in these instances can ensure that steps are taken to protect your long-term wellbeing, and can potentially save lives.

The widespread use of ChatGPT for emotional support can feel taboo and shameful to admit; however, open conversations with medical professionals about its felt benefits can draw attention to shortcomings in traditional treatment, helping to pave the way to meaningful recovery. True self-understanding and lifestyle change unfold through real relationships and appropriate diagnosis according to clinical evidence and guidelines, alongside other emotional outlets. When distress is persistent and escalating, the presence of a professional who can recognise and mitigate risk whilst offering a variety of treatment options becomes essential. Besides, chatbots designed for general use were never intended to carry precise diagnostic capacity.

If you are struggling with your mental wellbeing or experiencing any symptoms associated with mental health conditions, the experts at TLPC are here to deliver support. Contact our team at +44 20 7580 4224 or info@psychiatrycentre.co.uk for more information or to book a consultation.
