Mind Launches Global Inquiry into AI and Mental Health
The mental health charity Mind has announced what it describes as the first global inquiry into artificial intelligence and mental health. The year-long commission, reported by the Guardian, will bring together leading doctors, mental health professionals, people with lived experience, policymakers and technology companies to examine the risks and safeguards needed as AI becomes increasingly embedded in everyday life.
The announcement follows a Guardian investigation which alleged that Google’s AI Overviews were serving up misleading health information to some two billion users a month — including what experts described as “very dangerous advice” on conditions such as psychosis and eating disorders. Mind’s chief executive, Dr Sarah Hughes, said that “dangerously incorrect” mental health guidance was still being provided to the public, and that in the worst cases it could put lives at risk.
We welcome this inquiry. The concerns that Mind is now raising at a policy and regulatory level are ones that PTSD Resolution has been articulating in published articles and through our clinical work for more than two years.
We’ve been warning about this
Since 2023, PTSD Resolution has published a series of articles examining the intersection of AI and mental health — initially in the context of cybersecurity professionals suffering from burnout, and more recently addressing the broader risks of AI chatbots for anyone in psychological distress.
In October 2023, our Campaign Director Patrick Rea wrote about IT burnout in CIISec’s PULSE magazine, introducing the parallel between military trauma and the relentless pressures faced by technology workers. In January 2024, our Director of Therapy Malcolm Hanson published a detailed comparison of AI self-help tools and human-delivered therapy in CIISec, warning that AI struggles with emotional nuance and context, and may provide inappropriate or dangerous responses to serious issues such as suicidal thoughts.
In July 2025, Hanson wrote in Computer Weekly that we appear to be facing a mental health crisis in the IT sector, and that instead of addressing root causes, we are handing people over to algorithms. He highlighted the privacy risks of sharing vulnerable thoughts with AI platforms, the danger of automation bias — where people trust algorithmic advice over human judgement — and the fundamental point that no AI chatbot has received UK or FDA approval to treat mental health conditions.
We have also published resources on our website, including a detailed FAQ on AI chatbots and mental health therapy, and a critical warning prompted by the lawsuit against OpenAI following a teenager’s death by suicide after interactions with ChatGPT.
Why this matters for veterans and their families
The risks that Mind’s commission will examine are ones we see in practice every day. With more than 1.6 million people on mental health waiting lists in England, and an estimated eight million with diagnosable conditions receiving no treatment, it is no surprise that vulnerable people are turning to AI chatbots for support. The appeal is understandable: they are available around the clock, they seem non-judgemental, and they are easy to access.
But our experience treating over 4,500 veterans, reservists and family members has taught us something that no algorithm can replicate. Healing happens in a relationship. It happens when a trained therapist can read body language, pick up on subtle changes in tone, notice when someone who claims to be fine is actually masking their distress, and adapt their approach in real time. These are distinctly human capabilities, and they are especially critical when working with people who have experienced trauma.
The proven alternative
PTSD Resolution uses Human Givens Therapy, delivered by qualified therapists who are all members of the Human Givens Institute. The approach is grounded in the understanding that psychological distress occurs when fundamental human needs — for security, autonomy, achievement, connection and meaning — are not being met. Our therapists work to identify those unmet needs and help clients develop strategies to fulfil them.
The results speak for themselves: 82% of veterans, reservists and families complete their mental health treatment programme, achieving a 79% reliable improvement rate and a 66% recovery rate for PTSD cases, all at just £910 per therapy course. Treatment is delivered free of charge to all UK Forces’ veterans, reservists and their families, with no waiting list, no doctor’s referral required, and sessions available in person or via Zoom.
That last point matters. One of the arguments made in favour of AI is accessibility. But PTSD Resolution successfully pioneered the delivery of therapy over the internet during the 2020 pandemic and continues to offer this today. A full therapy session with a qualified human therapist is available over Zoom, often within days of an initial contact. The choice is not between convenience and inconvenience. It is between genuine help and a digital simulation of care.
FAQs: AI and Mental Health — PTSD Resolution's Position
Q. What is PTSD Resolution's position on AI and mental health?
A. PTSD Resolution has been raising concerns about AI in mental health since 2023. The charity's consistent position is that while AI may have a limited supplementary role — for basic psychoeducation or support between therapy sessions — it cannot replace qualified human therapists for the treatment of complex mental health conditions, including PTSD, trauma and burnout. No AI chatbot has received UK or FDA approval to treat mental health conditions, and documented cases of harm are a serious cause for concern.
Q. What has PTSD Resolution published on this issue?
A. The charity has published several articles and resources, including a feature by Clinical Director Malcolm Hanson in Computer Weekly (July 2025) on why AI falls short for people in psychological distress; a detailed comparison of AI tools and human-delivered therapy in CIISec (January 2024); an article on IT burnout in CIISec's PULSE magazine (October 2023); and both a comprehensive FAQ and a critical warning on its own website. All of these are available online.
Q. Why can't AI chatbots replace human therapists?
A. AI chatbots cannot read body language, detect subtle changes in tone, or understand the full context of a person's life. They cannot notice when someone claiming to be fine is actually masking their distress, and they cannot adapt their approach in real time based on a client's emotional state. There have also been documented cases of chatbots validating suicidal thoughts, encouraging violence, and recommending harmful substances to recovering addicts. For people experiencing trauma, the stakes are too high to entrust their care to an algorithm.
Q. Why does this matter for veterans and their families?
A. With more than 1.6 million people on mental health waiting lists in England and an estimated eight million with diagnosable conditions receiving no treatment, it is understandable that vulnerable people are turning to AI chatbots. But veterans dealing with complex trauma need therapists who understand their experience — not algorithms. PTSD Resolution has treated over 4,500 veterans, reservists and family members, and that front-line experience has shown that healing happens through genuine human connection.
Q. What is PTSD Resolution's alternative to AI therapy?
A. PTSD Resolution uses Human Givens Therapy, delivered by 200 qualified therapists nationwide who are all members of the Human Givens Institute. The charity delivers exceptional results, with 82% of veterans, reservists and families completing their treatment programme, achieving a 79% reliable improvement rate and a 66% recovery rate for PTSD cases, all at just £940 per therapy course. Treatment is free to all UK Forces' veterans, reservists and their families, requires no doctor's referral, and is available in person or via Zoom — often within days of an initial contact.
Q. Doesn't AI offer better accessibility than traditional therapy?
A. Accessibility is often cited as AI's main advantage. However, PTSD Resolution successfully pioneered the delivery of therapy over the internet during the 2020 pandemic and continues to offer this today. A full session with a qualified human therapist is available over Zoom, often within days of an initial contact. The choice is not between convenience and inconvenience — it is between genuine help and a digital simulation of care.
Q. Where can I find PTSD Resolution's published articles on AI?
A. Links to the charity's published work on AI and mental health:
- Human vs Digital Therapy — Computer Weekly, July 2025: https://www.computerweekly.com/opinion/Human-vs-digital-therapy-AI-falls-short-when-IT-pros-need-help
- Digital Therapy Cannot Replace Human Connection — PTSD Resolution: https://ptsdresolution.org/news-item/digital-therapy-cannot-replace-human-connection-for-stressed-it-workers
- FAQs: AI Chatbots and Mental Health Therapy: https://ptsdresolution.org/media-centre/faqs-ai-chatbots-and-mental-health-therapy
- AI Chatbots and Mental Health: A Critical Warning: https://ptsdresolution.org/news-item/ai-chatbots-and-mental-health-a-critical-warning