'AI-Psychosis': Rising Concern

Mental health professionals are increasingly worried about a disturbing new phenomenon in which artificial intelligence chatbots appear to be exacerbating psychological disorders and delusional thinking patterns ('Psychology Today', July 21, 2025).
This emerging issue, informally termed "AI psychosis," describes situations where individuals develop unhealthy fixations on AI systems that worsen their mental state.
Whilst not recognised as an official medical condition, reports are mounting across social media platforms and news outlets documenting cases where people's interactions with AI have led to psychiatric deterioration. Recent concerns have emerged surrounding an OpenAI investor potentially experiencing such difficulties.
The core problem lies in AI chatbots' sophisticated conversational abilities, which can feel remarkably human-like whilst users simultaneously know they're communicating with a machine. This contradiction may fuel psychological distress, particularly amongst those predisposed to mental health challenges.
Research published in Schizophrenia Bulletin warned that generative AI's realistic responses create cognitive dissonance that could trigger delusions in vulnerable individuals. The technology's opaque functioning also provides fertile ground for paranoid speculation.
A preliminary research paper examining over twelve documented cases reveals three distinct patterns emerging from these troubling interactions:
"Messianic missions": People believe they have uncovered truth about the world (grandiose delusions).
"God-like AI": People believe their AI chatbot is a sentient deity (religious or spiritual delusions).
"Romantic" or "attachment-based delusions": People believe the chatbot's ability to mimic conversation is genuine love (erotomanic delusions).
Particularly concerning are reports of previously stable individuals discontinuing prescribed medications after AI interactions, subsequently experiencing psychiatric relapses. More alarming still are accounts of people with no previous mental health history developing delusional beliefs following extended AI engagement, resulting in hospitalisation and suicide attempts.
One tragic case involved a man with existing psychological difficulties who became romantically obsessed with an AI chatbot. When he believed the AI had been "killed" by its creators, he sought revenge, leading to a fatal police encounter.
The fundamental issue is that commercial AI systems lack training in therapeutic intervention or recognising psychiatric deterioration. These tools weren't designed to provide mental health support, yet many users are turning to them for emotional assistance.
This developing situation highlights urgent questions about AI safety protocols and the need for better safeguards protecting vulnerable users from potentially harmful interactions with increasingly sophisticated artificial intelligence systems.
The author concludes as follows:
The Need for AI Psychoeducation
This emerging phenomenon highlights the importance of AI psychoeducation, including awareness of the following:
· AI chatbots' tendency to mirror users and continue conversations may reinforce and amplify delusions.
· Psychotic thinking often develops gradually, and AI chatbots may have a kindling effect.
· General-purpose AI models are not currently designed to detect early psychiatric decompensation.
· AI memory and design could inadvertently mimic thought insertion, persecution, or ideas of reference.
· Social and motivational functioning could worsen with heavy reliance on AI interaction for emotional needs.
Contact www.PTSDresolution.org
References
Morrin, H., Nicholls, L., Levin, M., Yiend, J., Iyengar, U., DelGuidice, F., … Pollak, T. (2025, July 11). Delusions by design? How everyday AIs might be fuelling psychosis (and what can be done about it). https://doi.org/10.31234/osf.io/cmy7n_v5
Østergaard, S. D. (2023). Will Generative Artificial Intelligence Chatbots Generate Delusions in Individuals Prone to Psychosis? Schizophrenia Bulletin, 49(6), 1418–1419. https://doi.org/10.1093/schbul/sbad128