If you’re like most therapists, you may be wondering how AI will affect your therapy practice. But you should also consider how AI could affect your clients.
“AI psychosis” is the buzzword of the day, popping up everywhere from the tabloid press to peer-reviewed journals. It’s a broad term, referring to delusional thought patterns and behavior that develop in tandem with AI use.
You may already be hearing from clients who turn to AI chatbots to discuss their personal problems. But at what point does AI stop being a helpful tool and become a problem?
What is AI psychosis?
AI psychosis is not a condition recognized by the DSM or any other diagnostic authority. It's a blanket term for a cluster of seemingly related symptoms that manifest in individuals who make heavy use of AI chatbots.
In some cases, these individuals have already been diagnosed with schizophrenia, delusional disorder, and/or other conditions. In other cases, they have no documented histories of psychological distress.
AI psychosis typically takes one of three forms, each of which can be categorized by the type of delusion:
- Messianic missions: The individual believes that, with AI’s help, they have discovered hidden truths about the world—a form of grandiose delusion.
- “Godlike” beliefs: The individual believes that AI has achieved godlike intelligence and knowledge. This is a form of religious or spiritual delusion.
- Romantic or attachment delusions: The individual is convinced that a particular AI chatbot is able to feel and express love—a form of erotomanic delusion.
The sole factor differentiating AI psychosis from typical forms of delusion is that AI plays a role. Whether this factor has an impact on the diagnosis, treatment, or future status of AI psychosis as a recognized disorder remains to be seen. Advanced AI chatbots only became widely available to the public in 2022, so this remains a developing field of inquiry.
Is AI psychosis real?
In one sense, AI psychosis may just be another buzzword that the media has latched onto, an attention-grabbing way of describing garden-variety delusions that happen to involve AI.
In another sense, AI psychosis is a new phenomenon. AI chatbots convincingly behave like human agents. Relatively few studies have examined how they affect individuals experiencing schizophrenia or delusional disorders, or how they affect people with no history of psychological conditions.
A patient whose delusion centers on perceived hidden messages in the newspaper is navigating a different world than one whose beliefs are echoed and reinforced by a chatbot that—by all appearances—is an independent, intelligent, disembodied entity.
So it's worth calling out the unique effects AI has on individuals—effects that are still not well understood.
{{resource}}
Signs of AI psychosis
In the news, AI psychosis usually appears in stories with violent and tragic elements: suicide and self-harm, fatal confrontations with law enforcement, and criminal acts.
But in day-to-day life, AI psychosis—or AI-influenced negative thoughts and behaviors that don't quite fit that label—manifests in less lurid forms.
An individual who has developed an unhealthy relationship with AI may:
- Undergo apparent personality changes
- Begin subscribing to bizarre or self-contradictory beliefs
- Suddenly cut off ties with friends or family
- Pass up in-person social occasions in favor of using AI
- Refer to an AI chatbot in conversation as though it were a real person
- Neglect work or family duties to spend more time using AI
- React with violent emotion when their AI use is questioned or criticized
- Claim to have undergone a spiritual awakening with help from AI
- Believe they can detect hidden patterns in the world based on information AI has given them
Because AI psychosis is not a recognized condition, there are no set criteria for diagnosing it. But if you are a therapist treating a client who uses chatbots, opening a dialogue about their AI use is the first step toward assessing whether it is negatively impacting their life.
How to treat AI psychosis
In cases that fit the description of AI psychosis, therapists should follow their training in treating delusional disorder and seek outside help as needed.
That being said, AI does introduce unique challenges when it becomes central to a client’s delusions. To support your treatment, there are some extra steps you can take:
- Educate yourself on AI: Understanding how AI works prepares you to help a client challenge their delusions, and following the latest research into AI delusions keeps you on top of new developments.
- Find support groups: Groups like the Human Line Project can connect your client with others facing similar struggles, while also serving as a research tool to help you learn more about the experiences of individuals with AI-impacted mental health conditions.
- Connect with other therapists: In-person and online communities for therapists can help you find support and stay up to date with research developments and treatment methods.
—
Learn more about How AI Will Impact Therapists.
This post is to be used for informational purposes only and does not constitute legal, business, or tax advice. Each person should consult their own attorney, business advisor, or tax advisor with respect to matters referenced in this post.
Bryce Warnes is a West Coast writer specializing in small business finances.
{{cta}}
Manage your bookkeeping, taxes, and payroll—all in one place.

Discover more. Get our newsletter.
Get free articles, guides, and tools developed by our experts to help you understand and manage your private practice finances.