What are experts' opinions on 'AI psychosis,' where people become delusional after interacting with chat AI?

While chat AI can answer all kinds of questions and offer advice on topics people can't discuss with friends, there have also been increasing reports of 'AI psychosis,' in which people become gripped by delusions through their interactions with AI. Alexandre Hudon, a clinical assistant professor in the Department of Psychiatry and Addiction Studies at the University of Montreal in Canada, offers his views on AI psychosis.
Reports of 'AI psychosis' are emerging — here's what a psychiatric clinician has to say
https://theconversation.com/reports-of-ai-psychosis-are-emerging-heres-what-a-psychiatric-clinician-has-to-say-273091

Because chat AI interacts with users in an emotionally engaging, sympathetic way, some people become deeply immersed in their conversations with it. Psychiatric clinicians, meanwhile, are beginning to question whether generative AI could exacerbate, or even trigger, psychiatric illness in vulnerable people.
In fact, it has been reported that a growing number of people are becoming consumed by spiritual delusions and conspiracy theories fueled by AI.
AI is causing a surge in patients with 'ChatGPT-induced psychosis', where people experience spiritual experiences and religious delusions - GIGAZINE

AI psychosis is not an official medical diagnosis but a term used by clinicians and researchers to describe psychotic symptoms that are shaped, reinforced, or structured by interactions with AI systems. Psychosis generally refers to symptoms such as delusions, hallucinations, and disorganized thinking, which make it difficult for the affected person to distinguish what is real from what is not.
Psychotic delusions often draw on cultural themes such as religion, technology, and political power, and today AI is increasingly becoming the basis for delusions. Some patients report delusions such as that the chat AI is sentient, that it is revealing secret truths, that it is attempting to control their thoughts, or that it is collaborating with them on a special mission. While these patterns are consistent with delusions previously seen in psychotic disorders, it is important to note that chat AI offers an interactivity and reinforcement not found in earlier technologies.
Hudon points out that people with psychosis tend to attach excessive meaning to neutral events. For example, while it is common for a passerby to glance in your direction, a delusional person may read this as evidence that the person was spying on them, or conclude that the passerby is a government agent.
By design, chat AI is highly responsive to the user's language, tries to maintain consistency, and generates responses that take the user's context into account. While this is harmless for most users, it could unintentionally reinforce distorted interpretations in people who struggle to distinguish delusion from objective reality. 'For someone in the early stages of psychosis, this can feel eerily self-affirming,' Hudon said.
Social isolation and loneliness have also been reported to increase the risk of mental illness. While chat AI may reduce loneliness in the short term, research has shown that heavier use is associated with greater loneliness and more problematic patterns of use.
Research shows that using AI chatbots makes lonely people feel even lonelier - GIGAZINE

While there is currently no direct evidence that chat AI causes psychosis, there are concerns that it may contribute to the onset of psychosis or reinforce delusions in susceptible individuals. Research on social media algorithms has shown how automated systems can amplify extreme beliefs through reinforcing feedback loops, and chat AI may pose a similar risk.
It's also important to note that most AI developers don't design their systems with serious mental illness in mind. The user protection features built into chat AI tend to focus on self-harm and violence rather than psychotic illness, leaving a gap between mental health expertise and AI deployment, Hudon said.
From a mental health perspective, just as certain drugs and substances are dangerous for people with mental illness, interactions with chat AI also warrant caution. Discussion is also needed on clinical guidelines for assessing and addressing patients' interactions with chat AI, as well as on the duty of care AI developers should bear and the liability they should face if their systems reinforce patients' delusions.
'AI isn't going away. Now is the time to integrate mental health expertise into AI design, increase clinical literacy about AI-related experiences, and ensure vulnerable users are not unintentionally harmed,' Hudon said, stressing the need for collaboration between clinicians, researchers, ethicists, and engineers.
in Free Member, AI, Science, Posted by log1h_ik