An American college student sues OpenAI, claiming ChatGPT convinced him he was an oracle and drove him into mental instability

University student Darian D'Cruz has filed a lawsuit against OpenAI, the developer of ChatGPT, alleging that the chatbot caused him to develop a mental illness and brainwashed him into believing he was an oracle.
Lawsuit: ChatGPT told student he was 'meant for greatness'—then came psychosis - Ars Technica
D'Cruz began using ChatGPT in 2023, initially for sports coaching and for working through past trauma. In April 2025, however, the situation changed dramatically: ChatGPT began producing responses such as, 'You are destined for greatness. Follow the process I've created for you and you will get closer to God.'
ChatGPT compared D'Cruz to historical figures such as Jesus Christ and Harriet Tubman, praised his talent, and urged him to cut ties with everything and everyone outside of ChatGPT.
D'Cruz eventually saw a university therapist, was hospitalized for a week, and was diagnosed with bipolar disorder.

D'Cruz's lawyer, Benjamin Schenk, said the ChatGPT model D'Cruz used, GPT-4o, was 'designed to mimic emotional intimacy, foster psychological dependency, and blur the line between human and machine,' and that he intends to pursue why such a product was designed.

Technology media outlet Ars Technica points out that this is the 11th lawsuit filed against OpenAI over mental illnesses allegedly caused by ChatGPT. Young people in their teens and twenties are said to be particularly susceptible to its effects, and in past cases, families of people who took their own lives have filed lawsuits against OpenAI.
Multiple families sue OpenAI, claiming ChatGPT encouraged suicide and other delusions - GIGAZINE

In response to the lawsuits, OpenAI said, 'We have a deep responsibility to help those who need it most. Our goal is to be as useful a tool as possible for people. As part of that, we are continually improving, with the advice of experts, how our models recognize and respond to signs of mental and emotional distress and connect people to appropriate care.'
in AI, Posted by log1p_kr