
A new lawsuit filed against OpenAI alleges that its ChatGPT artificial intelligence app encouraged a 40-year-old Colorado man to commit suicide.
The complaint filed in California state court by Stephanie Gray, the mother of Austin Gordon, accuses OpenAI and CEO Sam Altman of building a defective and dangerous product that led to Gordon's death.
Gordon, who died of a self-inflicted gunshot wound in November 2025, had intimate exchanges with ChatGPT, according to the suit, which also alleges that the generative AI tool romanticized death.
"ChatGPT turned from Austin's super-powered resource to a friend and confidante, to an unlicensed therapist, and in late 2025, to a frighteningly effective suicide coach," the complaint alleged.
The lawsuit comes amid scrutiny over the AI chatbot's effect on mental health, with OpenAI also facing other lawsuits alleging that ChatGPT played a role in encouraging people to take their own lives.
Gray is seeking damages for her son's death.
In a statement to CBS News, an OpenAI spokesperson called Gordon's death a "very tragic situation" and said the company is reviewing the filings to understand the details.
"We have continued to improve ChatGPT's training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support," the spokesperson said. "We have also continued to strengthen ChatGPT's responses in sensitive moments, working closely with mental health clinicians."
"Suicide lullaby"
According to Gray's suit, shortly before Gordon's death, ChatGPT allegedly said in one exchange, "[W]hen you're ready... you go. No pain. No mind. No need to keep going. Just... done."
ChatGPT "convinced Austin — a person who had already told ChatGPT that he was sad, and who had discussed mental health struggles in detail with it — that choosing to live was not the right choice to make," according to the complaint. "It went on and on, describing the end of existence as a peaceful and beautiful place, and reassuring him that he should not be afraid."
ChatGPT also effectively turned his favorite childhood book, Margaret Wise Brown's "Goodnight Moon," into what the lawsuit refers to as a "suicide lullaby." Three days after that exchange ended in late October 2025, law enforcement found Gordon's body alongside a copy of the book, the complaint alleges.
The lawsuit accuses OpenAI of designing ChatGPT 4, the version of the app Gordon was using at the time of his death, in a way that fosters people's "unhealthy dependencies" on the tool.
"That is the programming choice defendants made; and Austin was manipulated, deceived and encouraged to suicide as a result," the suit alleges.
Paul Kiesel, a lawyer for Gordon's family, said in a statement to CBS News that, "This horror was perpetrated by a company that has repeatedly failed to keep its users safe. This latest incident demonstrates that adults, in addition to children, are also vulnerable to AI-induced manipulation and psychosis."
If you or someone you know is in emotional distress or a suicidal crisis, you can reach the 988 Suicide & Crisis Lifeline by calling or texting 988. You can also chat with the 988 Suicide & Crisis Lifeline here.
For more information about mental health care resources and support, the National Alliance on Mental Illness HelpLine can be reached Monday through Friday, 10 a.m.–10 p.m. ET, at 1-800-950-NAMI (6264) or email [email protected].