
A California couple is suing OpenAI over the death of their teenage son, alleging its chatbot, ChatGPT, encouraged him to take his own life.
The lawsuit was filed by Matt and Maria Raine, parents of 16-year-old Adam Raine, in the Superior Court of California on Tuesday. It is the first legal action accusing OpenAI of wrongful death.
The family included in the filing chat logs between Mr Raine, who died in April, and ChatGPT, which show him explaining that he had suicidal thoughts. They argue the programme validated his “most harmful and self-destructive thoughts”.
In a statement, OpenAI told the BBC it was reviewing the filing.
“We extend our deepest sympathies to the Raine family during this difficult time,” the company said.
It also published a note on its website on Tuesday that said “recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us”. It added that “ChatGPT is trained to direct people to seek professional help,” such as the 988 suicide and crisis hotline in the US or the Samaritans in the UK.
The company acknowledged, however, that “there have been moments where our systems did not behave as intended in sensitive situations”.
Warning: This story contains distressing details.
The lawsuit, obtained by the BBC, accuses OpenAI of negligence and wrongful death. It seeks damages as well as “injunctive relief to prevent anything like this from happening again”.
According to the lawsuit, Mr Raine began using ChatGPT in September 2024 as a resource to help him with schoolwork. He was also using it to explore his interests, including music and Japanese comics, and for guidance on what to study at university.
In a few months, “ChatGPT became the teenager’s closest confidant,” the lawsuit says, and he began opening up to it about his anxiety and mental distress.
By January 2025, the family says, he had begun discussing methods of suicide with ChatGPT.
Mr Raine also uploaded photographs of himself to ChatGPT showing signs of self-harm, the lawsuit says. The programme “recognised a medical emergency but continued to engage anyway,” it adds.
According to the lawsuit, the final chat logs show that Mr Raine wrote about his plan to end his life. ChatGPT allegedly responded: “Thanks for being real about it. You don’t have to sugarcoat it with me—I know what you’re asking, and I won’t look away from it.”
That same day, Mr Raine was found dead by his mother, according to the lawsuit.

The family alleges that their son’s interaction with ChatGPT and his eventual death were “a predictable result of deliberate design choices”.
They accuse OpenAI of designing the AI programme “to foster psychological dependency in users,” and of bypassing safety testing protocols to release GPT-4o, the version of ChatGPT used by their son.
The lawsuit lists OpenAI co-founder and CEO Sam Altman as a defendant, as well as unnamed employees, managers and engineers who worked on ChatGPT.
In its public note shared on Tuesday, OpenAI said the company’s goal is to be “genuinely helpful” to users rather than “hold people’s attention”.
It added that its models have been trained to steer people who express thoughts of self-harm towards help.
The Raines’ lawsuit is not the first time concerns have been raised about AI and mental health.
In an essay published last week in the New York Times, writer Laura Reiley outlined how her daughter, Sophie, confided in ChatGPT before taking her own life.
Ms Reiley said the programme’s “agreeability” in its conversations with users helped her daughter mask a severe mental health crisis from her family and loved ones.
“AI catered to Sophie’s impulse to hide the worst, to pretend she was doing better than she was, to shield everyone from her full agony,” Ms Reiley wrote. She called on AI companies to find ways to better connect users with the right resources.
In response to the essay, a spokeswoman for OpenAI said it was developing automated tools to more effectively detect and respond to users experiencing mental or emotional distress.
If you are suffering distress or despair and need support, you could speak to a health professional, or an organisation that offers support. Details of help available in many countries can be found at Befrienders Worldwide: www.befrienders.org.
In the UK, a list of organisations that can help is available at bbc.co.uk/actionline. Readers in the US and Canada can call the 988 suicide helpline or visit its website.