CEO Sam Altman: 'It keeps me awake at night'
- CEO Sam Altman warns as many as 1,500 people a week may be at risk
- Policy shift comes after lawsuit over teen's death linked to chatbot
- OpenAI weighs policy change on suicide risk
The company behind ChatGPT could start contacting authorities when young people discuss suicide in conversations with the chatbot, co-founder Sam Altman has said.
In an interview this week, Altman raised fears that as many as 1,500 people globally may be talking about taking their own lives with ChatGPT each week before going on to do so. He admitted the policy change was not yet final but called it very reasonable to alert authorities in cases involving minors where parents could not be reached.
Altman's comments came during a podcast interview with Tucker Carlson, days after he and OpenAI were sued by the family of 16-year-old Adam Raine from California, according to The Guardian.
The lawsuit alleges ChatGPT encouraged the teenager over several months, advising him on whether his chosen method would work and helping him draft a farewell note. Raine died by suicide in April.
Balancing privacy with safety
Altman acknowledged the proposed move would mark a major shift in policy for the San Francisco-based firm, which has more than 700 million global users. "User privacy is really important," he said, noting that ChatGPT currently only urges people expressing suicidal thoughts to contact hotlines.
He added it was unclear which authorities could be notified, or what user information OpenAI might share to help locate someone at risk.
Stronger safeguards for teens
Following Raine's death, OpenAI said it would introduce parental controls and tougher guardrails around sensitive content and risky behaviours for under-18s.
Altman also suggested restricting people in fragile mental states from gaming the system by pretending they are asking suicide-related questions for research or creative writing. "We should say, even if you're trying to write the story or even if you're trying to do medical research, we're just not going to answer," he said.
A global crisis
Altman cited figures suggesting 15,000 people die by suicide every week worldwide, which would equate to around 1,500 ChatGPT users based on its share of global population. The World Health Organization estimates more than 720,000 people take their own lives each year.
A spokesperson for OpenAI pointed to recent pledges to improve one-click access to emergency services and connect users to certified therapists before a crisis point.
If you need immediate support:
- US: 988 Suicide & Crisis Lifeline, call or text 988, or chat at 988lifeline.org
- UK & Ireland: Samaritans, 116 123 (freephone), or email jo@samaritans.org (UK) or jo@samaritans.ie (Ireland)
- Australia: Lifeline, 13 11 14
- International: befrienders.org
Posted: 2025-09-11 15:10:11