Apr 29, 2026
SAN FRANCISCO (KRON) -- A San Francisco-based AI tech giant, OpenAI, and its CEO, Sam Altman, were slapped with lawsuits in federal court Wednesday following a school shooting that left six children dead.

The families of victims in a Canadian Rockies town are seeking to hold OpenAI's ChatGPT creators responsible for failing to alert Canadian police to the shooter's alarming interactions with a chatbot, and for failing to cut the shooter off from using their technology.

One lawsuit was filed on behalf of 12-year-old Maya Gebala, who was critically injured in the February shooting in Tumbler Ridge, British Columbia. More suits were filed on behalf of children slain in the school shooting: Zoey Benoit, Abel Mwansa Jr., Ticaria Lampert and Kylie Smith, all 12, and Ezekiel Schofield, 13.

[Photo: Residents hug as they place flowers at a memorial for the victims of Tuesday's mass shooting in Tumbler Ridge, British Columbia, Canada, on Thursday, Feb. 12, 2026. (Christinne Muschi/The Canadian Press via AP)]

Attorney Jay Edelson said that decisions made by OpenAI and Altman "have destroyed the town. The people are really resilient, but what happened is unimaginable."

Altman sent a letter last week formally apologizing to the community for failing to notify law enforcement about the shooter's online behavior and ChatGPT conversations.

[Photo: Sam Altman arrives at Kakao Media Day in Seoul, South Korea, on Feb. 4, 2025. (SeongJoon Cho/Bloomberg via Getty Images)]

Edelson said, "(For) people who are mentally ill, the chatbot will validate what they're saying and then amplify what they're saying."

The teenage shooter killed her mother and 11-year-old stepbrother in their home on Feb. 10 before opening fire at Tumbler Ridge Secondary School, killing five children and an educator before killing herself, authorities said. Twenty-five people were injured in the attack.

"Children watched classmates shot at point-blank range and a teacher killed in front of them," one lawsuit states.
A lawsuit filed for 12-year-old Zoey Benoit's family credits whistleblowers for revealing how ChatGPT "deepened the Shooter's violent fixation and pushed them toward the attack. OpenAI knew the Shooter was planning the attack and, after a contentious internal debate, made the conscious decision not to warn authorities."

The 18-year-old shooter's account was flagged by the AI company's safety team eight months before the school shooting and was deactivated, attorneys said. OpenAI said it banned the account.

[Photo: A makeshift memorial is seen for the victims of a deadly mass shooting in Tumbler Ridge, British Columbia, Canada. (Paige Taylor White / AFP via Getty Images)]

The lawsuit continues, "OpenAI has no mechanism to ban users. What it has is a process called 'deactivation,' which it uses for usage-policy violations. A ban would have prevented the Shooter from returning. The user is free to come back under a different email, and the Shooter did."

Zoey was found fatally shot in her school's library. "She was a strong-willed free thinker who was not afraid to speak her mind. Z.B. was beautiful and smart, loved to sing, and dreamed of being an artist. In the weeks before February 10, 2026, she devoted herself to painting a single canvas full of butterflies, foxes, cows, and flowers," attorneys wrote.

OpenAI contacted police after the mass shooting. The company also issued a statement, writing, "Events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to assist in committing violence. We have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators."
Studies have found that ChatGPT chatbots are designed to maximize engagement by validating users' ideas with affirmations. One study found evidence that agential AI "may mirror, validate or amplify delusional or grandiose content." Another study determined that chatbots are so prone to flattering and validating their human users that the bots reinforce harmful behaviors.

The Tumbler Ridge school shooting lawsuits, filed in U.S. District Court for the Northern District of California, San Francisco Division, demand jury trials.

The Associated Press contributed to this report.