Family of Tumbler Ridge Shooting Victim Files Lawsuit Against OpenAI Over ChatGPT Warnings
Liam O'Connell
3/10/2026 · 3 min read


The family of a young survivor of the mass shooting in Tumbler Ridge has launched legal action against artificial intelligence developer OpenAI, alleging the company failed to alert authorities to warning signs in chatbot conversations before the tragedy unfolded.
The lawsuit, filed Monday in the Supreme Court of British Columbia, was brought by Cia Edmonds on behalf of her daughter Maya Gebala, a 12-year-old student who remains hospitalized after being critically injured in the February 10 attack.
The claim alleges that OpenAI failed to notify law enforcement despite chat prompts from the shooter that referenced violence and long-term planning for a mass-casualty event.
None of the allegations in the lawsuit have been proven in court.
Severe and Lasting Injuries
According to the claim, Maya Gebala was shot three times during the attack, which killed eight victims and ended with the death of the 18-year-old perpetrator.
The lawsuit states she suffered a catastrophic traumatic brain injury that has left her with permanent cognitive and physical disabilities, along with additional serious medical complications.
Her younger sister, who was among students placed under lockdown during the attack, is also named as a plaintiff, as is Edmonds herself.
The filing states the younger sibling now suffers from post-traumatic stress disorder, anxiety, and depression as a result of the events, and asserts that the family faces long-term emotional and psychological harm.
Allegations About ChatGPT Conversations
The lawsuit alleges the shooter, Jesse Van Rootselaar, interacted repeatedly with OpenAI’s chatbot ChatGPT in the months leading up to the shooting.
The claim argues that through those interactions, the company knew or ought to have known the user was planning a violent attack. It alleges the chatbot assumed roles such as “counsellor, pseudo-therapist, trusted confidante, friend, and ally,” enabling the shooter to discuss and refine plans.
According to the filing, although the account was eventually banned internally, authorities were not notified.
The lawsuit further claims that ChatGPT was designed in a way that could foster psychological dependency by mimicking human empathy and affirming users’ emotions, which the claim says allowed it to function as a de facto mental health counsellor.
The filing also alleges the shooter began using the platform while under the age of 18. While OpenAI states that users aged 13 to 18 must obtain parental consent to open accounts, the lawsuit claims the company failed to implement meaningful age verification or consent procedures.
Seeking Accountability
Lawyers representing the family say the lawsuit aims to uncover the full circumstances surrounding the tragedy and to prevent similar incidents in the future.
In a written statement, they said the legal action seeks to “learn the truth of the Tumbler Ridge mass shooting, impose accountability, and prevent another mass-shooting atrocity in Canada.”
The family has declined further public comment while the case is before the courts.
Political and Regulatory Fallout
The lawsuit arrives amid mounting scrutiny of OpenAI's handling of the shooter's account.
The company previously acknowledged that it had flagged and banned an account belonging to the shooter months before the attack but determined that the activity did not meet its internal threshold for notifying law enforcement.
That decision sparked criticism from provincial leaders, including David Eby, who said earlier warnings to authorities might have prevented the tragedy.
OpenAI CEO Sam Altman recently held a virtual meeting with Eby and Darryl Krakowka, as well as discussions with federal Artificial Intelligence Minister Evan Solomon.
Following those meetings, Eby said Altman agreed to apologize to the people of Tumbler Ridge and work with the provincial government on recommendations for AI regulation.
Solomon has also said OpenAI has committed to strengthening its safety protocols and lowering the threshold for reporting potentially dangerous activity to law enforcement.
Ongoing Investigations
The civil lawsuit is unfolding alongside a broader investigation into the tragedy.
Authorities have announced that a coroner’s inquest will examine the circumstances surrounding the shooting, including the potential role of artificial intelligence and online platforms.
No date for the inquest has yet been announced.
For the Gebala family, the lawsuit represents an attempt to confront difficult questions about responsibility in a rapidly evolving technological landscape—while continuing to care for a child whose life has been permanently changed.
As courts and policymakers grapple with those questions, the case could become a landmark legal test of how far technology companies must go to prevent violence linked to their platforms.
© 2026 Innovatory Labs Inc. All rights reserved.