Mother of 12-Year-Old Tumbler Ridge Shooting Victim Sues OpenAI, Alleging Company Failed to Act on Shooter’s Violent ChatGPT Conversations

The lawsuit claims OpenAI failed to alert authorities about violent conversations the shooter allegedly had with ChatGPT before the attack.
[Image: Area where the Tumbler Ridge shooting happened. Credit: Newsgram - X]

The mother of Maya Gebela, a 12-year-old girl seriously injured in the Tumbler Ridge mass shooting in British Columbia, Canada, has filed a civil lawsuit against OpenAI, the developer of ChatGPT, alleging the company failed to alert authorities about the shooter’s violent conversations with the chatbot before the attack.

The lawsuit was filed in the British Columbia Supreme Court by Cia Edmonds, the mother of Maya Gebela, who was critically wounded during the shooting on February 10, 2026. The civil claim is also filed on behalf of Maya’s sister, Dahlia Gebela, who was present during the incident.

Background: The Tumbler Ridge Mass Shooting

The shooting occurred in Tumbler Ridge, British Columbia, and has been described as one of the most serious mass shootings in Canada in recent years.

According to reports, the attacker, Jesse Van Rootselaar, killed her mother and half-brother at home and then opened fire at Tumbler Ridge Secondary School before later dying by suicide. Six people were killed at the school and 27 others were injured in the incident.

During the attack, Maya Gebela was shot three times at close range while trying to close the library door to protect herself and others. Court filings state she suffered bullet wounds to her head, neck, and face.

She sustained a catastrophic brain injury that may lead to permanent cognitive and physical disabilities.

Maya was transported by air ambulance to BC Children’s Hospital in Vancouver, where she continues to receive treatment.

Lawsuit Claims ChatGPT Conversations Contained Warning Signs

The lawsuit alleges that the shooter had extensive conversations with ChatGPT months before the attack, including discussions involving gun violence and mass-casualty scenarios.

According to the claim, these interactions occurred in late spring or early summer of 2025, when the shooter reportedly used the chatbot to explore scenarios involving firearms and violent attacks.

Monitoring systems reportedly flagged the conversations, and they were reviewed by human moderators at OpenAI. The lawsuit alleges that approximately 12 employees identified the content as indicating a possible imminent risk of harm and recommended notifying Canadian law enforcement.

However, the claim states that these recommendations were escalated to company leadership and a decision was made not to inform authorities.

Instead, OpenAI allegedly banned the shooter’s initial ChatGPT account for violating platform policies.


Second Account Allegedly Used After Ban

According to the lawsuit, the shooter subsequently created another ChatGPT account after the first one was banned.

The civil claim alleges that this second account was used to continue discussing violent scenarios involving mass casualty events, including situations similar to the Tumbler Ridge shooting.

The lawsuit argues that OpenAI had knowledge of the shooter’s activities and failed to take further steps that might have alerted authorities.

Claims Presented in the Civil Suit

The lawsuit alleges that ChatGPT served as a “trusted confidante, collaborator, friend and ally” for the shooter and provided information related to planning violent acts.

In a statement, the plaintiff’s law firm, Rice Parsons Leoni & Elliott LLP, described the purpose of the legal action as follows:

“The purpose of this lawsuit is to learn the whole truth about how and why the Tumbler Ridge Mass Shooting happened, to impose accountability, to seek redress for harms and losses, and to help prevent another mass-shooting atrocity in Canada.”

The lawsuit seeks punitive damages and compensation for Maya Gebela, her sister Dahlia, and their mother.

The allegations presented in the civil claim have not yet been proven in court, and OpenAI has not formally responded to the lawsuit at the time of reporting.

Mother Shares Updates on Maya’s Recovery

[Image: A collage of 12-year-old Maya Gebela. Credit: Cia Later - Facebook]

In social media updates cited in news reports, Maya’s mother said her daughter continues to undergo treatment and recovery following the shooting.

“Almost a month has gone by. Still none of this feels real,” Edmonds wrote in an update about her daughter’s condition.

The case raises broader legal questions about the responsibilities of technology companies when users discuss potentially harmful or violent scenarios on AI platforms.

The lawsuit alleges that OpenAI had prior knowledge of concerning conversations but did not notify authorities before the shooting occurred.

Legal proceedings will now determine whether the claims presented in the lawsuit establish liability under Canadian law.

(Rh)

Medbound Times
www.medboundtimes.com