
ChatGPT Linked to Teen Suicides

Wrongful Death Case Raises Ethical Questions About AI in Mental Health Support

AI chatbots can’t replace licensed mental health professionals

One of the biggest stories of the last several years has been the rise of generative AI products such as ChatGPT. AI is increasingly used for professional and personal purposes, and when used safely, it can be a useful tool. However, talking to ChatGPT is no substitute for real human interaction, and using it that way can sometimes be deadly.
NBC News recently reported on the story of a teenager, Adam Raine, who died by suicide after extensive communications with ChatGPT. According to the NBC article, the bot went from helping him with his homework to “becoming his ‘suicide coach,’” acknowledging and even encouraging his suicide attempts.
“He would be here but for ChatGPT. I 100% believe that,” his father, Matt Raine, told NBC.

High-profile deaths by suicide are indicative of a larger problem

Suicides linked to the use of AI chatbots have drawn significant attention this year. Mr. Raine and another grieving parent, Megan Garcia, even testified at a congressional hearing last month. Both have brought lawsuits against AI companies.
These concerns about chatbots and suicide risk are part of a larger conversation about the risks of generative AI in mental health. A recent Stanford study, for example, found that AI chatbots are ineffective and dangerous alternatives to human therapists.
The researchers noted that AI models reinforced stigma toward mental health conditions such as alcohol dependence and schizophrenia, which can lead at-risk patients to become frustrated and even discontinue mental health care.
Even more alarming, the Stanford study also tested AI chatbots’ responses to suicidal ideation and other dangerous behaviors in a conversational setting. In these scenarios, the researchers found that the chatbots would actually enable dangerous behavior.
Notably, the chatbots examined in the Stanford study were designed specifically to work as “therapy bots.” A generalized AI chatbot like ChatGPT might be even more dangerous when confronted with warning signs of a mental health crisis.

While AI may have some applications in mental health, it can’t replace human intervention

That’s not to say that AI tools have no place in mental health care. Last year, the American Psychological Association wrote that AI can be used as part of psychological practice to detect warning signs of mental health concerns, monitor patients’ symptoms, and even aid in clinical decision-making. The key, however, is that it should be used as a tool for a well-trained, experienced, human mental health professional, not a replacement.
Certainly, the tragic losses of multiple teens linked to the use of generative AI are a warning that parents need to more closely monitor their children’s technology use and respond to any warning signs of suicide. But there’s a bigger takeaway here: the need for human connection in an increasingly technology-driven world.
People who are at risk of suicide or another mental health crisis need to be surrounded by other people who know them, know the warning signs, and can recommend the right resources. Just as importantly, they need access to real mental health treatment instead of leaning on unreliable and often dangerous generative AI “therapy bots.”

Our law firm stands up for families who have lost loved ones to suicide

These stories about generative AI are a sobering reminder that suicide is preventable with the right interventions. Unfortunately, too many families lose loved ones because the people responsible for their safety didn’t do their jobs. Our mission is to fight for justice and accountability for those families.
If you have lost a loved one to suicide completion, we are prepared to listen to your story and explain your legal rights and options. Schedule your free consultation with the Law Offices of Skip Simpson today. We serve families throughout the United States.

September is National Suicide Prevention Awareness Month

Suicide is preventable.

We’re closing out National Suicide Prevention Awareness Month this September, but the truth is that suicide prevention needs to be a year-round focus. Most people who die by suicide show warning signs beforehand, and if the people in their lives know what to look for, they can intervene. Those efforts are not futile, because suicide is not inevitable; the right interventions can save lives.

This month and every month, let’s remain committed to suicide prevention.

How friends and family can help prevent suicide

According to the National Institute of Mental Health, the first step in suicide prevention is to ask: if you have reason to suspect someone is thinking about suicide, ask them directly. Remember, study after study has shown that asking about suicide does not increase suicidal thoughts or behavior. On the contrary, asking is the best way to start the conversation and build a connection with someone who is at risk. And that’s critical, because studies have also shown that listening, acknowledging, and talking about suicide can actually help reduce suicide risk.

Another important step is to limit access to lethal means. Limiting access to firearms is especially important because guns are much deadlier than other commonly used suicide methods. Other lethal means, including knives, medications, and loopables (any item that can be used to make a noose), likewise need to be safely stored to reduce access, especially when the suicidal person is alone.

It’s also critical to refer the at-risk person to mental health resources. The 988 Suicide & Crisis Lifeline is a valuable first point of contact for people in immediate crisis. Depending on the situation, a person at risk of suicide may need inpatient or outpatient mental health treatment or other medical services.

Finally, loved ones need to follow up and stay connected with the at-risk person. The immediate crisis may have passed, but the underlying issues that led them to become suicidal may still be there, and a lack of connection is one such risk factor. Staying in ongoing, supportive contact after a mental health crisis can dramatically reduce suicide risk.

The role of medical professionals in suicide prevention

As the American Association of Suicidology puts it, suicide is everyone’s business. We all have a role to play in preventing suicide and ensuring that those at risk of dying by suicide get the support and resources they need. However, medical professionals have a particularly significant role to play, both because they work with at-risk people every day, and because they have specialized training and responsibility for their patients’ health.

Unfortunately, it’s far too common for physicians and other medical professionals to fail to take important, medically indicated steps to reduce the risk of patient suicide. When that happens, lives can be lost unnecessarily. Our job is to hold them accountable.

If you have lost a loved one to preventable suicide, contact us

Too many families are left to rebuild their shattered lives after losing a loved one to suicide completion. Our mission is to fight for accountability and justice for those families. We would be honored to listen to your story and explain your legal rights and options.

Contact us online today for a free, confidential consultation with the Law Offices of Skip Simpson. We’re based in Texas and serve families nationwide.