SAM app helps counselors, parents detect language of teen suicide

Any parent of a teen knows all too well the ups and downs of those tumultuous years.

At best, parents feel utterly infuriated. At worst, completely helpless and shaken to the core — especially if kids harm, or threaten to kill, themselves.

Parents are conditioned to look and listen for cries for help. Turns out they may miss a lot.

Suicide remains the third leading cause of death among kids ages 10 to 14, and the second leading cause among people ages 15 to 34. Every 14 minutes, someone dies by suicide.

'Amazing, really amazing'

Enter John Pestian and his team of researchers with big brains and even bigger hearts.

Pestian, a professor in the divisions of Biomedical Informatics and Psychiatry at Cincinnati Children’s Hospital Medical Center, has spent nearly a decade immersed in the language of suicide in an effort to keep kids alive.

What he has found is that spoken language contains indicators that can help school counselors and medical professionals identify when kids are at risk for suicide, or when they may be suffering from a mental illness but are not at risk for suicide.

Essentially, there is science behind the words we use. Those words, along with non-verbal cues such as pauses, tone and pitch, are what Pestian calls "thought markers," according to his most recent work, published earlier this month in the official journal of the American Association of Suicidology.

Using that information, technology wizards built algorithms to power an app being studied in eight Cincinnati-area schools this year. The app, which looks incredibly simple, is called SAM, which stands for Spreading Activation Mobile. SAM records a teen's conversation during counseling sessions and uses Pestian's technology to analyze the words the teen uses, determining whether the language resembles that of someone at risk for suicide. The app also detects when a teen's language is just typical teen speech: It might be full of angst, but it is not akin to suicidal speech.

"It's amazing, it's really amazing," Pestian says.

His work has been repeatedly called groundbreaking.

Machine can't replace humans

In the recently published study, conducted from October 2013 to March 2015 in three different emergency departments, suicidal subjects were found to laugh less and sigh more. They also expressed more anger and exhibited less hope than subjects who were deemed to suffer from a mental illness but were not suicidal, or who were neither mentally ill nor suicidal.

The study showed that the technology, which relies on machine learning, is up to 93 percent accurate in identifying a suicidal person, and 85 percent accurate in classifying a person into one of three groups: suicidal, mentally ill but not suicidal, or neither.
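The study's actual models are trained on real clinical transcripts, and the published work does not include its feature set or code. Purely as an illustration of the underlying idea, the toy sketch below compares a transcript's word counts against two marker vocabularies (both invented here, not drawn from the study) using cosine similarity:

```python
from collections import Counter
import math

# Hypothetical marker vocabularies -- illustrative stand-ins only,
# NOT the study's real "thought markers."
AT_RISK_MARKERS = Counter(["hopeless", "alone", "burden", "tired", "never"])
TYPICAL_MARKERS = Counter(["school", "friends", "annoying", "homework", "weekend"])

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def classify(transcript: str) -> str:
    """Compare a transcript's vocabulary to each marker list."""
    words = Counter(transcript.lower().split())
    risk_score = cosine(words, AT_RISK_MARKERS)
    typical_score = cosine(words, TYPICAL_MARKERS)
    return "flag for clinician review" if risk_score > typical_score else "typical teen speech"

print(classify("i feel hopeless and alone like a burden"))
print(classify("school was annoying but the weekend with friends was fine"))
```

A real system would be trained on labeled clinical data and would also weigh acoustic features such as pauses, tone and pitch; this sketch shows only the word-comparison idea.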

An ongoing experimental phase of his research includes video recording the conversations to study facial movements to see if there are further clues embedded in the face. Early research seems to indicate, for example, that non-suicidal teens showed more teeth in a 90-minute conversation than suicidal teens. And suicidal teens gazed down six times during a 90-minute interview, while their non-suicidal counterparts looked down just twice, he said.

"These are things you just don't pick up in conversation," he says.

Pestian is quick to point out, however, that artificial intelligence can only go so far. It is not — and never will be, nor should be — a replacement for clinicians, therapists or others who work with teens.

"The technology is not going to stop the suicide; the technology can only say, 'We have an issue over here,'" says Pestian, whose work has won patents and has been published around the world. "Then we have to intervene and get a path to get to care.

"If it's just a machine it is useless," he says.

Suicide every 14 minutes

Pestian first started his work by studying the language in notes written by 1,319 people shortly before they died by suicide. He and his team then moved on to interviewing 379 teenage patients at three hospitals, including Cincinnati Children's. Some of the teenage patients were in the emergency rooms because they were exhibiting suicidal thoughts or tendencies, while others went there for other conditions.

He still shakes his head at how prevalent suicide is among teens.

Based on a 2013 Centers for Disease Control report:

— 17 percent of high-school-age kids seriously considered attempting suicide in the previous 12 months;

— 13.6 percent had made a plan for how they would attempt suicide, and 8 percent of students attempted suicide one or more times;

— suicide results in an estimated $51 billion in combined medical and work-loss costs.

"The purpose of SAM is to find the kids earlier, because suicide is an avoidable death; it's preventable death. If we just get them early and if we know what to look for; if the parents know what to do," Pestian says. "It's miserable on the family. It hurts just to call."

Ben Crotte, a behavioral health therapist at the Children's Home of Cincinnati and lead therapist at the School for Creative and Performing Arts, sees great promise in Pestian's work. He tested SAM last year and is using it again this year with his clients, all of whom have been previously diagnosed with mental illness, as part of the ongoing research.

"It's extremely easy to use. It's super simple," Crotte says, adding that all of his clients know he is using it and they are involved in the study. "It's really a minimal thing we are incorporating into our work."

If you or someone you know needs help:

National Suicide Prevention Lifeline, 800-273-8255

