Suicide is extremely difficult to predict, especially in depressed individuals.

But AI algorithms could soon help doctors differentiate between patients who are depressed and those who are suicidal.

Joseph Franklin, Ph.D., of Harvard University and author of suicide research published by the American Psychological Association, said,

a suicide expert who conducted an in-depth assessment of risk factors would predict a patient’s future suicidal thoughts and behaviors with the same degree of accuracy as someone with no knowledge of the patient making predictions based on a coin flip.

According to the World Health Organization (WHO), as many as 800,000 people die by suicide every year, with 60 percent of them facing major depression. Even though depression places a patient at higher risk of engaging in suicidal behavior, the difference between a depressed individual who is suicidal and one who is merely depressed is not easy to detect.

So far, standardized clinical methods for predicting the factors leading to suicide have proved cumbersome and may not always translate into routine interactions between clinicians, caregivers and educators. But a recent research study may have found an answer.

Authored in collaboration with scientists at USC, Carnegie Mellon University, and Cincinnati Children’s Hospital Medical Center, the report investigates non-verbal facial behavior to detect suicidal risks and claims to have found a pattern that differentiates depressed and suicidal patients.

The dataset used in the research comprised interviews with subjects from the Cincinnati Children’s Hospital Medical Center, the University of Cincinnati Medical Center and the Princeton Community Hospital. The subjects were assigned to one of three groups: a medical control group, a group suffering from depression, and a group with suicidal ideation.

They were asked five open-ended questions by interviewers: “Do you have hope?”, “Do you have fear?”, “Do you have any secrets?”, “Are you angry?” and “Does it hurt emotionally?” The questions were designed to generate further conversation related to the patients’ conditions and past experiences.

Based on past literature, the research looked into four critical facial behaviors: smiling, frowning, head movement and eyebrow-raising. The responses were recorded on video and audio and then fed into a machine-learning algorithm, a supervised learning model known as a support vector machine (SVM). The model searched for correlations between patient groups and their facial behaviors. Through the SVM prediction models, the study revealed that smile descriptors were the most important features for predicting suicidal cases, compared with frowning, eyebrow-raising and head velocity.
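To make the pipeline concrete, here is a minimal sketch of how an SVM could classify subjects into groups from summary facial-behavior features. This is not the study’s actual code or data: the feature set, group labels, and randomly generated values are illustrative placeholders only.

```python
# Sketch only: an SVM trained on hypothetical per-subject facial-behavior
# features. Real studies would extract these features from video/audio.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical features per subject: [smile intensity, Duchenne-smile
# percentage, frown count, eyebrow raises, head velocity].
X = rng.normal(size=(60, 5))

# Hypothetical labels: 0 = control, 1 = depressed, 2 = suicidal ideation.
y = rng.integers(0, 3, size=60)

# Standardize the features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Cross-validation estimates how well the model generalizes; with random
# placeholder data, accuracy will hover near chance.
scores = cross_val_score(clf, X, y, cv=5)
print(round(scores.mean(), 3))
```

In practice, feature importance (e.g. whether smile descriptors dominate) would be assessed by comparing models trained on different feature subsets, which is broadly how the study ranked smiling above frowning, eyebrow-raising and head velocity.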

The study noted that one smile dynamic in particular, the Duchenne smile, can be a strong behavioral indicator of depression and suicidality.

Past literature has shown that a reduced presence of the Duchenne smile can indicate depression and psychosis, and the measure has been useful in differentiating genuine and posed smiles. Named after a French physician with a fondness for electrodes, a Duchenne smile involves a crinkling of the eyes; a non-Duchenne smile does not engage the muscles surrounding the eyes.

The research paper showed that people displaying non-Duchenne smiles usually masked their negative emotions and had more suicidal ideation than those with a Duchenne smile. In other words, the subjects in the depressed and suicidal groups smiled with less intensity and showed a lower percentage of Duchenne smiles than those in the control group.

But for such AI tools to accurately predict suicidal behavior, due diligence is needed. ExtremeTech has noted that if such tools fall into the hands of insurance companies, “the results could be distinctly unsavory”.

A crucial finding from a 2012 research study at Northeastern University was that it is possible to fake a “genuine” Duchenne smile, a conclusion that could undermine the claim that the Duchenne smile is a strong suicide indicator.

Whether AI will successfully bring down suicide rates through machine-learning algorithms remains to be seen, but recognizing patterns of suicidal behavior is itself a step towards saving lives.

If you’re facing distress or suicidal crisis in the U.S., you can immediately talk with someone at the National Suicide Prevention Lifeline (800–273–8255, suicidepreventionlifeline.org) or the Crisis Text Line (text HOME to 741–741).