China Develops Suicide-Detecting AI Bot

December 10, 2019

Most AI bots are used for customer support, mass posting, content scraping, and criminal mischief. China has found another use for AI bots: detecting potential suicides. The South China Morning Post shared the article, “This AI Bot Finds Suicidal Messages On China’s Weibo, Helping Volunteer Psychologists Save Lives.” Asian countries have some of the world’s highest suicide rates. To combat this, Huang Zhisheng created the Tree Hole bot in 2018 to detect suicidal messages on Weibo, the Chinese equivalent of Twitter. The Tree Hole bot finds at-risk users posting on Weibo, then connects them with volunteers to discuss their troubles. Huang has prevented more than one thousand suicides.

In 2016, 136,000 people died by suicide in China, 17% of the world’s suicides that year. The World Health Organization states that suicide is the second leading cause of death among people ages 15-29. Companies like Google, Facebook, and Pinterest have also used AI to detect users at risk of suicide or self-harm, but one of the biggest roadblocks is privacy concerns. Huang maintains that saving lives is more important than privacy.

The Tree Hole bot takes a different approach from other companies’ systems when it searches for alarming posts:

“The Tree Hole bot automatically scans Weibo every four hours, pulling up posts containing words and phrases like “death”, “release from life”, or “end of the world”. The bot draws on a knowledge graph of suicide notions and concepts, applying semantic analysis programming so it understands that “not want to” and “live” in one sentence may indicate suicidal tendency.

In contrast, Facebook trains its AI suicide prevention algorithm by using millions of real world cases. From April to June, the social media platform handled more than 1.5 million cases of suicide and self-injury content, more than 95 per cent of which were detected before being reported by a user. For the 800,000 examples of such content on Instagram during the same period, 77 per cent were flagged by the AI system first, according to Facebook, which owns both platforms.”
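The quoted description suggests two detection layers: direct keyword matches and co-occurrence of phrases like “not want to” and “live” within a single sentence. A minimal sketch of that idea in Python might look like the following; the phrase lists, function name, and sentence-splitting logic are illustrative assumptions, not the bot’s actual implementation, which works on Chinese text with a knowledge graph and semantic analysis.

```python
import re

# Hypothetical phrase lists, loosely based on the keywords the article
# mentions; the real Tree Hole bot draws on a Chinese-language knowledge
# graph of suicide notions and concepts.
DIRECT_KEYWORDS = ["death", "release from life", "end of the world"]

# Phrase pairs whose co-occurrence in one sentence may indicate risk,
# e.g. "not want to" appearing alongside "live".
CO_OCCURRENCE_PAIRS = [("not want to", "live")]

def flag_post(post: str) -> bool:
    """Return True if a post contains a direct keyword, or if both
    halves of a co-occurrence pair appear in the same sentence."""
    text = post.lower()
    if any(kw in text for kw in DIRECT_KEYWORDS):
        return True
    # Naive sentence split on ., !, ? -- a stand-in for the bot's
    # real semantic analysis programming.
    for sentence in re.split(r"[.!?]", text):
        for a, b in CO_OCCURRENCE_PAIRS:
            if a in sentence and b in sentence:
                return True
    return False
```

In a pipeline like the one described, a scheduler running every four hours would pull recent Weibo posts, pass each through a check like `flag_post`, and route matches to volunteers for follow-up.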

Assisting potential suicide victims is time-consuming, so Huang is developing a chatbot that he hopes can take the place of Tree Hole volunteers. Mental health professionals argue that an AI bot cannot replace a real human, and developers point out that there is not enough data to build an effective therapy bot.

Suicide prevention AI bots are terrific, but instead of relying on volunteers alone, would it be possible, at least outside of China, to create a non-profit organization staffed by both professionals and volunteers?

Whitney Grace, December 10, 2019

