Facebook to use AI to detect suicidal posts

Can software save lives? Facebook believes it can. The company has rolled out an AI system that scans posts and live videos for patterns of suicidal thoughts. Once the AI detects signs of suicidal intent, Facebook identifies and contacts first responders instead of waiting for someone to report the content.

Facebook is a place where family and friends are already on board, so it can help a distressed person connect with the people who can support them in that situation.

Facebook has been working on this for a long time to help people who share thoughts of suicide in text posts or live video. The process includes the following (a rough sketch of the flow appears after the list):

  • Matching patterns of suicidal thoughts in text posts and live videos.
  • Improving how the right first responders are identified among family and friends.
  • Dedicating more reviewers from the Community Operations team to review reports of suicide or self-harm.
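
Conceptually, these steps form a pipeline: a post is scored for risk, high-risk posts are escalated to human reviewers, and confirmed cases lead to outreach to a first responder. The sketch below is purely illustrative; the names (Post, score_suicide_risk, handle_post) and the threshold are assumptions, not Facebook's actual implementation.

    # Illustrative sketch of the detection-to-escalation flow described in the list above.
    # All names and the threshold are hypothetical; Facebook has not published its internals.
    from dataclasses import dataclass

    RISK_THRESHOLD = 0.8  # assumed cutoff for escalating a post to human review

    @dataclass
    class Post:
        author: str
        text: str
        is_live: bool = False

    def score_suicide_risk(post: Post) -> float:
        """Stand-in for the trained classifier that matches patterns of
        suicidal language in posts and live-video transcripts."""
        worrying_phrases = ("want to end it", "can't go on", "goodbye everyone")
        hits = sum(phrase in post.text.lower() for phrase in worrying_phrases)
        return min(1.0, hits / len(worrying_phrases) + (0.2 if post.is_live else 0.0))

    def handle_post(post: Post) -> None:
        """Escalate high-risk posts to a human reviewer, who can then
        contact a first responder or offer support resources."""
        risk = score_suicide_risk(post)
        if risk >= RISK_THRESHOLD:
            print(f"Escalating post by {post.author} (risk={risk:.2f}) to Community Operations")
        else:
            print(f"No escalation for post by {post.author} (risk={risk:.2f})")

    handle_post(Post(author="alice", text="Goodbye everyone, I can't go on", is_live=True))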

Facebook previously tested this AI in the US and is now rolling it out to the rest of the world, excluding the EU, where privacy laws complicate the use of this technology.

“This is about shaving off minutes at every single step of the process, especially in Facebook Live,” says VP of product management Guy Rosen. Over the past month of testing, Facebook has initiated over 100 “wellness checks,” with first responders visiting affected users. “There have been cases where the first responder has arrived and the person is still broadcasting.”

What is the current process?

For now, the whole process is manual and relies on users to report such posts. For example, if you come across a post expressing suicidal thoughts, you need to reach out to the user directly or report the post to Facebook. Facebook has teams working 24/7 who receive reports, prioritize them, and provide people with a number of support options, such as the option to reach out to a friend, along with suggested text templates.

Expanding the process

The manual process is slow and depends on someone reporting the post. In the new process, Facebook uses AI to scan posts and match them against patterns of suicidal thoughts. No one needs to report the post; instead, the AI uses pattern matching to identify people who are expressing thoughts of suicide. Facebook trained the AI by finding patterns in the words and imagery used in posts that were manually reported for suicide risk in the past. It also looks for comments like “are you OK?” and “Do you need help?”
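
As a rough illustration of what finding patterns in the words of previously reported posts might look like, the following toy example trains a small text classifier and raises the score when friends leave concerned comments. The training posts, phrases, and the 0.2 boost are invented for illustration; Facebook's actual models are far more sophisticated and are not public.

    # Toy illustration of learning word patterns from posts previously reported
    # for suicide risk, plus the "concerned comment" signal the article mentions.
    # The example data and numbers are invented; this is not Facebook's model.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Invented training data: 1 = previously reported for suicide risk, 0 = not.
    posts = [
        "I just want the pain to stop, goodbye everyone",
        "I can't take this anymore, there is no point",
        "Had a great day hiking with friends",
        "Excited about starting my new job next week",
    ]
    labels = [1, 1, 0, 0]

    vectorizer = TfidfVectorizer()
    model = LogisticRegression().fit(vectorizer.fit_transform(posts), labels)

    CONCERNED_COMMENTS = ("are you ok", "do you need help")

    def assess(post_text: str, comments: list[str]) -> float:
        """Score a post with the classifier, then raise the score if friends
        have left concerned comments such as 'are you OK?'."""
        score = model.predict_proba(vectorizer.transform([post_text]))[0][1]
        if any(phrase in c.lower() for c in comments for phrase in CONCERNED_COMMENTS):
            score = min(1.0, score + 0.2)  # arbitrary illustrative boost
        return score

    print(assess("Nobody would miss me, I want it all to stop",
                 ["Are you OK? Please message me"]))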

Facebook continues to work on improving the technology and making it more accurate.