As part of its ongoing effort to help build a safe community on and off Facebook, the social media giant has announced that it will use AI tools outside the US to help prevent suicides.
When someone is expressing thoughts of suicide, it’s important to get them help as quickly as possible. Because Facebook is a place where friends and family are already connected, the network will use pattern-recognition technology to detect posts or live videos where someone might be expressing thoughts of suicide.
The company has started rolling out artificial intelligence outside the US to help identify when someone might be expressing thoughts of suicide, including on Facebook Live.
“We use signals like the text used in the post and comments (for example, comments like “Are you ok?” and “Can I help?” can be strong indicators). In some instances, we have found that the technology has identified videos that may have gone unreported,” the company said.
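The signal Facebook describes — comments such as “Are you ok?” acting as strong indicators — can be illustrated with a minimal sketch. The phrase list, function name and scoring are purely hypothetical; the company’s actual model is not public.

```python
# Hypothetical sketch: score a post by counting concern-signal phrases
# in its comments. The phrase list and scoring are illustrative only,
# not Facebook's actual model.
INDICATOR_PHRASES = ["are you ok", "can i help", "please talk to someone"]

def concern_score(post_text, comments):
    """Count how many comments contain a known concern phrase."""
    score = 0
    for comment in comments:
        lowered = comment.lower()
        if any(phrase in lowered for phrase in INDICATOR_PHRASES):
            score += 1
    return score

# Example: two of three comments match indicator phrases.
comments = ["Are you OK?", "Can I help?", "nice photo"]
print(concern_score("...", comments))  # 2
```

A production system would combine many such signals (post text, reactions, reporting history) in a trained classifier rather than a keyword list.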
Facebook has a “Community Operations” team that includes thousands of people around the world who review reports about content on Facebook. The team includes a dedicated group of specialists with specific training in suicide and self-harm, and the company uses artificial intelligence to prioritise the order in which the team reviews reported posts, videos and live streams.
This ensures the right resources can reach people in distress and, where appropriate, first responders can be alerted more quickly.
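The prioritisation step described above can be sketched as a simple priority queue: reported items are reviewed highest-urgency-first. The urgency scores here are hypothetical inputs, assumed to come from an upstream model.

```python
import heapq

# Hypothetical sketch of review-queue prioritisation: reported items are
# popped highest-score-first so the most urgent reach specialists soonest.
# Scores are assumed to come from an upstream classifier.

def prioritise(reports):
    """Return report IDs ordered by descending urgency score."""
    # heapq is a min-heap, so negate scores to pop the largest first.
    heap = [(-score, report_id) for report_id, score in reports]
    heapq.heapify(heap)
    order = []
    while heap:
        _, report_id = heapq.heappop(heap)
        order.append(report_id)
    return order

reports = [("post-1", 0.2), ("live-7", 0.9), ("video-3", 0.6)]
print(prioritise(reports))  # ['live-7', 'video-3', 'post-1']
```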
Facebook has also enhanced its tools to get people help as quickly as possible. For example, reviewers can quickly identify which points within a video receive increased levels of comments, reactions and reports from people on Facebook. Tools like these help reviewers understand whether someone may be in distress and get them help.
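One way to surface “points within a video” with elevated comment activity is to bucket comment timestamps into fixed windows and flag the busy ones. This is a minimal sketch; the window size, threshold and function names are assumptions, not Facebook’s implementation.

```python
from collections import Counter

# Hypothetical sketch: bucket comment timestamps (seconds into a video)
# into fixed windows and flag windows with unusually many comments.
# Window size and threshold are illustrative.

def spike_windows(timestamps, window=30, threshold=3):
    """Return start times of windows whose comment count meets threshold."""
    counts = Counter(ts // window for ts in timestamps)
    return sorted(w * window for w, n in counts.items() if n >= threshold)

# Comments cluster around the start and around the 95-102 second mark.
timestamps = [5, 12, 14, 95, 100, 101, 102, 250]
print(spike_windows(timestamps))  # [0, 90]
```

A reviewer could then jump straight to the flagged offsets instead of watching the whole stream.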
In addition to these tools, the company uses automation so the team can more quickly access the appropriate first responders’ contact information.
Until now, if someone posted something that made you concerned about their well-being, you could reach out to them directly or report the post to Facebook. The social network provides people with a number of support options, such as the option to reach out to a friend, and even offers suggested text templates. Facebook also suggests contacting a helpline and offers other tips and resources for people to help themselves in that moment.
For the past 10 years, Facebook has been working on suicide prevention tools. The approach was developed in collaboration with mental health organisations such as Save.org, the National Suicide Prevention Lifeline and Forefront Suicide Prevention, and with input from people who have had personal experience thinking about or attempting suicide.