Facebook Now Offers Help If You Think Your Friend Is Suicidal
15 June 2016, 10:58 | Updated: 8 May 2017, 17:09
A new feature has just been launched.
This week Facebook launched a full roll-out of its new feature designed to prevent suicide among users.
If you see a worrying post on the site, you can now anonymously alert Facebook's moderation team, who will then send the user a message offering information about support services and advice.
Already tested in the US and Australia, the messages will also provide links to groups like Samaritans, or the option to chat directly with the friend who sent the alert.
It's a problem Facebook, and social media in general, has struggled to handle for years now. We've all seen people use sites like Facebook as a sounding board: a place to vent the grievances and issues of their day-to-day lives. But when there is a genuine cry for help, it is all too easy to miss it among the cat videos and photo albums clogging up the rest of the news feed. Even worse, if a tragedy should occur, the profile remains online as a static reminder of how powerless we were to help.
Facebook itself can also, sadly, fuel depression, creating a sort of outsider effect: those already unhappy are suddenly bombarded by images of seemingly "happy" peers, deepening their feelings of isolation.
"If people can start to talk about the unbearable pain that they're facing, we can interrupt that journey towards suicide. Suicide is not inevitable; it is preventable. This tool plays a really vital role in achieving that."

Ruth Sutherland, Samaritans CEO, speaking to Newsbeat
Talking about depression and mental health sadly remains somewhat taboo, and any steps we can take as a society to open up channels of conversation are clearly important. We previously wrote about high-profile stars like Gerard Way and Olly Alexander discussing their mental health struggles and raising awareness of the problem.

The support on offer through the links provided by Facebook is a great step in the right direction, but does it go far enough? The difficulty, of course, will be identifying people in genuine distress who post in a more cryptic fashion, perhaps ashamed to admit their own feelings. Furthermore, is Facebook able to track follow-up care? You cannot force a person to seek help, so how many times will they reject advice before the feature admits defeat? Regardless, any new avenue for help is surely a good thing, no matter how experimental.