In addition to the National Suicide Prevention Lifeline, Facebook itself has a way of reporting potential suicidal intent. As the piece below explains, one can report a friend who posts something that might be construed as suicidal. While the system is not foolproof, it is definitely a good use of modern technology and social media.
Facebook Tries to Find the Right System for Flagging Suicidal Behavior
By Alexis Madrigal
Dec 16 2011, 1:14 PM ET

You’re on Facebook one day when you notice that an acquaintance — not someone really close, but a person with whom you’re friendly — posts a status update that seems despondent. Something like, “Man, life doesn’t seem worth it. I can’t take it.” You look for an explanation on the person’s profile, wonder if it’s some kind of inside joke. But it dawns on you that it might be an honest expression of emotional pain, perhaps a cry for help.
What do you do?
It’s a difficult social problem. It’s not like you’re a close friend of the person and would feel comfortable asking him to pour his heart out to you. Maybe you’ve only met him once. You very well might do nothing.
Facebook is trying to offer a new avenue for handling this dilemma of the digital world. It has added the ability to anonymously flag a friend who might be suicidal. This is a very delicate piece of user-interaction design, obviously. On the one hand, Facebook wants people to be able to report real suicidal behavior, but it also doesn’t want to present an obvious target for people bent on mischief. Where it places the reporting mechanism, as well as the behind-the-scenes processes for dealing with user reports, could have very real consequences.
Let me spell out the compromise Facebook has come to. I think it is debatable, but there is probably no perfect answer here. It’s just weird to find ourselves seeing expressions of suicidal ideation from people we don’t know well.
The suicidal-behavior reporting button is located within the normal mechanism for reporting questionable content. But it’s *not* on the first menu of options. Instead, you have to mark something as “Violence or harmful behavior” before you see the option to report “Suicidal Content.” This seems suboptimal to me, as I wouldn’t think to put suicidal behavior in that category. A Facebook spokesperson told me, “We have been, and will continue to, work with the suicide prevention community and iterate on the placement of the Suicidal Content button.”
Nonetheless, after someone makes this kind of report, they receive a follow-up email from an actual human being, to whom they can respond. It reads like this:
We will do our best to assist you with this matter. Please describe the problem you are experiencing with Facebook in as much detail as possible and include any relevant web addresses (URLs). More detailed information will help us investigate the issue further.
Thanks for contacting Facebook,
[Name]
According to Facebook, it has internal “systems to prioritize the most serious reports, and a trained team of reviewers who respond to reports and escalate them.” That’s good, because on this issue it doesn’t seem like an algorithm could make the subtle discriminations necessary to offer people the kind of help they need while filtering out pranks.
If that internal review finds that the reported person is exhibiting suicidal behavior, he or she will be offered a private chat session with someone from the National Suicide Prevention Lifeline, as well as the organization’s phone number.