You knew it was going to happen sooner or later: someone committing suicide on Facebook Live.
Holding a shotgun in front of his computer camera, 22-year-old Erdogan Ceren of Turkey told anyone watching, “No one believed when I said I will kill myself, so watch this.” Ceren was reportedly distraught over a break-up with his girlfriend.
The gun failed to fire on his first attempt, so he tried again. Relatives found him and took him to a hospital, where he died 12 hours later.
Facebook took down the video, but you can find it in various sources online if you really want to see it.
Back in 2015, Facebook saw the potential for things like this to happen and took some proactive steps. In partnership with the University of Washington’s School of Social Work, Facebook developed a way to flag posts suggesting suicide or depression.
It works like this. If you see a post from someone and think they might be in trouble, you can report it to Facebook.
That report triggers a series of responses. The person who flags the post will see a screen with links that allow them to message the potentially suicidal person, contact another Facebook friend for support, or connect with a trained professional at a suicide helpline for guidance.
Facebook will then review the reported post. If the poster is thought to be in distress, a series of screens will be launched automatically when that person next logs onto Facebook, with suggestions for getting help. – University of Washington
You can also report a post directly through Facebook’s reporting screen.
“We have teams working around the world, 24/7, who review any report that comes in. They prioritize the most serious reports, like self-injury, and send help and resources to those in distress.” – Rob Boyle, Facebook Product Manager