On World Suicide Prevention Day, Facebook announced a trio of new initiatives designed to prevent suicide and self-harm among its users.
Firstly, the company is hiring a health and well-being expert tasked with, among other things, helping Facebook address the issues of suicide and self-injury on the platform.
It’s also giving two academic researchers access to CrowdTangle, a social media monitoring tool, so they can analyze how Facebook users talk about suicide in the hopes the information could improve prevention and support efforts.
Finally, Facebook is adding Orygen’s #chatsafe guidelines, which are designed to help young people safely talk about suicide online, to its Safety Center.
Facebook also took the opportunity on Tuesday to detail its previous efforts to prevent suicide and self-harm.
Those include a ban on “graphic cutting images” and content promoting eating disorders, both of which the company has made harder to find on Instagram, the photo-sharing platform it also owns. Earlier this year, it also began placing a “sensitivity screen” over Instagram images of healed self-harm cuts in an attempt to prevent the images from triggering unsuspecting users.
Ultimately, these past changes and new initiatives are a welcome example of a powerful tech company taking steps to keep its users safe. With suicide claiming a life every 40 seconds, such efforts can’t come soon enough.
READ MORE: Facebook says it’s doing more to prevent suicide and self-harm [Engadget]
More on suicide: A Japanese City Is Using AI to Prevent Youth Suicides