
Facebook Not To Allow Graphics Promoting Self-harm


Social media giant Facebook is tightening its policies and expanding its resources to tackle and prevent suicide and self-harm indulged in by people in the global online community using their apps. The update to the policies came as part of the commemoration of World Suicide Prevention Day.

Facebook has been working with the advice of global experts since 2006 to set policies for what is and is not allowed on the platform to support those at risk of suicide or self-injury. The company regularly shares updates with the online community on how it enforces its policies against self-harm and suicide.

As such, Facebook lets people share admissions of self-harm and suicidal thoughts so their friends and family have an opportunity to reach out, offer support and provide help or resources. It also works with emergency responders when there is risk of imminent harm. However, it does not allow people to share content that celebrates or promotes self-harm or suicide.

Based on regular interactions with experts, Facebook has now made several changes to improve how harmful content is handled.

It has tightened its policy around self-harm by no longer allowing graphic cutting images, to avoid unintentionally promoting or triggering self-harm. On Instagram, the company has made it more difficult to search for content promoting self-harm and has kept such content from being recommended in Explore.

Facebook is also tightening its policy to prohibit additional content that may promote eating disorders on its apps. Further, the company has chosen to display a sensitivity screen over healed self-harm cuts to help avoid unintentionally promoting self-harm.

In its bid to further improve policies, Facebook said it will hire a health and well-being expert to join its safety policy team to focus exclusively on the health and well-being impacts of its apps and policies. The expert will explore new ways to improve support for the online community, including on topics related to suicide and self-injury.

For the first time, Facebook said it will also explore ways to share public data from its platform on how people talk about suicide. It will provide two select academic researchers access to the social media monitoring tool CrowdTangle to explore how information shared on Facebook and Instagram can be used to further advancements in suicide prevention and support.

Facebook noted that it took action on more than 1.5 million pieces of suicide and self-injury content on Facebook from April to June of 2019, finding more than 95 percent of it before it was reported by a user. It also took action on more than 800,000 pieces of such content on Instagram during the same period, finding more than 77 percent of it before it was reported by a user.

For comments and feedback contact: editorial@rttnews.com
