Facebook and Instagram Made Changes to Their Policies on Self-Harm, Suicide

The moves were spurred in part by the death of British teen Molly Russell


Facebook and Instagram separately detailed changes they are implementing to their policies on content related to self-harm and suicide.

The changes came in part as a reaction to the death of Molly Russell, the British teen who took her own life in 2017 at the age of 14. The tragedy has returned to the headlines due to her father’s fight for access to her data from Instagram and other social platforms.

Facebook global head of safety Antigone Davis said in a Newsroom post that the company consulted with more than a dozen experts in suicide prevention and safety in crafting four changes to how it enforces its policies.

The social network will continue to allow people to share admissions of self-harm and suicidal thoughts, but content that promotes such actions will be removed.

Davis said Facebook’s stable of experts felt it was important to allow people to discuss the challenges they are facing, as doing so often helps them connect with support and resources. However, graphic images of self-harm, particularly cutting, can unintentionally promote such actions, so those images will no longer be permitted, with enforcement beginning in the coming weeks.

Discussions with its experts on whether content such as healed cutting scars in stories of recovery could unintentionally promote self-harm were inconclusive, and Davis said Facebook will continue to monitor the latest findings on that topic.

Finally, Davis said Facebook’s experts stressed the importance of providing people with products that connect them with help and resources, while avoiding shaming people going through these sorts of difficulties.

She added, “We will continue to provide resources, including messages with links to helplines, and over the coming weeks we will explore additional steps or products we can provide within Instagram and Facebook.”

Adam Mosseri, head of Instagram, discussed similar plans and changes in a blog post of his own.

Mosseri stressed that Instagram has never allowed posts that promote or encourage suicide or self-harm, and it will no longer allow any graphic images of self-harm, even in cases where those images would previously have been permitted as an admission of self-harm.

Non-graphic content related to self-harm, such as healed scars, will not be shown in search, hashtags or Instagram’s Explore tab, and it will not be recommended.

Mosseri wrote, “We are not removing this type of content from Instagram entirely, as we don’t want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help.”

Instagram will also focus on directing people who post such content to resources and organizations that can help.

Finally, Mosseri said, “We’re continuing to consult with experts to find out what more we can do. This may include blurring any non-graphic self-harm related content with a sensitivity screen so that images are not immediately visible.”

Davis shared the following list of experts who have been working with Facebook on this matter:

  • Brazil: Instituto Vita Alere, Karen Scavacini; Safernet, Thiago Tavares and Juliana Cunha
  • Bulgaria: Safer Internet, Georgi Apostolov
  • Canada: Kids Help Phone, Alisa Simon
  • India: ICALL, Tata Institute of Social Sciences, Aparna Joshi
  • Mexico: University of Guadalajara, Neurosciences Department, Luis Miguel Sanchez-Loyo
  • Philippines: Philippine General Hospital, Child Protection Unit, Norie Balderrama
  • Thailand: Samaritans Thailand, Trakarn Chensy
  • U.K.: Bristol University, Bristol Medical School, Lucy Biddle; Centre for Mental Health, Sarah Hughes; The Mix U.K., Chris Martin; Samaritans, Jacqui Morrissey; Papyrus, Ged Flynn and Lisa Roxby
  • U.S.: Save.org, Dan Reidenberg; The National Suicide Prevention Lifeline, John Draper

She wrote, “Bringing together more than a dozen experts from around the world, many of whom helped us develop our policies in the first place, we asked them how we could better weigh two important goals that are sometimes at odds: the opportunity to get help and share paths to recovery for people who might be in harm’s way, and the possibility that we may unintentionally promote self-harm or remove content that might shame or trigger the poster to self-harm.”

And Mosseri wrote, “Our aim is to have no graphic self-harm or graphic suicide-related content on Instagram and to significantly reduce—with the goal of removing—all self-harm and suicide imagery from hashtags, search, the explore tab or as recommended content, while still ensuring that we support those using Instagram to connect with communities of support.”

He continued, “We need to create a safe and supportive community for everyone, but this is not as simple as just switching off a button. We will not be able to remove these images immediately, and we must make sure that people posting self-harm related content do not lose their ability to express themselves and connect with help in their time of need. We will get better and we are committed to finding and removing this content at scale.”
