Instagram Extends Its Restrictions on Content Depicting Suicide and Self-Harm

Drawings, memes and content from films or comics that use graphic imagery are now covered


Instagram extended its restrictions on content depicting suicide and self-harm to fictional depictions of those behaviors, such as drawings, memes and content from films or comics that use graphic imagery.

In February, head of Instagram Adam Mosseri revealed in a blog post that the Facebook-owned photo- and video-sharing network had never allowed posts that promote or encourage suicide or self-harm, and that graphic images would no longer be permitted, even in cases where those images would previously have been allowed as an admission of self-harm.

Mosseri added in February that non-graphic content related to self-harm, such as healed scars, would not be shown in search, hashtags or Instagram’s Explore tab and would not be recommended, and that people who post such content would be directed to resources and organizations that can help.

In a blog post over the weekend, Mosseri detailed the new restrictions mentioned above, saying that imagery that does not depict self-harm or suicide but includes associated materials or methods will also be removed, and accounts sharing these types of content will not be recommended in search or Instagram’s discovery tools, including Explore.

He reiterated that people who post content along these lines will be directed to localized resources, including the National Suicide Prevention Lifeline and The Trevor Project in the U.S., and Samaritans and Papyrus in the U.K.

Mosseri wrote, “Two things are true about online communities, and they are in conflict with one another. First, the tragic reality is that some young people are influenced in a negative way by what they see online and, as a result, they might hurt themselves. This is a real risk. But at the same time, there are many young people who are coming online to get support with the struggles they’re having—like those sharing healed scars or talking about their recovery from an eating disorder. Often these online support networks are the only way to find other people who have shared their experiences.”

He added that in the three months following Instagram’s policy changes in February, it removed, reduced visibility of or added sensitivity screens to over 834,000 pieces of content, finding more than 77% of that content before it was reported.

According to Mosseri, Instagram meets with academics and experts on suicide and self-harm on a monthly basis, and it is working with Swedish mental health organization Mind to understand the role that technology and social media play in the lives of young people, as well as with Samaritans on an industrywide effort to shape new guidelines to help people in distress. He added that Instagram is working with its European regulator to ensure that its policies mesh with European Union law.

He concluded, “These are complex issues that no single company or set of policies and practices alone can solve. I’m often asked: Why do we allow any suicide or self-harm content at all on Instagram? Experts tell us that giving people a chance to share their most difficult moments and their stories of recovery can be a vital means of support, and that preventing people from sharing this type of content could not only stigmatize these types of mental health issues, but might hinder loved ones from identifying and responding to a cry for help. But getting our approach right requires more than a single change to our policies or a one-time update to our technology. Our work here is never done. Our policies and technology have to evolve as new trends emerge and behaviors change.”

David Cohen (david.cohen@adweek.com) is editor of Adweek's Social Pro Daily.