
As Part of New Safety Measures, Instagram Will Now Remove All Graphic Self-Harm Images


Instagram chief Adam Mosseri has announced four new measures, which expand on the platform’s existing rules around self-harm content.

  1. We will not allow any graphic images of self-harm, such as cutting, on Instagram – even if it would previously have been allowed as admission. We have never allowed posts that promote or encourage suicide or self-harm, & will continue to remove such content when reported.
  2. We will not show non-graphic, self-harm related content – such as healed scars – in search, hashtags & the explore tab, & we won’t be recommending it. We are not removing this type of content from Instagram entirely, as we don’t want to stigmatize or isolate people who may be in distress & posting self-harm related content as a cry for help.
  3. We want to support people in their time of need, so we are also focused on getting more resources to people posting & searching for self-harm related content, & directing them to organizations that can help.
  4. We are going to continue to consult with experts to find out what more we can do; this may include blurring any non-graphic, self-harm related content with a sensitivity screen so that images are not immediately visible.

The rules may have some expanded impact on artistic works, but that impact, weighed against the potential harm such content poses to those at risk, is minor, & it makes sense for Instagram to take a stronger stance on this front.

The decision to restrict the exposure of healed scars may also have some broader impacts, particularly for people who incidentally show such scars within their images, but again, that impact is likely outweighed by the potential trauma the content could cause, & limiting its reach in this respect (note: such images will not be banned) makes sense.

It’s also difficult to know what the actual impact of these changes will be until they’re enforced. Instagram’s machine learning detection systems are likely not advanced enough to pick out all scars in images – & would likely produce a lot of false positives either way – so you’d expect some level of reliance on user reports, & how Instagram rules on those reports is impossible to know until the process is seen in practice.

For its part, Instagram acknowledges the complexity of this issue, & has pledged to continue working towards the best solution:

“Up until now, we have focused most of our approach on trying to help the individual who is sharing their experiences around self-harm. We have allowed content that shows contemplation or admission of self-harm because experts have told us it can help people get the support they need. But we still need to do more to consider the effect of these images on other people who might see them.”

Given the potential for social networks to influence mental state, particularly as it relates to depression, it’s important that all platforms continue to investigate what they can do, & seek out better solutions to protect vulnerable users.

According to some reports, Instagram is the worst social network in this respect – a 2017 study conducted by The Royal Society for Public Health in the UK showed that Instagram usage had the biggest potential impact in relation to higher levels of anxiety, depression, bullying & “fear of missing out.”

It may take some time to get right, & there could potentially be unintended impacts, but Instagram’s latest moves on this front are worthy of encouragement.
