Instagram introduces ‘sensitivity screens’ to blur self-harm images

By Xite - February 7, 2019
In January 2019, British Health Secretary Matt Hancock asked Instagram to implement policies and features to protect young people from exposure to harmful content, or face legal action.

Adam Mosseri, head of the photo-sharing app Instagram, has announced the introduction of ‘sensitivity screens’ to protect younger users from exposure to questionable content related to self-harm and suicide.

Sensitivity screens will blur out images of self-harm and cutting until the user opts in to view them, and images that indicate self-harm will no longer appear in search, recommendations or hashtag pages.

The changes were inspired by the 2017 suicide of Molly Russell, a British 14-year-old who followed multiple self-harm and suicide accounts. Russell’s parents have blamed Instagram for her death.

‘We are not yet where we need to be on issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe. We already offer help and resources to people who search for such hashtags, but we are working on more ways to help,’ Mosseri wrote in an op-ed for Telegraph.co.uk.

Instagram has also begun removing inauthentic likes, follows and comments from accounts that use third-party apps to boost their popularity. Many accounts rely on such apps to artificially grow their audience, and the incentive is easy to understand: an Instagram influencer with a large following or a high engagement rate can charge advertisers more to promote their products.

But according to Instagram, this behaviour is bad for the community, and third-party apps that generate inauthentic likes, follows and comments violate its Community Guidelines and Terms of Use.
