Following in the footsteps of other social media sites, Instagram has announced that it will remove all graphic self-harm images and videos in the near future, saying it wants to ensure that none of its users are harmed by such dangerous content on the platform.
This all started when British Health Secretary Matt Hancock called on representatives from the major social media companies to discuss protecting the mental health of teenagers and the steps that should be taken immediately to clamp down on harmful content.
As evidence, Hancock highlighted the case of Molly Russell, a 14-year-old British teenager who took her own life in 2017. A subsequent investigation reported that she had followed a large amount of depression- and suicide-related content on Instagram.
Adam Mosseri, the Head of Instagram, has acknowledged that now is the right time for the platform to take responsibility for its vulnerable users, and Instagram is already revising its content policies with input from experts and academics around the world on youth, mental health, and suicide.
Under the new rules, Instagram will no longer allow any graphic images of self-harm, even posts previously permitted as an admission of the poster's own struggles; the company notes it has never allowed posts that encourage suicide. It will also stop surfacing non-graphic self-harm content in search, hashtags, the Explore tab, and recommendations.
"We’re continuing to consult with experts to find out what more we can do," announced Mosseri. "This may include blurring any non-graphic self-harm related content with a sensitivity screen, so that images are not immediately visible."
Along with this, Instagram aims to offer its users a supportive community, and it is working to create counseling opportunities for people who post or search for content related to self-harm.
While this is a promising start toward reducing the internet's negative impact, these companies, working together, could build safeguards that change the whole online community for the better.
Photo Courtesy of Christophe Wu