Facebook has long worked to limit content on its platform that can demean or otherwise harm its users. Lately, the company has stepped up its fight against content that promotes child abuse, introducing new tools to curb such material on the platform.
Facebook said that using its apps to target and harm children will not be tolerated under any circumstances. Antigone Davis, the company's Global Head of Safety, announced in a blog post that new tools are being tested on the platform, alongside improvements the company has made to its detection and reporting systems.
Here is how the social media giant's new tools aim to stop users from sharing images, videos, or any other child sexual abuse material (CSAM) on the platform. The approach has two parts: first, Facebook will warn users when they try to share content that may contain CSAM; second, it will show a notification to users who search for such content. The pop-up also warns of the consequences users may face for viewing illegal content.
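Facebook has not published implementation details, but conceptually both interventions boil down to checking a user action against signatures of known material. The minimal Python sketch below is purely illustrative: the function names, placeholder data, and exact-hash matching are all assumptions, and production systems rely on perceptual hashing (Facebook has open-sourced PDQ for this purpose) rather than anything this simple.

```python
import hashlib

# Hypothetical sketch only; the names, data, and matching logic here are
# illustrative assumptions, not Facebook's actual implementation.

# Signatures of previously reported material (placeholder values).
KNOWN_CSAM_HASHES = {"placeholder-digest-of-reported-file"}
# Search terms associated with child exploitation (placeholder values).
BANNED_SEARCH_TERMS = {"placeholder banned term"}

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint. Real systems use perceptual hashes
    (e.g. Facebook's open-source PDQ), which survive re-encoding."""
    return hashlib.sha256(data).hexdigest()

def on_share_attempt(data: bytes) -> str:
    """First intervention: warn when an upload matches known material."""
    if fingerprint(data) in KNOWN_CSAM_HASHES:
        return "warn"    # pop-up cautions the user before sharing
    return "allow"

def on_search(query: str) -> str:
    """Second intervention: notify on searches for such content."""
    if any(term in query.lower() for term in BANNED_SEARCH_TERMS):
        return "notify"  # pop-up warns that viewing it is illegal
    return "allow"
```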
The first measure targets users who share such content without malicious intent, while the second aims to stop users from finding it through search and exploiting it for commercial purposes. The company is determined to limit content promoting child sexual abuse and wants to close off every avenue through which a single child might be exposed to it, said Karuna Nain, Facebook's Global Safety Policy lead, while briefing reporters on a Zoom call.
The new tools follow Facebook's in-depth study of the illegal child-exploitative content it reported to the US National Center for Missing and Exploited Children (NCMEC) in October and November of last year. By its own account, Facebook removed 5.4 million pieces of content promoting child sexual abuse in the fourth quarter of 2020, along with roughly 800,000 posts on Instagram. Facebook says 90 percent of that content was identical or similar to material that had already been reported, rather than anything new, because the same content was shared over and over.
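The 90 percent figure reflects how signature matching lets the platform recognize re-shares of material it has already seen: every copy of a previously reported file maps back to the same signature. A minimal sketch of that bookkeeping, under the simplifying assumption of exact-hash matching:

```python
def share_of_repeats(report_hashes: list[str]) -> float:
    """Fraction of reports whose signature was already seen earlier,
    i.e. re-shares of known content rather than new material."""
    seen: set[str] = set()
    repeats = 0
    for h in report_hashes:
        if h in seen:
            repeats += 1
        seen.add(h)
    return repeats / len(report_hashes) if report_hashes else 0.0

# Toy example: one item shared ten times means nine of ten reports
# are repeats, a 90 percent rate like the one Facebook cites.
print(share_of_repeats(["item-a"] * 10))  # 0.9
```

In practice the matching is fuzzier than this sketch, since the same image may be cropped or re-encoded between shares, which is why platforms favor perceptual hashes over cryptographic ones.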