Over the past year or so, Facebook has come under heavy fire for facilitating terrorist activity around the world. The criticism followed a widespread scandal over fake news spreading on the platform, which may have swung the US election. In light of these issues, Facebook has started to change the way it handles content posted to its platform. The social media giant stated that it had deleted three million posts made by terrorist groups over the past quarter.
While this number is significant, it is less than a third of the number pulled the previous quarter. However, Facebook says these posts are also staying up for shorter periods: previously they remained online for an average of forty-three hours before being deleted, whereas now they are removed after around eighteen hours on average. One persistent problem is that whenever a social media platform takes steps to counter terrorist propaganda, the bad actors involved adopt new methods to work around its detection systems.
Monika Bickert, Facebook’s global head of policy management, and Brian Fishman, its head of counterterrorism policy, wrote in a recent blog post that Facebook will work tirelessly to remove terrorist content from its platform, but that lasting change will require tackling these groups’ real-world presence. Facebook also stated that it had successfully removed 99 percent of terrorism-related content posted to its platform, which suggests the company is taking the job seriously.
"Our work to combat terrorism is not done. Terrorists come in many ideological stripes — and the most dangerous among them are deeply resilient. At Facebook, we recognize our responsibility to counter this threat and remain committed to it. But we should not view this as a problem that can be “solved” and set aside, even in the most optimistic scenarios.", said Facebook team in a “Hard Questions answered series” that addresses the impact of their products on society. Adding further, "We can reduce the presence of terrorism on mainstream social platforms, but eliminating it completely requires addressing the people and organizations that generate this material in the real-world."