Facebook is an enormous platform, so much so that it can be nearly impossible to manage. When you think about what gets uploaded there, you would mostly picture the ordinary, everyday content that people view on a regular basis.
However, users also try to upload a great deal of unseemly and downright disturbing content, and while an algorithm takes much of it down automatically, the final say rests with a team of human content moderators, all of whom have to review a wide variety of deeply disturbing material.
The content they review often includes child pornography, violent imagery, hate speech and all kinds of other material that nobody would want to be forced to look at. It is understandable, then, that many of them develop mental illnesses such as PTSD and depression, as well as substance addictions as a way of coping with the job.
Facebook hired these content moderators through an outsourcing firm called Cognizant, which has since wound down its content moderation business following The Verge's investigation into its questionable practices. Because its moderators were outsourced, Facebook did not seem too keen on providing adequate compensation, despite the fact that the job can take a toll on any sane human being.
The good news here is that one former content moderator has successfully sued Facebook, forcing the social media platform to pay $52 million in damages. Facebook has assured that every current and former content moderator will receive between $1,000 and $5,000 in compensation, and although some might say this isn't enough to justify the work they had to do, it is at the very least a step in the right direction.
Photo: Toby Melville / Reuters
Read next: Scammers Are Openly Selling People’s Personal Data on Facebook and Twitter