One of Facebook's aims is to handle content moderation on its platform with the help of artificial intelligence. The company recently announced an update in this direction: machine learning is now in charge of its moderation queue.
The basic steps by which this updated system works are given below:
Posts uploaded to Facebook that are considered offensive or believed to violate the company's rules and regulations are flagged.
These posts are flagged either by users or by machine learning filters.
Posts that clearly violate the company's policies are handled automatically, while those with less damaging content are added to a queue for later review by a human moderator (a rough sketch of this triage flow follows the list).
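The sketch below illustrates this triage flow in Python. It is only a simplified illustration under assumed names and thresholds (FlaggedPost, violation_score, auto_action_threshold are all hypothetical), not a description of Facebook's actual system.

```python
# Illustrative sketch only: names and the threshold are assumptions, not Facebook's real pipeline.
from dataclasses import dataclass, field
from typing import List


@dataclass
class FlaggedPost:
    post_id: str
    violation_score: float  # hypothetical ML confidence that the post breaks the rules


@dataclass
class ModerationPipeline:
    auto_action_threshold: float = 0.95  # assumed cutoff for automatic handling
    review_queue: List[FlaggedPost] = field(default_factory=list)

    def triage(self, post: FlaggedPost) -> str:
        # Clear-cut violations are handled automatically;
        # everything else waits for a human moderator.
        if post.violation_score >= self.auto_action_threshold:
            return "removed automatically"
        self.review_queue.append(post)
        return "queued for human review"


pipeline = ModerationPipeline()
print(pipeline.triage(FlaggedPost("post-1", 0.99)))  # removed automatically
print(pipeline.triage(FlaggedPost("post-2", 0.60)))  # queued for human review
```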
Facebook employs over 15,000 moderators around the world, and the company has long been criticized for not giving them the support they need to do their jobs properly, and for employing them under conditions that can be traumatizing. These moderators are assigned to review flagged posts and decide, based on their content, whether they violate the company's policies.
Previously, Facebook's moderators reviewed flagged posts largely in chronological order, dealing with them in the order they arrived. Now the company has changed that approach and, with the help of machine learning, gives priority to the posts that are most important and demand immediate attention, so they are reviewed first.
To make the program more efficient, in the near future a combination of machine learning algorithms will be used to sort the list of flagged posts, prioritizing them according to three major factors: their virality, their severity, and the likelihood that they are breaking the company's rules.
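The following Python sketch shows what prioritizing a queue on those three factors could look like. The scoring formula, weights, and field names (virality, severity, violation_prob) are assumptions made for illustration and are not Facebook's published ranking model.

```python
# Illustrative sketch only: the weighting below is an assumption, not Facebook's actual formula.
from dataclasses import dataclass
from typing import List


@dataclass
class QueuedPost:
    post_id: str
    virality: float        # assumed measure of how fast the post is spreading, normalized 0..1
    severity: float        # assumed measure of how harmful the suspected violation is, 0..1
    violation_prob: float  # assumed model confidence that the post breaks the rules, 0..1

    def priority(self) -> float:
        # Hypothetical weighting: severe, fast-spreading, likely violations rise to the top.
        return 0.5 * self.severity + 0.3 * self.virality + 0.2 * self.violation_prob


def prioritize(queue: List[QueuedPost]) -> List[QueuedPost]:
    """Return the review queue ordered by descending priority instead of arrival time."""
    return sorted(queue, key=lambda p: p.priority(), reverse=True)


queue = [
    QueuedPost("post-a", virality=0.2, severity=0.1, violation_prob=0.4),
    QueuedPost("post-b", virality=0.9, severity=0.8, violation_prob=0.7),
]
for post in prioritize(queue):
    print(post.post_id, round(post.priority(), 2))
```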