Facebook has long ruled the hearts of internet users. Since its launch, the platform has stood out from its competitors and built an enormous user base; it remains the social platform with the most users, and the most used one at that. More than a decade and a half after it first launched, it is still growing. Yet while Facebook may look like a platform that has seen nothing but success since its release, that is not exactly the case. Since its founding, Facebook has run into hundreds of problems, most of them related to its policies around user privacy and the posts it hosts. With users growing more aware, being caught up in controversies over data privacy, misinformation, and similar issues has become routine for the social giant and other platforms alike.
A recent example came when Facebook drew heavy criticism from people claiming that its algorithm helps socially and politically divisive news and posts gain traction. The claims arose after users noticed that content made to misinform and divide people was drawing more likes, shares, and engagement.
The point was underscored by New York Times journalist Kevin Roose, who in November of last year began sharing the Facebook posts with the most engagement each day. Roose compiled his list from Facebook's own data, which lent it added credibility.
Most of the top-engaged link posts on Roose's list came from pages that frequently uploaded divisive and politically charged content. Those findings, alongside the broader claims, were a hard blow to Facebook and its algorithm.
Facebook has since been working to prove the claims wrong, and it attempted to do so in its new quarterly report. Alongside that report, the company released a publication called the 'Widely Viewed Content Report', which aimed to break down the platform's algorithm and show that it does not give divisive content or posts extra traction.
In the report, Facebook shared data on where the content viewed by its users actually comes from. The numbers show that the majority of what people see in their News Feed comes from connected sources: friends and family (57 percent), liked pages (14.6 percent), and groups (19.3 percent), while only 9.4 percent comes from unconnected sources.
The company then acknowledged that while some people claim political and divisive content dominates the News Feed, the data suggests otherwise: with roughly 90 percent of News Feed content coming from connected sources (friends, liked pages, and groups combined), Facebook argues the algorithm is not at fault here.
Referring to Roose's list, Facebook then explained the difference between engagement and views, saying the distinction between the two is crucial. In its telling, the list focused on only one piece of the puzzle, engagement, whereas judging the situation correctly requires looking at the whole picture. Political and divisive posts are more likely to be liked and shared, which explains their outsized engagement even when their actual reach is comparatively small.
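As a rough, hypothetical sketch of that distinction (the post names and numbers below are invented for illustration, not taken from Facebook's report), ranking the same set of posts by engagement versus by views can surface entirely different winners:

```python
# Hypothetical example: the same posts rank very differently
# by engagement than by views, the distinction Facebook draws.
posts = [
    {"name": "divisive political post", "views": 2_000,  "likes_shares": 900},
    {"name": "family photo",            "views": 50_000, "likes_shares": 400},
    {"name": "recipe video",            "views": 30_000, "likes_shares": 350},
]

by_engagement = sorted(posts, key=lambda p: p["likes_shares"], reverse=True)
by_views = sorted(posts, key=lambda p: p["views"], reverse=True)

print("Top by engagement:", by_engagement[0]["name"])  # divisive political post
print("Top by views:", by_views[0]["name"])            # family photo
```

In this made-up scenario, the divisive post tops an engagement-based list like Roose's while reaching only a small fraction of the audience, which is essentially the argument Facebook's report makes.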
Facebook's report may well be an eye opener for people who previously believed the platform promotes political and divisive content.
Read next: Facebook's Q2 Community Standards Enforcement report shows some improvement in all of its actions including the misinformation related to the pandemic