About a year ago, a New York-based journalist began posting Facebook's most-viewed content lists every day, prompting Facebook to publish its own report on the platform's most popular content. Divisive personalities and Pages frequently topped those lists, giving the impression that Facebook's algorithms explicitly promote this type of content.
Because Facebook was dissatisfied with this portrayal, it first disbanded the CrowdTangle team after a disagreement over which content the tool should surface. The company then released its own, more flattering report, built on metrics it considers more relevant, and promised to publish an update every three months as a show of transparency. That sounds great, and more transparency would genuinely help people understand what is happening on the platform. As it stands, however, the report neither clarifies nor disproves much of anything.
Facebook's data suggests that news content plays little role in what gets surfaced in the app. Instead, posts from friends and family dominate, and these posts tend to be innocuous, mostly re-shares or reviews. That is the central takeaway of the latest Widely Viewed Content Report: according to Meta, the most widely viewed content revolved around recipes, spam, and other trivial themes rather than news.
The latest report follows the same pattern, with one notable exception. Among the very statistics Meta uses to argue that Facebook is far removed from harmful influence, the top Page on the list has since been banned by Meta itself for violating platform rules. That is not a good look, especially because the rest of the report's entries show that junk content, nonsense, and suspicious URLs also gained huge reach over the same period.
Indeed, this revelation underscores concerns about Facebook's polarizing effects, since a Page that Meta itself identified as posting problematic content gained significant reach in the app before being removed. It is also worth noting that the report covers only three months of data, which makes it unlikely that news stories would appear on the list, since news typically spikes only around major events.
Overall, Meta's reasoning has plenty of weaknesses, leaving ample room for criticism. And it is hard to argue that Facebook's approach does not reward divisive, controversial posts, because the platform's systems are built to maximize engagement and keep users in the app.
Read next: Meta has launched a self-supervised model for accurate object identification, to up its augmented reality game