Meta has published the latest edition of its Community Standards Enforcement Report.
The report breaks down user violations of the company's policies across its apps, along with the enforcement actions taken in response. Common categories include spam, hate speech, fake accounts, restricted goods, and more. Plenty of violations were actioned in Q4 2022, so let's look at how the numbers shifted.
Spam, for starters, saw a massive increase in the final quarter of 2022: around 1.8 billion posts were actioned during the period.
Meta attributes much of this to spam attacks in October, whose volume continued to fluctuate and showed up in enforcement metrics introduced later in the quarter. So if you noticed an unusual amount of spam on apps like Facebook toward the end of Q4, you weren't imagining it. It's also worth noting that much of the content users appealed was never actually removed.
As for restricted goods, the company says it actioned far more content thanks to enforcement changes made during this period. Its data shows it removed a large amount of drug-related content at this time, a clear reflection of its improved detection tools.
Spam aside, many of these categories saw large-scale removals that users then appealed in significant numbers. On a more positive note, Meta says it saw considerably less violence and incitement, and content linked to self-harm and suicide also decreased, a drop the company credits to its more proactive monitoring.
This differs from spam in that the numbers were driven by users filing appeals against such content, rather than by Meta's own moderation teams and tools handling it.
So while Meta is detecting more violating content and taking action against it, it also needs to pay closer attention to prevalence, the rate at which users actually encounter this content. That is something to address before it's too late.
Another concerning trend Meta disclosed is a rise in violations involving nudity and explicit content, concentrated toward the end of 2022.
Fake accounts are also on the rise, representing an estimated 4 to 5% of Facebook's monthly active users. With some 2.9 billion people using the app each month, that works out to roughly 148 million fake accounts, so much stricter enforcement is clearly needed.
Meta knows, of course, that fake accounts can never be eliminated entirely, but right now the figure is simply too high.
Read next: Meta Makes Major Changes To Its Penalty System That Users Call ‘Facebook Jail’