Meta's Oversight Board is a genuinely useful check on Facebook, and it's enlightening to see how far Meta actually takes the board's decisions into consideration.
Recently, the Oversight Board warned Facebook to be more careful with the automated moderation tools it applies to content on the app. The warning came after the platform removed a cartoon depicting police violence in Colombia.
The matter came up while the board was reviewing other cases tied to controversial content, including one concerning assault in India. For those who may not know, the board is funded by the tech giant but operates as a semi-independent body.
The cartoon under review depicts Colombian police beating protesters with batons, putting a very serious political situation on display.
It was flagged by users of the platform who objected to the content, and it was eventually removed.
Those who posted it appealed the decision, the case went to Meta's Oversight Board, and they won. The board noted that of the more than 200 appeals filed over the cartoon's removal, 98% were successful. Fortunately, the content had not been purged from the platform's systems while the board's verdict was pending, so now that the ruling is out, the cartoon can be seen on the app again.
The case was a wake-up call for the board. It argues that Meta's reliance on automated tools to remove content cannot be fully accurate, and that leaning on them so heavily amplifies the chances of wrong decisions and of users getting upset.
The board also concluded that a more responsive system would serve better, one that triggers a review whenever a questionable image is matched. Without that, images can be banned across the platform for long stretches, even if the original decision only comes up for review much later.
This is one of several Oversight Board cases questioning whether the platform's systems are designed to remove images too aggressively, even when removal isn't warranted.
The board is also concerned that Meta fails to properly measure the accuracy of its Media Matching Service banks. Without that information, the company cannot judge whether the technology works any better for some community standards than for others.
The board is now asking Meta to disclose the error rates for content that is mistakenly included in these matching banks, which store controversial content after it has been flagged on the app so that matching uploads can be removed automatically.
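As a rough, purely illustrative sketch (not Meta's actual implementation), a media-matching bank can be thought of as a store of content fingerprints: once an item is banked, any later upload with the same fingerprint is removed automatically, which is why a single wrong entry can cascade into many wrong removals. The class and field names below are hypothetical.

```python
# Toy "media matching bank" using exact hashes, for illustration only.
# Real systems like Meta's Media Matching Service rely on perceptual hashing
# and human review; nothing here reflects their actual code.
import hashlib


class MediaMatchingBank:
    def __init__(self):
        self._banked_hashes = set()   # fingerprints of banked (banned) content
        self.mistaken_entries = 0     # entries later found to be added in error
        self.total_entries = 0

    def _fingerprint(self, content: bytes) -> str:
        return hashlib.sha256(content).hexdigest()

    def add(self, content: bytes) -> None:
        """Bank a piece of flagged content."""
        self._banked_hashes.add(self._fingerprint(content))
        self.total_entries += 1

    def matches(self, upload: bytes) -> bool:
        """Any upload matching a banked fingerprint would be removed automatically."""
        return self._fingerprint(upload) in self._banked_hashes

    def error_rate(self) -> float:
        """The kind of figure the board wants disclosed: share of entries banked by mistake."""
        return self.mistaken_entries / self.total_entries if self.total_entries else 0.0


# Example: one wrongly banked cartoon means every identical re-upload is removed.
bank = MediaMatchingBank()
cartoon = b"<image bytes of the cartoon>"
bank.add(cartoon)                 # added to the bank by mistake
print(bank.matches(cartoon))      # True -> automatic removal on every match
bank.mistaken_entries += 1        # later overturned on appeal
print(f"error rate: {bank.error_rate():.0%}")
```

The board's point is that without tracking something like this error rate, wrongly banked content keeps getting removed automatically until users appeal.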
Meta has been given ample time to respond to the board's recommendations, but the final decision rests with the company at the end of the day.
Separately, more information was released about the Oversight Board addressing several incidents where Facebook drew the line on content about extremist groups. That decision came after Meta reportedly erred in removing an Urdu-language post about the Taliban opening schools in the country for women and children.
Despite appeals, that post never got the chance to be reviewed. As you can see, the loopholes are plenty, and we hope Meta can figure out a solution before it's too late.