According to a recent report from Gizmodo, social media giant Meta is under the spotlight as the largest platform responsible for hosting COVID-19 misinformation.
Even before the pandemic began, the company knew the role its platform would play in spreading virus theories. A few months before COVID-19 first appeared in the US, Monika Bickert, Facebook's head of global policy management, shared the company's strategy for curtailing deceptive claims about vaccination programs. Yet despite appearing serious about the policy, Facebook did not take the misinformation down; it merely demoted it so it would not appear at the top of the feed. Had Facebook taken stronger action at the time, it might have spared itself what was to unfold ten months later.
According to Gizmodo's report, Facebook had ample warning to act against the spread of hoax information. Just two months before COVID-19, infectious disease officials warned that measles was making a comeback in New York. Even though effective vaccines were available to combat the virus, campaigns fueled by the anti-vax community spread across the platform.
Instead of using its power against such campaigns, Facebook leaned on its free-speech narrative: rather than taking the campaigns down, it only kept them out of the top of the news feed and out of recommendations.
Facebook offered health authorities free advertising on its platform, but the offer was rejected. The authorities had already seen how comments under posts highlighting the importance of vaccines were flooded with misinformation and lies, and because users seemed more interested in public opinion about vaccines than in official guidance, the WHO concluded that advertising on the platform would not be beneficial.
The documents exposed by Frances Haugen, a former Facebook product manager, made clear that Meta employees were aware of the extent to which false information was being shared on the platform in the name of free speech. These employees understood the potential such fabricated claims had to cause a public-health crisis. Misinformation campaigns saw increased activity and follower growth even as the US death toll reached 100,000.
By then it was clear that the strategy Monika Bickert had shared was failing: misinformation was gaining more traction than the truth, and thanks to that extended exposure, millions of US citizens were drawn into these hoax campaigns. A cross-functional team advised the company that rather than allowing such content to surface, it was better kept out of sight without being taken down, and that any post slated for permanent removal should first be manually reviewed.
Internal observations showed that posts promoting the importance of vaccines tended to attract more anti-vax replies, and that roughly 20% of the pro-vaccine comments were flagged. Most of those flagged comments remained available on the platform.
It was also observed that even when Facebook demoted such posts, a post a user had shared a year earlier would still resurface in their Memories, giving the misinformation new life. Given how many people turned against vaccines, contributing to the deaths of vast numbers of US citizens, it is clear that Meta squandered its opportunity to restrict such activity.