You can now report extremist content on Facebook, or be notified when you may have been exposed to it.
Facebook has been a breeding ground for all kinds of problems, including misinformation and extremism, for a while now, as the HECC Democrats recently confirmed in a letter. They argued that the platform pays little attention to the abuses taking place on it.
In Facebook's defense, the platform is humongous, with so much activity that keeping track of everything is hard. Its latest initiative, however, aims to improve matters by offering resources against extremism. The platform also took measures against users spreading misinformation a while back, and it is currently being investigated by Attorney General Karl Racine over the spread of Covid-19 misinformation.
Facebook has now rolled out yet another feature, currently being tested in the US, that is designed to combat violent extremism on the site.
The feature works in two ways. The first is reporting a user yourself: if you come across someone spreading extremist content, you can access a notification that asks whether you are concerned that someone you know might be becoming an extremist. The notification notes that others in this situation have received confidential support through the platform, and a 'Get Support' button underneath directs you to a support page where further steps can be taken.
Facebook randomly sent me this notice about extremism when I clicked over to the app. Pretty weird... The Get Support button just goes to a short article asking people not to be hateful.
— Kit O'Connell 😷🌈 (@KitOConnell) July 1, 2021
After enabling genocide, this feels like a bandaid on a festering open wound. pic.twitter.com/BsPFFZatx8
NEW - Looks like Facebook's AI is now starting to "think" about your exposure to "harmful extremist content" and your "extremist" friends. pic.twitter.com/AVki59XzLL
— Disclose.tv 🚨 (@disclosetv) July 1, 2021
The second is Facebook itself notifying users when they come in contact with extremist content. According to a user who received it, Facebook's notice warns that violent groups try to manipulate people's anger and disappointment, and encourages users to take action for their own safety.
The test is currently limited to a single region and was developed with educational experts and nonprofit organizations, but Facebook hopes to roll the feature out globally to ensure the trust and safety of every user.
Since many people have already been exposed to such extremist content, and more will be in the future, it was about time Facebook took some kind of measure. The platform says the test is a way to ensure users have accessible resources to support both themselves and the people they know.
Online trolling and the spread of hate are real, as are extremist groups that manipulate users into actions they later regret. Among last year's suicide victims, 59% had experienced aggression through social media or cyberbullying, and in 2013, nine teens took their own lives over anonymous hate comments received through platforms including Facebook.
To prevent similar tragedies from occurring, we eagerly await Facebook wiping out all kinds of extremism, and, hopefully soon enough, misinformation as well.
H/T: @KitOConnell
Read next: Lawmaker Points Finger At Facebook For Vaccine Misinformation