Wikipedia founder Jimmy Wales recently suggested that Twitter should enlist a volunteer force to monitor controversial posts on the platform.
Twitter's problems with curbing hate speech and misinformation are nearly as old as the platform itself. Since the 2010s, the short-form social media site has faced a seemingly endless list of accusations of harboring conspiracy theorists, anti-vaxxers, and any number of controversial voices in between. The company's response has been to try just about every strategy short of mass banning: flagging content, banning specific hate-centric users (Donald Trump, for example), and appending authoritative sources of information to COVID-related posts. However, this approach of throwing everything at the wall to see what sticks has produced mixed results at best.
The problem with social networks such as Twitter is that moderation at scale is genuinely difficult. As Jimmy Wales puts it, any fight against troll groups that relies only on company employees and algorithms is bound to fail, because both are detached from the community as a whole. Staff don't have the time to track down individual troll groups, and often never encounter them in the first place. Algorithms can flag key terms and phrases, but struggle to adapt when those terms are deliberately changed, as the sketch below illustrates. A volunteer force would help circumvent many of these problems.
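To make that limitation concrete, here is a minimal, hypothetical sketch of a naive keyword filter. The blocked terms, example posts, and function name are illustrative assumptions, not anything Twitter actually uses; the point is only that a trivially reworded post slips straight past exact-match rules.

```python
# Hypothetical illustration of a naive keyword-based moderation filter.
# The blocked terms and example posts are placeholders, not real data.

BLOCKED_TERMS = {"scamdemic", "plandemic"}  # illustrative conspiracy buzzwords


def is_flagged(post: str) -> bool:
    """Flag a post if it contains any blocked term verbatim."""
    words = post.lower().split()
    return any(term in words for term in BLOCKED_TERMS)


print(is_flagged("The plandemic is a hoax"))   # True  -- exact match is caught
print(is_flagged("The pl@ndemic is a hoax"))   # False -- a one-character swap evades the filter
print(is_flagged("The plan-demic is a hoax"))  # False -- so does a hyphen
```

Human volunteers embedded in the community, by contrast, tend to recognize these rewordings as soon as they start circulating.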
Wikipedia itself relies on volunteers for its service. Of its roughly 80,000 editors, around 5,000 are highly active community members who volunteer their time. This takes some of the load off administrators, which matters for a website that receives around 2 billion visits per month and can use all the help it can get. Jimmy Wales believes sites such as Twitter could benefit from the same structure.
There are pros and cons to this. The pros are obvious: ground-level volunteers would help both identify and curb the racism and misogyny encountered so frequently online. However, recruiting a volunteer force is a delicate business. Proper background checks must be run, or members must have a long and active history on Twitter to draw from; otherwise, any troll could sneak on board and cause mayhem. The other issue is payment. "Volunteer" implies that the people putting in the work receive no financial compensation of any sort. If this were a small business or non-profit, things would be different. But for a platform as valuable as Twitter, paying the people who work for you becomes a responsibility.
At any rate, plenty of suggestions and opinions have been offered. It's now time for Twitter to make up its own mind and take more active steps toward making its community a safer, healthier one.
H/T: TG.