In an interview with Axios, a Facebook executive stated that individual humans are to blame for the misinformation that is so often encountered on social media platforms.
Gee, a Facebook official blaming others for a profound lack of both competence and the most basic platform moderation? Why am I not the least bit surprised? Maybe it’s because Facebook has a long history of shifting the blame for its own failures onto third parties. The most recent major example is the social network insisting that the US Capitol riot was planned on other platforms, even after multiple sources cited Facebook itself as a major contributing factor. Apparently Facebook forgot to hold those other platforms to the same “it’s the individuals, not the developers” standard it keeps applying to itself. At any rate, after so many Congressional hearings and a slew of documents leaked by whistleblower and ex-employee Frances Haugen, one thing is clear: no matter how difficult online moderation may be, Facebook was never really trying in the first place.
Axios and HBO, both large media companies in their respective fields, have collaborated on a documentary series entitled “Axios on HBO”. Each episode sits down with influential figures of our time, across fields such as politics, business, and pop culture, to discuss the major issues of the day. The show’s latest guest is the aforementioned Facebook executive, Andrew Bosworth, a name that may be familiar. Bosworth’s relevance to both this article and Axios is heightened by his appointment as the social network’s next CTO, effective at the start of next year. Until then, he’s dealing with interview after interview in the hope of not screwing up. Then again, this is Facebook we’re talking about: screw-ups happen, and they come with the turf.
Bosworth’s comments on online moderation are pretty standard for the issue as a whole. Statements about the impossibility of escaping online negativity, and about how freedom of speech cannot be stifled, recur throughout the interview. This is all, of course, beside the point. While online moderation certainly warrants discussion, and plenty of voices have weighed in on it, a lack of moderation isn’t what Facebook is being directly blamed for. Online moderation is difficult, and no social media platform has managed it well. What the social network refuses to address, however, is that it is unwilling to even genuinely try.
Frances Haugen’s testimony (and extensive documents) remains completely unaddressed by Facebook and its parent company Meta, because answering it would mean admitting that this conglomerate has, time and time again, put profit ahead of user safety. Is it impossible to eliminate hateful discourse online? Yes. Is it very possible to actively curb it, or to build an algorithm that hides such content instead of actively recommending it? I would say so. Facebook’s refusal, time and time again, to admit to its misdeeds is what will lead the platform to ruin.
Photo: Andrew Bosworth (Boz) / FB / Meta