While much of the content on YouTube is fantastic, a glance at the comment sections under most videos makes it clear that the people watching are not always all that nice. This upsets a lot of viewers, and comment-section arguments have become a hallmark not just of YouTube but of virtually every social media platform on the internet.
YouTube is trying to make its comment sections places for more civil discourse, and its latest attempt at this is a new prompt that encourages you to reconsider your comment if it is hostile in any way, shape or form. The prompt is triggered by an algorithm that tries to detect abusive language, or anything else that might be considered hostile, in the comment in question before it is posted.
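To make the mechanism concrete, here is a minimal sketch of what a "reconsider your comment" flow could look like. Everything here is illustrative: the keyword list, the scoring function and the threshold are toy stand-ins, since YouTube has not published details of its actual classifier.

```python
# Hypothetical sketch of a pre-post hostility check; not YouTube's real model.

HOSTILE_TERMS = {"idiot", "stupid", "shut up", "hate you"}  # toy stand-in list


def toxicity_score(comment: str) -> float:
    """Toy stand-in for an abusive-language classifier, returning a 0-1 score."""
    text = comment.lower()
    hits = sum(term in text for term in HOSTILE_TERMS)
    return min(1.0, hits / 2)


def submit_comment(comment: str, threshold: float = 0.5) -> str:
    """Ask the user to reconsider if the comment looks hostile, then post or discard."""
    if toxicity_score(comment) >= threshold:
        answer = input("Your comment may be hostile. Post anyway? (y/n) ")
        if answer.strip().lower() != "y":
            return "comment discarded"
    return "comment posted"


if __name__ == "__main__":
    print(submit_comment("Great video, thanks!"))       # posts without a prompt
    print(submit_comment("You are an idiot, shut up"))  # triggers the reconsider prompt
```

The key design point the feature relies on is that the check happens before the comment goes live, so the friction is a pause rather than a ban.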
The update has received a fairly mixed response. Some critics point out that it will not actively prevent hostile arguments from erupting, an assessment YouTube seems to share: the platform has admitted the feature will not fix the problem entirely, but it hopes the prompt will give people a chance to pause and think about what they are saying. Others argue it could do more harm than good, since YouTube does not have a strong reputation when it comes to algorithms that are supposed to police content.
Photo: Dado Ruvic / Reuters