YouTube has transformed online entertainment, changing the way we approach almost any kind of content, but it has a persistent problem: every video uploaded to the platform comes with a comment section. While comment sections are genuinely used for positive discussion, engagement with creators, and building a sense of community among the users who follow or subscribe to a particular channel, it’s fair to say that a sizable number of users misuse them.
They do this by abusing people, making inappropriate remarks, and in some cases endangering creators or the users who interact with them. YouTube has been making a number of changes to comment sections recently, such as promoting comments from paid channel members, top fans, and the creators themselves, but the latest change now rolling out could be the biggest contributor to making comment sections a bit more wholesome.
The new feature is all about comment moderation. When you upload a video to YouTube, you can switch on a setting that automatically holds comments deemed potentially inappropriate so that you can review them. These comments are not published unless you check them and find that they do not violate your own rules for your comment section.
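For creators who are comfortable with a bit of code, the same held-for-review queue is also visible through the YouTube Data API v3. The sketch below simply lists comments currently being held on one of your own videos; VIDEO_ID and client_secret.json are placeholders, and the request must be authorized by the channel owner (with the youtube.force-ssl scope), since only the owner can see held comments.

# Minimal sketch: list comments held for review on one of your own videos,
# using the YouTube Data API v3 via google-api-python-client.
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/youtube.force-ssl"]
VIDEO_ID = "YOUR_VIDEO_ID"  # placeholder for one of your own video IDs

# Standard installed-app OAuth flow, signed in as the channel owner.
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
youtube = build("youtube", "v3", credentials=flow.run_local_server(port=0))

# Ask specifically for threads whose moderation status is "heldForReview".
response = youtube.commentThreads().list(
    part="snippet",
    videoId=VIDEO_ID,
    moderationStatus="heldForReview",
    textFormat="plainText",
    maxResults=50,
).execute()

for thread in response.get("items", []):
    top = thread["snippet"]["topLevelComment"]["snippet"]
    print(f"{top['authorDisplayName']}: {top['textDisplay']}")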
Many creators simply won’t have the time to go through every comment this feature flags, but for some it could be a real godsend. For example, if a YouTuber works with children, features children on their channel, or makes content aimed at children, moderating the comment section becomes far more important than ever before.
That is because certain morally suspect people might leave inappropriate comments that children should not be reading, and some of these remarks can be downright predatory, which is something no creator should have to deal with. Turning this filter on greatly improves your chances of keeping your comment section clean.
"We're continuing to roll out the optional setting “Hold potentially inappropriate comments for review” as the default to more channels. We hope this makes it easier for you to own and manage conversations on your channel!.", announced Jordan from TeamYouTube in community post. Adding further, "When we first launched this feature, channels that had this setting enabled saw a 75% drop in comments being reported by viewers."The great thing is that this will make the job of the creator much easier. It can be done by hiring a professional comments moderator or by applying a scheduled routine to go through these comments rather than to let all comments in and see the terrible situations that often arise when you (as a creator) don’t moderate your own comment sections.
Read next: This new sticker for YouTube Creators on Stories lets viewers add videos to their Watch Later playlist
Featured photo: SOPA Images via Getty Images