Although YouTube's related-videos algorithm was designed to help users find content matching their music taste or other harmless preferences, it has been accused of far worse, such as steering viewers toward conspiracy theories and hate-filled content. However, nothing comes close to the most recent accusation: that it helps pedophiles find and share what amounts to softcore, sexually suggestive content involving minors.
YouTuber Matt Watson was quick to call attention to the problem in a recent video. He showed how clips of young girls in compromising situations were racking up views, with comment sections full of pedophiles and predators. Worse still, many of these videos were monetized.
According to Watson, reporting or flagging such videos is not enough. Content of this kind should never be allowed on the platform in the first place, and YouTube should establish clear guidelines for creators producing age-restricted content.
Even some channels that upload otherwise harmless videos of kids having fun are aware of how clips from those videos can be misused, yet they encourage the pedophiles visiting them by liking their comments or engaging with them in comment threads.
Back in 2017, when a similar issue surfaced, YouTube was slow to respond but assured everyone that the platform would become proactive in dealing with such videos. It also promised additional comment-review tools and more internal staff to handle those reviews.
Fast forward to 2019, and it remains unclear whether those measures were ever implemented. What is clear is that several videos that plainly violate the platform's community guidelines are still available and easy to find.
Related: YouTube finally decides to change its community guidelines strike system

In addition, videos uploaded five or six years ago that have, intentionally or not, caught the attention of child predators are still up. What is even more shocking is that such videos often surface through Autoplay, making them easier to stumble upon. Most of them have anywhere from a few thousand to several million views, which is deeply concerning. Their comment sections have been disabled, but their continued availability after all this time raises serious questions about how YouTube is handling the situation.
Videos of this kind should be deleted without a moment's delay. Moreover, any content uploaded by a child that has even the slightest chance of attracting a pedophile's attention should be removed immediately and the child's parents notified. Merely disabling comments on a video is not a solution.
According to YouTuber Keemstar (of Drama Alert), YouTube has lately been proactive about cleaning up the mess. However, the tech giant has yet to comment publicly on the situation. Only time will tell what solution YouTube comes up with and whether it is a genuine one.
Read Next: Is YouTube the culprit behind the spread of false news such as Flat-Earth?