YouTube has recently announced a change in its policy regarding videos that focus on minors. The platform will now remove violent and mature videos that target children, identifying a video’s intended audience through its title, description, and tags.
Previously, YouTube age-restricted these videos, but it has now decided not to allow them on its app and site at all. The company plans to be more vigilant about its policies and to regulate the content posted on YouTube more strictly.
The policy change was announced on the YouTube Help community forum last week but went largely unnoticed. YouTube says it will begin enforcing the new policy 30 days after the announcement, to give creators enough time to become acquainted with the rules.
During these 30 days, YouTube will not hand out strikes to channels, though it will remove videos that violate the new rules. It also will not issue strikes for videos uploaded before the policy change was announced, but those videos may still be removed. Sarah from the YouTube Team has advised channels to review the YouTube Kids guidelines to better understand how to reach the right audience, and to be careful with tags and descriptions so their videos are not removed unnecessarily. YouTube has also decided to age-restrict more of its content, such as adult cartoons, so it does not reach children.
"Protecting minors and families from inappropriate content is always a top priority which is why we’ve expanded our child safety policies. We are constantly taking steps to protect minors on YouTube, but we still recommend parents use YouTube Kids if they plan to allow kids under 13 to watch independently.", explained YouTube team.YouTube gave examples of offensive contents. It included that videos with “for children” tag and family-friendly content cannot have violent or disturbing scenes like “injecting needles.” YouTube has also prohibited children-oriented content from having adult themes like sex, violence, and death.
YouTube has struggled for years to make its platform child-friendly, and the problem has become more prominent recently. In the last six months, scrutiny of content on online platforms has increased, and YouTube became entangled in a Federal Trade Commission investigation over its inability to moderate its videos, including its failure to prevent videos that manipulate, exploit, or harm kids, as well as a potential threat to users’ privacy.
Critics blame YouTube’s deeply flawed recommendation algorithm for the controversy. According to them, the algorithm does not consider the nature of the content when generating recommendations, so it can end up suggesting videos that contain extremist, violent, or exploitative material. Compounding the problem, videos targeted at kids perform well on the platform, so the system tends to reward videos whose tags and descriptions focus on children.
In June, the company refused to stop recommending videos featuring children even after finding out that pedophiles were exploiting such content. This decision was explicitly highlighted in the FTC’s investigation, which reached a settlement in July; according to a report, the settlement may be under review at the Justice Department. The company has been repeatedly called out for its inadequate efforts to regulate and moderate the platform for kids.
Although YouTube has created a child-friendly app, YouTube Kids, to provide a safer space for children, the company has still been caught up in controversies. Apart from the pedophile issue, this year it also faced scandals like Elsagate, in which anonymous and nearly untraceable creators produced disturbing, copyright-infringing videos featuring deformed versions of Disney and Marvel characters.
Photo: SASCHA STEINBACH / EPA-EFE / REX / Shutterstock
In the past, YouTube’s approach to regulating content on its platform was largely reactive: it dealt with dozens of offensive videos mostly after news organizations flagged them to its communications team. Despite having rules meant to create a safer online space for kids, YouTube repeatedly failed to implement and update them properly, which led to a series of fiascos like Elsagate.
- Also read: YouTube Music Is Looking to Give Tough Time to Music Apps by Adding Library Sorting Feature
Since last year, CEO Susan Wojcicki has become more involved and responsive to issues raised by lawmakers and to public backlash. In February, the company said it was “aggressively approaching” the child-exploitation issue. Google, YouTube’s parent company, has also hinted that it might shift all content featuring kids to YouTube Kids, although that app has faced its own share of issues and controversies.
YouTube has also taken stricter action against live streams involving kids and is disabling comment sections on videos featuring children. The company has not yet decided whether to turn off recommendations for such videos, since doing so would shrink engagement. Bloomberg has reported that YouTube is devising a plan to remove targeted advertising from videos featuring children on its main site to avoid potential fines under the Children’s Online Privacy Protection Act.
Read next: Teens are Relying on YouTubers to get their dose of Latest News and it's Quite Concerning!