YouTube has recently released new data that sheds further light on how often rule-violating videos get viewed on the platform. However, given the massive size of the video streaming giant's user base, the absolute numbers remain difficult to pin down.
Recent numbers offered up by YouTube reveal that for every 10,000 views on the platform, 16 to 18 come from videos later flagged for community guideline violations. This rate, roughly 0.16-0.18% of YouTube's total views in Q4 2020, marks a sharp drop from the 0.64-0.72% prevalence rate of Q4 2017. Actually contextualizing that percentage is difficult, however, because reliable figures for YouTube's total view traffic are not publicly available.
More than 500 hours of content is uploaded to the free, public video service every minute, contributed by a user base of over 2 billion monthly active users. So, while the percentage seems very low, it may still translate into very high absolute figures.
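To see why a small percentage can still mean enormous absolute numbers, consider a rough back-of-the-envelope calculation. The daily view total below is a hypothetical assumption purely for illustration, since YouTube does not publish that figure:

```python
# Illustration only: the daily view total is an assumed, unofficial figure.
HYPOTHETICAL_DAILY_VIEWS = 5_000_000_000

# YouTube's reported Q4 2020 rate: 16-18 violative views per 10,000 views.
rate_low = 16 / 10_000   # 0.16%
rate_high = 18 / 10_000  # 0.18%

low = HYPOTHETICAL_DAILY_VIEWS * rate_low
high = HYPOTHETICAL_DAILY_VIEWS * rate_high
print(f"{low:,.0f} to {high:,.0f} violative views per day (hypothetical)")
# With this assumed traffic level, even a sub-0.2% rate implies
# millions of views of violative content every single day.
```

Whatever the true traffic figure is, the point stands: at YouTube's scale, a fraction of a percent is not a small number.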
YouTube's Jennifer O'Connor went on record to clarify that the data was presented as a percentage rather than in raw counts in order to better show the progress the company is making in curbing threats on the platform. While that is an understandable choice, and percentages are easier for the general public to interpret than unprocessed numbers, it falls short given the massive size of YouTube's library and community. With no indication of how many hours of content these flagged videos comprise, or detailed numbers on their reach, the harm they cause is difficult to gauge. So, while the declining share of flagged content is a relief to observe, there may still be quite a way to go before real progress is made.
Then there's the issue of the guideline-violating content itself, or, more particularly, the nature of those violations. While no specific breakdown was given of which rules these videos crossed, O'Connor notes that YouTube's transparency reports have already made clear the types of rule-breaking common on the platform. The biggest category of violations involves child safety policies, followed by graphic or violent content, nudity and sexual content, spam, misinformation, and hate speech.
The problem with most of these categories is that it's very difficult to determine what counts as a strike, especially given how many YouTube creators already complain of undue censorship on the company's part. These very strikes may in fact be adversely affecting creators who are simply trying to reach more niche audiences, while more egregious content violations that need to be addressed slip through.