YouTube is a platform where thousands of people earn their living. People of every age group spend hours on it daily or weekly, using it for education and to stay informed about current affairs, while children turn to it for entertainment. Yet for all these benefits, its harms cannot be ignored.
Among those harms, YouTube is one of the biggest sources of fake news. Channels with anonymous names and hidden agendas spread false stories and easily mislead viewers. YouTube, however, says it is now doing its best to curb the spread of misinformation.
Chief Product Officer Neal Mohan outlined three main areas of focus in a blog post: catching false information before it reaches a wide audience, addressing the sharing of borderline videos on other platforms, and fighting misinformation in languages beyond English.
In January, 80 fact-checking organizations from around the world complained that YouTube was not doing enough to stop the spread of misinformation. In a signed letter to the company, they argued that whatever measures YouTube had in place were apparently not sufficient, and that it needed to push harder.
The letter said YouTube was letting irresponsible actors exploit the platform, and profit from it, and urged the company to take concrete steps against deception and misinformation. The letter appears to have prompted YouTube to act: the company says its blend of artificial intelligence and human review can remove content quickly, but admits this is not enough and that its approach needs to change.
To tackle this, YouTube says it is continually training its systems on up-to-date information. To catch narratives the primary classifier misses, it plans to use a more targeted blend of additional classifiers, keywords in multiple languages, and input from regional experts.
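The blended approach described above can be sketched in a toy form. This is not YouTube's actual system; the keyword lists, score thresholds, and function names here are all illustrative assumptions about how a primary classifier score might be combined with multilingual keyword matches and regional-expert flags:

```python
# Hypothetical sketch: combine a primary classifier's score with
# keyword matches (in several languages) and regional-expert flags
# to decide whether a video warrants human review. All names,
# keywords, and thresholds are invented for illustration.

SUSPECT_KEYWORDS = {
    "en": {"miracle cure", "hoax"},
    "es": {"cura milagrosa"},
}

def flag_for_review(primary_score, title, language, expert_flagged=False):
    """Return True if a video should be routed to human review."""
    # A confident primary classifier decides on its own.
    if primary_score >= 0.9:
        return True
    keywords = SUSPECT_KEYWORDS.get(language, set())
    keyword_hit = any(kw in title.lower() for kw in keywords)
    # A weaker classifier signal still triggers review when corroborated
    # by a keyword match or a regional expert's flag.
    if primary_score >= 0.5 and (keyword_hit or expert_flagged):
        return True
    # With a low score, require both secondary signals to agree.
    return expert_flagged and keyword_hit
```

The design point is that no single signal has to be perfect: the classifier catches what it was trained on, while keywords and expert input surface narratives it missed, as the blog post describes.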
A separate problem is borderline content: videos that do not violate the guidelines outright but that YouTube does not want to see go viral. Hiding the share button or disabling links on such videos would be an easy remedy, but YouTube is debating whether blocking shares goes too far in restricting viewers' choice. A preferable approach, the company suggests, may be a "speed bump": a warning that tells users the video they are about to watch may contain questionable content. Viewers can then decide whether to watch it anyway, or the warning may prompt them to do more research first.
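The "speed bump" idea can be illustrated with a minimal sketch. Again, this is an assumption about the flow, not YouTube's implementation: a borderline video gets an interstitial warning and a choice prompt before playback, rather than having its share option removed:

```python
# Illustrative sketch of the "speed bump" flow: borderline videos get a
# warning interstitial, and the viewer chooses whether to proceed.
# The video dict shape and the step strings are invented for this example.

def render_video_page(video):
    """Return the ordered UI steps shown before a video plays."""
    steps = []
    if video.get("borderline"):
        steps.append("warning: this video may contain unverified claims")
        steps.append("prompt: continue watching, or learn more first?")
    steps.append("play: " + video["id"])
    return steps
```

The key design choice the article describes is preserved here: sharing is never blocked, and the final "play" step is always reachable, so the friction informs the viewer without removing their choice.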
Read next: YouTube has rolled out multiple updates on its platform, including accessibility of Shorts on creators' channel pages