According to the Reuters findings, Google manually reviewed around a million videos suspected of being terrorism-related. About 9% of them (nearly 90,000 videos) were found to violate the terrorism policy and were removed from the platform.
While presenting these numbers before a US House panel, Google said it has dedicated 10,000 people to reviewing content and has been spending hundreds of millions of dollars yearly to make its efforts more efficient.
All the leading tech companies, including Facebook, Twitter, Microsoft, and Google, were asked to share the budget they have been spending to counter terrorism online. None of them was able to give exact figures.
The Christchurch shooting in New Zealand has put pressure on these social media giants to keep a strict check on the content available on their platforms.
In the aftermath of the incident, a copy of the shooting video was being uploaded to YouTube every second over the weekend.
In response, Australia passed legislation holding social media platforms responsible for removing such violent content.
The European Union is also considering a law under which social media platforms would be expected to remove terrorist content within an hour of it coming to their notice.
To meet these standards, Google will need to improve its systems for flagging videos so they can then be manually reviewed.
Read next: YouTube May Soon Let You Buy Products Directly From its Platform