Google-owned YouTube says it will stop removing content that falsely claims the 2020 American presidential election was stolen, reversing a policy it has enforced since late 2020.
The platform announced the change on Friday, acknowledging that false claims that the presidential race was rigged have continued to spread across the app in the years since the vote.
In a blog post, YouTube described the move as an attempt to strike a balance between two of its leading goals: keeping the community safe, and providing an open forum where users feel free to share their views and debate the matters they care about.
The decision comes just ahead of next year's presidential election, and it undoes a policy that was put in place shortly before President Joe Biden took office.
The company admits the rethink was a long time coming: years have passed, and thousands of videos have been removed under the policy. But better late than never, the firm says, it has decided to reevaluate the rule against today's changed landscape.
Today's climate calls for careful moderation, and many argue that removing content does curb the spread of misinformation. On the other hand, YouTube concluded, doing so can also curtail political speech without meaningfully reducing the risk of violence that comes with it.
As for when the new rule comes into effect: it applies from today.
In the final stretch of the 2020 American election, YouTube was heavily criticized for acting too slowly against such content, struggling to take down videos that pushed false claims of voter fraud. Then, after the Capitol attack in January 2021, the platform said it would suspend channels making false voter-fraud allegations.
In March of this year, the company lifted the suspension on former American President Donald Trump's account, which had been imposed after the January 6 Capitol riot.
YouTube says it will continue to fight other forms of election misinformation across the board, for instance by highlighting authoritative sources in search results and removing posts designed to mislead people about how to vote.