The growing role of social media in everyday life is hard to deny. And when it comes to a platform as dominant as YouTube, it’s natural to wonder whether it plays a part in steering people’s minds.
For instance, imagine your youngest child is watching their favorite animated series. Everything seems fine until an ad suddenly pops up that redirects them to another site, say, the page of an extremist organization.
Your first instinct would probably be to blame the platform’s algorithm for pushing viewers in a direction they never asked for, right? But it’s not that simple.
A recent study by academic researchers took an in-depth look at just how influential YouTube really is at misleading its users.
The results may surprise you: the platform rarely, if ever, recommends content that falls outside a viewer’s existing beliefs.
Key findings https://t.co/VfcOUkSZ9t
- Viewership of potentially harmful alternative & extremist YouTube channels heavily concentrated among subscribers
- Viewers frequently come from off-platform sources
- Algorithmic recs from normal content rare
- "Rabbit hole" patterns very rare https://t.co/fai8VrvPD1 pic.twitter.com/UNGuQrWK6l
— Brendan Nyhan (@BrendanNyhan) April 21, 2022
Instead, it is the users who make the extra effort of searching for content outside the mainstream who end up receiving such suggestions. The rule is simple: the more you search for something, the more recommendations you get based on those searches. And we believe that’s fair, at least some of the time.
It was also interesting to note that YouTube isn’t in the habit of promoting conspiracy theories or other baseless material on its own. If you show no interest in that kind of content, the app’s algorithm won’t bother you with suggestions from that genre.
But it isn’t all that simple and straightforward. The study also found that the platform can still act as a force in the process of radicalization.
The paper highlighted that volunteers who already held strongly radical views on a topic, or who religiously followed a particular channel on the app, were far more likely to be served recommendations of a similar sort.
So, what do these findings really mean?
Well, the study points to a strong need for policymakers, the general public, and leading internet executives to pause and think carefully about where the real risk lies. The concern isn’t the ordinary, everyday user being misled by YouTube into some sort of extremist behavior. Instead, greater emphasis should be placed on policies that address how the platform hardens the opinions of users who are already inclined in that direction.
One college professor who co-authored the study says we all understate the way social media facilitates demand for the most extreme views by connecting it with their supply. He adds that even a single person with an extreme viewpoint has the potential to do real harm in the world.
Remember, YouTube is a hugely popular platform, and with people watching over a billion hours of video across its channels, it makes sense that concerns are growing, much like the concerns that surround Facebook.
While this is just one part of the study, it is interesting to see it weigh two common assumptions: that strange content on the web causes little harm, and that the app’s algorithm is putting large numbers of people at risk of turning into monsters.
The research also noted that most of the viewership for extremist content came from viewers who already harbored strong resentment toward others, whether racial or gender-based. But remember, the research was carried out in 2020.
We’re all well aware that YouTube has made some significant changes since then to remove videos that spread harmful misinformation. Also, the study has yet to be reviewed by independent experts.
Moreover, follow-up studies are needed to verify these results, along with a broader conversation about how YouTube can do more to tackle issues like these and reduce harmful exposure.