TikTok is one of the world’s leading social media platforms, with a massive user base spanning all age groups. But did you know the app’s highly active recommendation system is designed to keep you coming back for more, even when the content in question is sensitive?
New research suggests the algorithm is so sensitive that it can send some users down a rabbit hole filled with spiteful, racist, and otherwise negative content. And once such a video is served, users often end up clicking on it, no matter how hateful the theme may be.
What’s interesting is how the algorithm seemingly works in the most mysterious of ways. Thanks to a new experiment by the media outlet The Guardian, which ran trials on blank accounts, we learned how fast-breaking news stories could send a fresh account down a conservative-leaning path.
We’ve seen other tech giants like Facebook and Instagram run alluring algorithms that draw users, especially young males, into so-called manosphere content. Now, a similar trial shows that even blank accounts with no interactions at all, no clicks, no likes, and not a single comment, can end up sliding down certain rabbit holes.
At the start, it was hard to pin down any clear theme in what the platform was serving. That changed with a story linked to a church stabbing incident that took place in April.
During the first two days, TikTok served the fresh account generic material, think iPhone hacks and city-related content, of the kind you’d normally expect for a user based in Australia with an iOS device.
Then, over time, material linked to the stabbing incident began to appear, featuring conservative-themed sermons from the bishop who was attacked.
Three months on, similar material was still being served, whether the user asked for it or not. Much of the content concerned former US president Donald Trump, LGBTQ community rights, and other sensitive topics, drag queens included.
Compared with the likes of Meta’s apps, TikTok proved far more sensitive: even the smallest interactions prompted it to push out similar content, regardless of whether users had opted out from the start.
So the more you search for or engage with a certain kind of content, the more of it you’ll see on your feed. You can always opt out by tapping ‘Not interested’, but it’s still worth noting that the app is in a league of its own when it comes to its algorithm.
As a leading expert in computational communication science confirmed, the app’s early recommendations are fairly random, but if you do interact early on, you’ll certainly be seeing a lot of that material for a long time.
Image: DIW-Aigen