TikTok is the world’s leading platform for short video content. Users, whether young or old, can’t get enough of the app. But have you ever wondered how the platform’s algorithm decides what users like?
Time and again, we get the eerie feeling that apps may know more about us than we think. You’re thinking about something, and suddenly it pops up on your screen. While unsettling at first, you begin to realize it’s more than just a coincidence.
An interesting experiment was recently conducted by The Guardian to determine how one of the world’s leading apps actually decides what users like and what it should serve them as they browse.
TikTok’s recommendation system is one of the strongest out there. Not only is it remarkably effective at guessing your likes and dislikes, it’s also notoriously hard to study. The experiment we’re discussing today shed light on how the algorithm treats different people.
All participants followed the same rules. Each had to create a new account on the app using their real identity, even if they already had one. Next, they had to open their For You page at the same time each morning. And lastly, they had to jot down the first ten videos the app served up. This routine was repeated for one week.
Other than that, they weren’t bound by anything. Over the week-long experiment they were free to like, follow, post, and more as they pleased. The whole purpose was to see how the For You page changed over time, and what assumptions the app ended up making about each of them.
Yes, the findings may not be particularly scientific, but they offer a look at the TikTok experience as a whole and how it differed from one individual to another. In doing so, they raise questions about how the app shapes people’s minds, their engagement, and the wider information ecosystem.
The four participants had differing opinions on how the algorithm judged them. Most were taken aback by the assumptions it made about them; a few of those assumptions felt accurate, while others missed the mark entirely.
One college student says the algorithm assumed she was a teen who would adore a 12-year-old’s sense of humor. That didn’t go down well: her For You page was full of juvenile jokes and other content beyond her liking. Some videos were about viral trends, such as one involving napkins. That one she did enjoy, saying she could watch it for hours.
She says the app never figured out she was desi, but it did assume she was Muslim because of her headscarf, and served her videos in that genre as a result.
Another participant, a working professional, says she was appalled at the dad jokes, silly humor, and a weird stick-figure viral trend that popped up. She did get cooking and pet videos too, but overall she was not impressed: most of the content was the viral kind, and the app seemed to assume she was an older mom.
The same went for a retired man who was using the app for the first time. He says the algorithm threw viral content his way, while the rest of the videos seemed designed to evoke an emotion of some kind.
So, as you can see, it all depends on what you like and what sort of content you engage with. We personally feel no algorithm can predict what’s going on in your mind unless you lead the way.
Read next: These Are the Biggest Security Threats According to Microsoft