A new study is exposing the dark side of Instagram and its algorithm, with alarming findings showing that young teens are being recommended sexually charged content.
The recommendations featured creators in 'barely there' attire, served to audiences that included viewers as young as 13.
The findings come from a new investigation by The Wall Street Journal and researchers at Northeastern University, who argue it is high time Meta took the behavior seriously; activists tracking the issue have also called for such accounts to be banned.
From the first session, the researchers found that the test accounts' recommendations were dominated by women dancing in revealing clothing with their cleavage fully visible.
The more viewers watched these posts, the more graphic the algorithm's recommendations became, which many feel is simply not acceptable. Most of the videos appeared in the app's Reels feed, and some of the suggestions were mind-boggling.
Other alarming content featured sex workers promising viewers nude pictures in their DMs, and the published study noted that this happened frequently.
In another set of tests carried out in June, the Journal found that the content grew progressively more graphic, with videos openly promoting anal sex served to accounts registered as 13-year-olds, provided the account had watched Reels featuring women.
Watching the algorithm go this far, serving videos in which women caressed their bodies and did everything from flashing private parts to more, was alarming to the authors, who added that Instagram's algorithm needs a serious clean-up before matters get worse.
Meta has pushed back on the report's findings, with a spokesperson dismissing the tests as an artificial experiment that does not reflect what young users actually see.
Meta says it has repeatedly reduced the amount of sensitive content shown to teens to a bare minimum, a claim that flatly contradicts what the research found.
That the study ran for seven months makes Meta's dismissal harder to accept. While the tests were deliberately designed to probe how the app responds, it is striking how much illicit content surfaced, and for how long, within that experiment.
The app's algorithm has been questioned before, but the tech giant keeps claiming it is rolling out stricter measures to ensure underage users remain safe on the app.
One example: users under 16 are automatically blocked from seeing sexually explicit content and recommendations.
Image: DIW-Aigen
Read next:
• Reddit Witnesses Massive Growth In Visibility On Google Search But It Comes With Implications
• 59% of Marketers Report New Business Opportunities from Content Marketing
• Which Global Airports Have The Fastest Wi-Fi Speeds? The Answer Might Surprise You