Recent research suggests that Instagram's algorithm makes images showing skin more likely to appear in users' newsfeeds. In other words, Instagram appears to prioritize photos of scantily clad people, which may explain why your feed seems full of thirst traps.
A team of researchers from the European Data Journalism Network and AlgorithmWatch found that Instagram prioritizes these types of images in users' newsfeeds. To reach this conclusion, the researchers analyzed Instagram newsfeeds, spoke with content creators, and studied Facebook's patents.
The researchers asked 26 volunteers to install a browser add-on that automatically opened their Instagram homepage at regular intervals and recorded the posts Instagram displayed at the top of their newsfeeds. The volunteers were then asked to follow a selection of 37 professional content creators from 12 countries who use the platform to advertise their businesses and attract new customers.
Of the 2,400 photos these creators posted, the team found that 362, roughly 15%, showed bare-chested men or women in bikinis or underwear. If Instagram were not prioritizing such images, the researchers expected volunteers to see a similar mix of content in their feeds. Instead, 30% of the posts appearing in volunteers' newsfeeds contained semi-nude photos.
Photos of scantily clad women were 54% more likely to appear in volunteers' newsfeeds, and pictures of bare-chested men were 28% more likely. By contrast, photos of landscapes or food were 60% less likely to be shown.
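The gap between what was posted and what was shown can be made concrete with a quick back-of-the-envelope calculation. The sketch below is purely illustrative, using only the figures reported above (362 of 2,400 posted photos showing skin, versus 30% of photos actually surfaced in feeds); it is not the researchers' actual methodology.

```python
# Illustrative over-representation calculation from the reported figures.

posted_total = 2400   # photos posted by the 37 creators
posted_skin = 362     # of those, photos showing skin

# Baseline: share of skin-showing photos among everything posted
baseline_share = posted_skin / posted_total   # about 15%

# Observed: share of skin-showing photos among posts shown in feeds
observed_share = 0.30                         # 30%, as reported

# If the feed sampled posts uniformly, these two shares would match;
# the ratio indicates how strongly such photos are amplified.
amplification = observed_share / baseline_share

print(f"baseline share:  {baseline_share:.1%}")
print(f"observed share:  {observed_share:.1%}")
print(f"amplification:   {amplification:.2f}x")
```

On these numbers, skin-showing photos appear in feeds at roughly twice the rate that uniform sampling would predict, which is the discrepancy the study attributes to the ranking algorithm.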
Nicolas Kayser-Bril, a reporter at AlgorithmWatch, explained in a tweet that some people use Instagram as a free source of 'soft porn photos.' Machine learning picks up on this behavior, he said, and amplifies it, pushing photos featuring nudity to all Instagram users.
This bias may push professional content creators on Instagram, particularly women, toward posting more revealing content to attract followers. It could also help shape the worldview of Instagram's one billion monthly active users.
However, the researchers acknowledged that the bias did not apply to every volunteer. Instagram's algorithm appears to promote revealing content in general, but other factors, such as personalization, limit its effect for some users.
The researchers caution that no definitive conclusions can be drawn from this data alone; that would require access to Facebook's internal data and production servers. They intend to investigate further by recruiting additional volunteers.
Facebook-owned Instagram disputed the findings, saying the research has several flaws and reflects a misunderstanding of how its algorithm works. Instagram ranks posts in users' newsfeeds based on the type of content and accounts they follow, the Instagram Comms team explained in a tweet.
This highlights a broader issue: most social networks try to keep users on their platforms for as long as possible by piling on features, playing on users' psychology, and building echo chambers that keep serving the kind of content that glues people to their feeds.
The researchers from the European Data Journalism Network and AlgorithmWatch stand by their findings. They note that Facebook published a patent listing the factors that determine which content to prioritize, and one of those factors is 'state of undress.' This suggests that Instagram's algorithm may select content based on what Facebook thinks users want, keeping them inside a filter bubble.