Instagram Rejects 'Flawed' Study Claiming It Boosts Semi-Nude Photos

Instagram has rejected the findings of a study claiming that the picture-sharing app's algorithm prioritizes snaps of "scantily-clad" men and women.

Research published on Monday by the non-profit advocacy organization Algorithm Watch accused the Facebook-owned application of boosting semi-nude content, claiming that, for creators, declining to post such pictures "dramatically curtails one's audience" and follower reach.

Authors of the study said they set out to examine how the algorithm arranges content in a user's feed, judging what images and videos were pushed to the top.

Broadly, the experiment claimed to "present strong evidence that pictures which show more skin are shown to users more often than pictures that don't." Instagram, however, has attacked the study's methodology and flatly rejected its conclusions.

"There's been a recent study that suggests we boost content specifically because it contains semi-nudity. This is not true," Instagram's PR team tweeted Tuesday.

"We surface posts based on interests, timeliness of posts, and other factors to help people discover content most relevant to them," it continued.

"The study looked at an extremely small sample size, which likely surfaced the type of content they were researching: the more you engage with certain types of posts, the more likely we are to show you similar posts," it added. "This research is flawed in a number of ways and shows a misunderstanding of how Instagram works."

According to Algorithm Watch, a total of 26 volunteers took part in the study. They installed a Firefox browser add-on and were then asked to follow a variety of Instagram creators who use the app to share content in the food, travel, fitness, fashion or beauty sectors.

The add-on would automatically open Instagram at regular intervals and check which posts appeared at the top of each volunteer's feed, potentially indicating what the app deemed most relevant to that person at the time.

"If Instagram personalized the newsfeed of each user according to their personal tastes, the diversity of posts in their newsfeeds should be skewed in a different way for each user. This is not what we found," researchers wrote in a blog this week.

According to the research team, of 1,737 posts published by 37 chosen content creators between February and May this year, roughly 21 percent were recognized by a computer program as containing women in bikinis or underwear, or bare-chested men.

However, researchers said posts with racy pictures from the content creator accounts made up 30 percent of all posts shown in the feeds of the volunteers.

Statistically, the team claimed, posts picturing women in underwear or bikinis were 54 percent more likely to appear in volunteers' feeds.

Posts containing pictures of bare-chested men were 28 percent more likely to be shown. In comparison, posts showing pictures of food or landscape were about 60 percent less likely to be shown in volunteers' Instagram feeds, the authors claimed. Algorithm Watch has been contacted for comment on the methodology behind those statistics.
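Algorithm Watch's write-up does not spell out how figures such as "54 percent more likely" were derived. One plausible reading, using only the aggregate numbers quoted above and assuming "more likely" compares the share of racy posts shown in volunteers' feeds with the share published by the monitored accounts, is a simple ratio:

\[
\frac{0.30}{0.21} - 1 \approx 0.43
\]

On that hypothetical reading, racy posts as a whole would be roughly 43 percent over-represented in volunteers' feeds, broadly consistent in magnitude with the 54 and 28 percent figures reported for the two individual categories; the actual calculation, however, remains for the researchers to confirm.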

The authors conceded the skew towards nudity may not apply to all users, but said they had "reasons to believe" the findings were indicative of how Instagram operates.

"While it was consistent and apparent for most volunteers, a small minority were served posts that better reflected the diversity published by content creators," the team wrote.

"It is likely that Instagram's algorithm favors nudity in general, but that personalization, or other factors, limits this effect for some users," researchers added.

A more detailed summary of the data analysis behind the Instagram project was published online, noting that a larger sample size would be needed to take the study further.

"Sexually suggestive images, as well as nudity from either gender, appeared significantly more often in data donors' newsfeeds than in the posts created by monitored accounts. This effect was observed in most, if not all, of our data donors," it concluded.

Facebook said in response that it ranks posts "based on content and accounts you have shown an interest in, not on arbitrary factors like the presence of swimwear."

In addition, the Instagram communications Twitter account said more information about which posts the app does and does not recommend would be published in the coming weeks.

According to Instagram's own metrics, more than 500 million accounts around the world are active on the platform every day, and more than one billion are active each month.

In this photo illustration, the Instagram logo is displayed on a computer screen on March 15, 2019 in Paris, France. Chesnot/Getty