Experiment Shows Instagram Encourages Removal of Clothing
“My husband is always watching videos of half-naked women dancing and showing off their bodies on Instagram,” a troubled wife wrote on a community forum in February this year. She continued by saying that her husband wants to have sex with her as “a release from the sexual urges of looking at these women.” This young wife explained that although people have told her she is attractive, she feels ugly compared to the images of the models her husband consumes.
A month before, a TikTok influencer shared with her followers that her boyfriend masturbates to images of Instagram models.
Multiple community forums, chat rooms, and blogs tell a similar story: Instagram has become the new Playboy, the primary vehicle for erotic content for hundreds of thousands of men.
Wives are not happy about Instagram, but neither are many men. Here is the testimony of James M. Costa, from his article, “Let’s Talk about Instagram’s Growing Sexualization”:
“Browsing through Instagram was often a triggering experience that would get me horny and eventually lead me to watch porn. It’s not surprising that, in the sex and porn addiction communities, an increasingly popular piece of advice given to the newcomers is to either uninstall Instagram or use it with forethought.”
Nothing I have shared so far is likely to surprise you. It is common knowledge that Instagram is a marriage-killer and a gateway to the dark world of internet pornography. But what may surprise you is that Instagram's algorithm seems designed to encourage people to take off their clothes.
“skew towards nudity”
Writing for the non-profit research and advocacy organization Algorithm Watch, Nicolas Kayser-Bril shared the results of an exclusive investigation into Instagram’s algorithm. Through a series of tests, European researchers discovered that the picture-sharing app has a “skew towards nudity.”
Specifically, posts containing semi-naked women were 54 percent more likely to appear in the newsfeeds of volunteers participating in the study.
Though this study was not a formal audit of Instagram’s algorithm, it was among the most advanced experiments ever conducted into the picture-sharing platform. The research was supported by the European Data Journalism Network and by the Dutch foundation SIDN, and was conducted in partnership with Mediapart in France, Groene Amsterdammer and Pointer in the Netherlands, and Süddeutsche Zeitung in Germany.
Undress or Fail
Although Instagram’s algorithm is secret, we can make inferences about it by observing its effects. That is what the aforementioned experiment attempted to do.
Twenty-six volunteers were asked to install a browser add-on on their computers and then to follow professional content creators on Instagram.
“The add-on automatically opens the Instagram homepage at regular intervals and notes which posts appear on top of the volunteers’ newsfeeds, providing an overview of what the platform considers most relevant to each volunteer,” Kayser-Bril explained for the website Algorithm Watch. What happened next is where the study became really interesting:
If Instagram were not meddling with the algorithm, the diversity of posts in the newsfeed of users should match the diversity of the posts by the content creators they follow. And if Instagram personalized the newsfeed of each user according to their personal tastes, the diversity of posts in their newsfeeds should be skewed in a different way for each user. This is not what we found.
Between February and May, 1,737 posts published by the content creators we monitor, containing 2,400 photos, were analyzed. Of these posts, 362, or 21%, were recognized by a computer program as containing pictures showing women in bikinis or underwear, or bare chested men. In the newsfeeds of our volunteers, however, posts with such pictures made up 30% of all posts shown from the same accounts (some posts were shown more than once).
Posts that contained pictures of women in undergarment or bikini were 54% more likely to appear in the newsfeed of our volunteers. Posts containing pictures of bare chested men were 28% more likely to be shown. By contrast, posts showing pictures of food or landscape were about 60% less likely to be shown in the newsfeed.
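For readers who want to check the arithmetic, here is a minimal sketch (in Python) of the over-representation implied by those raw shares. This is only a back-of-envelope illustration, not the study's methodology: the ratio of the aggregate shares works out to roughly 44 percent, while the study's headline 54 percent figure was computed from the full post-level data, in which some posts were shown more than once.

```python
# Back-of-envelope check of the over-representation of "semi-nude"
# posts, using only the aggregate shares quoted above.
# This is an illustration, not the study's exact methodology.

published_total = 1737      # posts published by the monitored creators
published_seminude = 362    # posts flagged as bikini/underwear/bare-chested

share_published = published_seminude / published_total  # ~0.21 (21%)
share_shown = 0.30          # share of such posts in volunteers' newsfeeds

over_representation = share_shown / share_published - 1
print(f"Semi-nude posts over-represented by ~{over_representation:.0%}")
# prints: Semi-nude posts over-represented by ~44%
```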
The implications of this study are clear. Content creators who use Instagram for business purposes are unlikely to succeed if they are not willing to take off their clothes. The algorithm seems designed to force businesses into a choice: undress or fail.
Facebook Patent Includes “State of Undress” As Factor Impacting Engagement
Kayser-Bril hypothesized that Instagram’s bias towards erotic content could be a result of a mechanism used by Instagram and Facebook known as an “engagement metric.” The engagement metric automatically prioritizes posts based on the user’s past behavior as well as the past behavior of all users:
Whether or not users see the pictures posted by the accounts they follow depends not only on their past behavior, but also on what Instagram believes is most engaging for other users of the platform.
Might people in a state of undress be recognized by the engagement metric as engaging to most users? A clue to answering this question is found in the patent for the engagement metric, published by Facebook engineers in 2015. The patent describes “feature-extraction-based image scoring,” which uses a variety of criteria to determine which posts to prioritize in users’ newsfeeds. According to the patent, the “state of undress” of the people in a photo may impact the engagement metric.
This means that if many users register interest in, for example, a woman sticking her butt out at the camera, or a half-naked girl reclining in bed with a look of availability, then the algorithm will learn to prioritize such images in users’ feeds, even for users who have not previously registered interest in those types of images.
"[instagram] patent specifically states that the gender, ethnicity and “state of undress” of people in a photo could be used to compute the engagement metric"https://t.co/Nbh31fB7Fr #edjnet
— Julien Leterrier (@LeJetlinerRieur) June 28, 2020
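To see how such a feedback loop could arise, here is a minimal sketch, in Python, of a feature-based ranking score in the spirit of the patent's "feature-extraction-based image scoring." The feature names, weights, and blending scheme below are hypothetical illustrations of the mechanism, not Facebook's actual implementation:

```python
# Minimal sketch of feature-extraction-based image scoring.
# Feature names and weights are hypothetical; a real system would
# learn its weights from aggregate engagement across all users.

def engagement_score(image_features, global_weights, personal_weights,
                     personal_mix=0.3):
    """Blend a platform-wide engagement prior with a per-user signal."""
    global_term = sum(global_weights.get(f, 0.0) * v
                      for f, v in image_features.items())
    personal_term = sum(personal_weights.get(f, 0.0) * v
                        for f, v in image_features.items())
    return (1 - personal_mix) * global_term + personal_mix * personal_term

# Weights "learned" from what engages users in aggregate:
global_weights = {"state_of_undress": 0.8, "food": 0.2, "landscape": 0.1}
# This particular user has only ever engaged with landscapes:
personal_weights = {"landscape": 0.9}

undress_post = {"state_of_undress": 1.0}
landscape_post = {"landscape": 1.0}

print(round(engagement_score(undress_post, global_weights, personal_weights), 2))    # 0.56
print(round(engagement_score(landscape_post, global_weights, personal_weights), 2))  # 0.34
```

On this toy model, the post featuring undress outranks the landscape post even for a user who has never engaged with such images, because most of the score comes from aggregate behavior; that is precisely the dynamic Kayser-Bril describes.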
Algorithm Watch Threatened by Facebook
Kayser-Bril did his initial analysis for Algorithm Watch in May 2020, and the organization continued gathering data after that. But the work was cut abruptly short in May of last year, when Facebook executives requested a meeting and threatened action if Algorithm Watch did not immediately stop its research. Facebook claimed the researchers were violating its Terms of Service, which prohibit the automated collection of data. Apparently, Facebook considered it a violation of its terms for volunteers to access their own feeds for research purposes.
In his article “AlgorithmWatch forced to shut down Instagram monitoring project after threats from Facebook,” Nicolas Kayser-Bril commented,
Facebook’s reaction shows that any organization that attempts to shed light on one of their algorithms is under constant threat of being sued. Given that Facebook’s Terms of Service can be updated at their discretion (with 30 days’ notice), the company could forbid any ongoing analysis that aims at increasing transparency, simply by changing its Terms.
As a result of these threats, Algorithm Watch had to suspend further research, announcing, "Ultimately, an organization the size of AlgorithmWatch cannot risk going to court against a company valued at one trillion dollars."
Not the End of the Story
In times past, a person would have needed to travel to the disreputable parts of town to see the sights that Instagram now makes available to anyone with an internet connection. That is morally objectionable in itself, but the deeper problem is that Instagram's business model depends on a critical mass of individuals being addicted to erotic content.
The experiments that Kayser-Bril and Algorithm Watch performed were valuable in demonstrating this and proving that Instagram's algorithm incentivizes creators to trade in erotic content. No wonder Facebook tried to strong-arm Algorithm Watch into shutting down its research! Yet this is not the end of the story.
Recent developments this month have catapulted Instagram back into the limelight, and the platform may finally be held accountable. I will report on the latest developments in a follow-up post, where I will also share steps that you and your family can take to protect yourselves and your children from the pornification of social media.
Further Reading
- How Instagram Recruits Girls to Sexualized Labor: Mark Zuckerberg Has a Harem of Sex Workers that Facebook Doesn’t Want You to Know About
- How Sex Traffickers Use Social Media and Modeling: The Dark Secret Instagram Doesn't Want You to Know About
Robin Phillips has a Master’s in History from King’s College London and a Master’s in Library Science through the University of Oklahoma. He is the blog and media managing editor for the Fellowship of St. James and a regular contributor to Touchstone and Salvo. He has worked as a ghost-writer, in addition to writing for a variety of publications, including the Colson Center, World Magazine, and The Symbolic World. Phillips is the author of Gratitude in Life's Trenches (Ancient Faith, 2020) and Rediscovering the Goodness of Creation (Ancient Faith, 2023) and co-author with Joshua Pauling of Are We All Cyborgs Now? Reclaiming Our Humanity from the Machine (Basilian Media & Publishing, 2024). He operates the substack "The Epimethean" and blogs at www.robinmarkphillips.com.