With one billion users, Instagram is one of the most popular social media apps in the world (behind Facebook and YouTube, of course). Its sleek, visual formatting attracts kids, teenagers, and adults alike, and it’s advertised as a family-friendly social media platform. And honestly, we love it—we post social images, feature Fighter photos, and drop some serious knowledge in our stories about the harmful effects of porn and its connections to sex trafficking.
Among teen users, Instagram is the most popular social media platform, aside from YouTube. Instagram even has a “no nudity” policy, restricting nearly all nudity with exceptions for paintings and sculptures, breastfeeding, and post-mastectomy scarring. [1]
Yet somehow, porn is thriving on Instagram. Simple searches for “babes” or “sexy” will reveal the reality of explicit content on the platform. Basic hashtags such as #sexy, which frequently contain pornography, collectively have millions of posts, none of which are filtered for young users. (Take our word for it—don’t look it up.)
So why is it this way if Instagram has these strict, no-nudity policies?
Porn is everywhere. Literally.
Due to the accessible and free nature of modern pornography, porn performers are collectively making less money than they ever have, turning to endorsement deals, escorting on the side, and social media platforms like Snapchat and Instagram to generate more revenue. Performers can make extra money through premium subscriptions to their Instagram accounts, or simply by gaining more fans through active use of those accounts.
Protect Young Eyes, an organization that seeks to defend kids from inappropriate and harmful internet content, has researched and even tested Instagram’s response to pornography on its platform. [2] Despite the “no nudity” policy, after Protect Young Eyes reported five different hashtags linked to pornography a minimum of 50 times over a five-day period, Instagram took no action. Explicit hashtags that consistently feature graphic pornographic content were left unfiltered and untouched.
Why exactly Instagram continues to allow this is another question, one that remains unanswered. The main theories are that a) the company is making money by keeping the explicit material on the site, or b) it simply doesn’t have the staffing to efficiently respond to material flagged as inappropriate. Right now, there’s no evidence to suggest either is the case.
And while we’re not here to speculate, we are here to sound the alarm and give the app’s users a serious heads up.
Diving deeper into the bigger issue
Without directly acknowledging the extent of the explicit content on the site, Instagram briefly recognizes that, as with all of social media, inappropriate content is a concern—and dives no deeper. Its website contains safety and privacy tips, including lengthy guidelines for parents about how to help their kids navigate Instagram. In their “Parents Guide to Instagram,” pornography isn’t mentioned once, and when the option of reporting inappropriate material comes up, it is typically in regard to harassment and cyberbullying.
Because Instagram is so popular among youth and our generation, the easy access to explicit pornographic content should be a public concern.
As decades of science and research have shown, porn is harmful for any consumer, young or old, but kids are an especially vulnerable age group. It’s no secret that porn consumption can shape an individual’s sexual tastes and desires, often toward more violent and extreme preferences, [3] and literally changes the brain of the consumer. Pornography is even linked to trauma when boys are exposed to it at early ages. [4]
Child predators on the app
But aside from explicit content being so easily accessible on the app, there is another serious problem: child predators having unblocked access to underage adolescents’ accounts.
From our friends at the National Center on Sexual Exploitation, here are a few facts:
- According to survivor testimony, sex traffickers and child predators appear to be increasingly using Instagram to identify, groom, and exploit children.
- Minors whose Instagram accounts are set to private can still receive unsolicited direct messages from strangers, which has led to several instances of sex trafficking and child sexual abuse.
- There are countless comments by predatory adults on the photos of minors—comments that are sexually graphic, that sexualize children, or that solicit sex from children.
Just watch this video of a mom going undercover as a 12-year-old on the app, and see how quickly child predators get in contact with her.
Looking to the future
Don’t get us wrong—we love Instagram and use it every single day to spread the word on the harmful effects of porn. (Follow us! @fightthenewdrug.)
The ramifications of pornography exposure and consumption at such young ages might contribute to future generations having homogenized and often dangerous sexual tastes and preferences, as well as a perpetuation of negative gender stereotypes and toxic masculinity. When the harms of pornography and the access predators have to underage kids are examined in full, it becomes obvious that we should care about what’s going on on the world’s most popular social app.
Though Instagram doesn’t appear to be aggressively tackling these issues currently, we can take steps to protect ourselves and others from the harms of pornography and predators. Getting educated and talking to friends, family, other parents, and kids will lead to a world that is more prepared to join the fight against pornography and choose love instead.