Facebook has taken steps to combat porn and exploitative images on its platform (of which there is no shortage), but it also seems to be taking steps backward when it comes to proving that it's adequately addressing inappropriate content.

Earlier this week, Guardian editor Jonathan Haynes tweeted screenshots of a Facebook survey that appeared in his feed, posing a hypothetical scenario involving child exploitation imagery. The survey was apparently intended to gauge what users think about child exploitation imagery (also known as child porn) and, given recent unfortunate cases of child exploitation photos going viral on Facebook's messaging platform, what they would do if they saw such content in their own feeds.

The first question stated: “There are a wide range of topics and behaviors that appear on Facebook. In thinking about an ideal world where you could set Facebook’s policies, how would you handle the following: a private message in which an adult man asks a 14-year-old girl for sexual pictures.”

The response options were:

  • This content should be allowed on Facebook, and I would not mind seeing it
  • This content should be allowed on Facebook, but I don’t want to see it
  • This content should not be allowed on Facebook, and no one should be able to see it
  • I have no preference on this topic

(There seems to be a missing answer option, right?)

The second question: “When thinking about the rules for deciding whether a private message in which an adult man asks a 14-year-old girl for sexual pictures should or should not be allowed on Facebook, ideally who do you think should be deciding the rules?”

The response options:

  • Facebook decides the rules on its own
  • Facebook decides the rules with advice from external experts
  • External experts decide the rules and tell Facebook
  • Facebook users decide the rules by voting and tell Facebook
  • I have no preference

(Again, isn’t there a missing option?)

Yikes. This is a huge misstep for Facebook, especially given the recent hot water the company has been in for failing to stop child porn and revenge porn images from being shared on the site.

It’s unclear how long this survey had been running or how many responses it received, but what is clear is that Facebook left out a very important option: contacting law enforcement.

Related: Exposing The Serious Porn Problem On Popular Social Media Platforms

Facebook VP of Product Guy Rosen chimed in on Haynes’ tweet thread and admitted the survey was a mistake.

“We run surveys to understand how the community thinks about how we set policies. But this kind of activity is and will always be completely unacceptable on FB,” Rosen wrote. “We regularly work with authorities if identified. It should have been part of this survey. That was a mistake.”

He did not address why Facebook did not include the option to inform relevant authorities in the survey.

Related: 14-Year-Old Girl Sues Facebook For Failing To Remove Revenge Porn

“We understand this survey refers to offensive content that is already prohibited on Facebook and that we have no intention of allowing so have stopped the survey,” another Facebook spokesperson said in a statement. “We have prohibited child grooming on Facebook since our earliest days, we have no intention of changing this, and we regularly work with the police to ensure that anyone found acting in such a way is brought to justice.”

Revenge and Child Porn on Facebook

Facebook has over 2 billion active users, so it’s no surprise that a site operating at that scale runs into problems. Even so, the social media giant hasn’t had a great track record when it comes to addressing sexual exploitation.

As part of an ongoing leak of internal Facebook documents, the Guardian revealed that Facebook has faced at least one recent surge in revenge porn and sexual extortion cases: 54,000 potential cases in January of last year alone. The company ended up disabling over 14,000 accounts involved in these disputes, 33 of which involved children. It’s not clear how this compares to other periods (Facebook doesn’t divulge specific figures), but that’s no small number.

In addition, Facebook escalated 2,450 cases of potential sextortion, which it defines as attempts to extort money or additional imagery from an individual. This led to a total of 14,130 accounts being disabled. Sixteen cases were taken on by Facebook’s internal investigations teams.

And those are just the highlights of the porn problems on the world’s largest social network. Clearly, Facebook has some work to do.

Get Involved

These data show us just how much porn has taken over the internet and our online social experiences, especially for teens. It’s no secret that porn is everywhere, and now it seems to have taken hold of our timelines. Not cool.

We fight because we believe society can do better than constantly fueling the demand for this content, and we believe social media sites can do better than to let that happen. How can we fight back? It seems small, but keep reporting, keep blocking, and keep these sites accountable for their content. Together, our voices are loud.

Spark Conversations

This movement is all about changing the conversation about pornography. When you rep a tee, you can spark meaningful conversation on porn’s harms and inspire lasting change in individuals’ lives, and our world. Are you in? Check out all our styles in our online store, or click below to shop:
