Several pension and investment funds that own Meta stock have just filed a lawsuit against Mark Zuckerberg and other Meta Platforms executives and directors, accusing them of failing to do enough to stop sex trafficking and child sexual exploitation on Facebook and Instagram.
The lawsuit alleges that Meta has turned a blind eye to “systemic evidence” of criminal activity, and that “the only logical inference is that the board has consciously decided to permit Meta’s platforms to promote and facilitate sex/human trafficking.”
Meta unequivocally rejected the lawsuit, saying, “We prohibit human exploitation and child sexual exploitation in no uncertain terms,” and “the claims in this lawsuit mischaracterize our efforts to combat this type of activity. We aim to prevent people who seek to exploit others from using our platform.”
While Meta’s public position for years has been that child exploitation is “one of the most serious threats that we focus on,” the company has continued to face accusations that its platforms are rife with sexual misconduct.
Let’s get into the details.
Meta’s history with human trafficking
In October of 2021, the “Facebook Files” were released as part of a larger report and investigation led by the Wall Street Journal (WSJ).
Reports, drawing on numerous internal company documents, suggest that the company hasn’t responded adequately to employee concerns that trafficking organizations were using its platforms to attract, advertise, and sell women and children.
Employees state that some illegal groups or pages they flag get taken down while dozens remain active. Or, in some cases, only content tied to the group is removed instead of the illicit group or organization itself.
A number of suits, in addition to the one just filed, seem to support the pension and investment funds’ accusations. A 2019 complaint filed with the Securities and Exchange Commission (SEC) alleged that Facebook and Instagram were aware that individuals were using the platforms to “promote human trafficking and domestic servitude.” And in a 2021 case, the Texas Supreme Court ruled that Meta was not a “lawless no-man’s-land” immune from liability for human trafficking.
The stats support it too. According to the Human Trafficking Institute, in 2020, 79% of underage victims in active criminal sex trafficking cases who were recruited online were recruited through Meta’s platforms.
One of social media’s defining characteristics is that it allows its users to connect and share with others, so it’s not a surprise that platforms are used for illicit and exploitative purposes, like human trafficking. Moreover, it is a fact that traffickers often take advantage of social media, using this tool as a way to locate, groom, and ultimately sell victims.
So, what are companies like Meta doing to fix this issue?
The continuous challenge of moderation
The task of content moderation has always been a challenge. However, it has gotten more difficult with easier access to the internet, the rise of user-uploaded content, free content downloads, and real-time publishing. For example, when you make a post on Instagram, it goes live for your followers the second you tap “post.”
This issue is then magnified by the sheer number of users posting and downloading content, all of which companies must pay moderators to review.
If there are 694 Instagram posts per second, then even with 694 moderators, each post would still take time to review and flag. By the time those moderators finished reviewing and flagging their posts, another 20,000 or so would have been published. Sure, these companies may use some automated tech to catch obviously flaggable material, but human moderators have always needed to have the last say.
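The arithmetic above can be sketched as a quick back-of-the-envelope calculation. The ~694 posts-per-second figure comes from the article; the 30-second review time per post is an assumption made for this sketch, not a figure from the article:

```python
# Back-of-the-envelope content-moderation math.
# POSTS_PER_SECOND is from the article; REVIEW_SECONDS is an
# assumed average review time per post for illustration only.

POSTS_PER_SECOND = 694   # new Instagram posts arriving each second
MODERATORS = 694         # one moderator per post-per-second
REVIEW_SECONDS = 30      # assumed time to review and flag one post

# While each of the 694 moderators reviews a single post,
# this many new posts arrive platform-wide:
backlog = POSTS_PER_SECOND * REVIEW_SECONDS
print(backlog)  # 20820 -- roughly the "20,000 or so" in the text
```

Under these assumptions, moderation can never catch up by scaling headcount alone, which is why platforms lean on automated filtering for the obvious cases and reserve human review for the rest.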
What’s porn got to do with Meta’s ties to trafficking?
Sex trafficking affects about 5 million people globally. Because sex trafficking is defined as a situation in which “a commercial sex act is induced by force, fraud, or coercion, or in which the person induced to perform such act has not attained 18 years of age,” porn and trafficking are frequently inextricably tied.
As an example, if a porn performer shows up on set to discover that the scene is much more aggressive or degrading than they were told, and their agent threatens to cancel their other bookings if the performer doesn’t go through with it, that’s trafficking.
We have real quotes from porn performers supporting this:
- “I was threatened that if I did not do the scene, I was going to get sued for lots of money.”
- “I tried to stop the scene, but [the director] told me I was ruining the flow and to just put my head back in the frame.”
- “He told me that I had to do it, and if I can’t, he would charge me, and I would lose any other bookings I had because I would make his agency look bad.”
Moreover, it’s no secret that top porn sites are loaded with nonconsensual imagery and child sexual abuse material. For more details, see the New York Times opinion column by Pulitzer Prize-winning journalist Nicholas Kristof, which eventually led Visa and Mastercard to stop processing payments for mega porn site Pornhub and led Pornhub to pull millions of unverified videos.
What can we do?
As social media platforms deal with the challenge of preventing traffickers from using their sites, each of us can play our part, too, to stop the demand for sexual exploitation.
Stay informed, and find ways to get involved in combating human trafficking in your everyday life; click here for practical ways to start.
Support this resource
Thanks for taking the time to read through this article! As a 501(c)(3) nonprofit, we're able to create resources like this through the support of people like you. Will you help keep our educational resources free as we produce resources that raise awareness on the harms of porn and sexual exploitation?