Get this: Google owns 92.19% of the search engine market share worldwide and enjoys 5.6 billion searches per day.
It uses ranking algorithms, PageRank among them, to determine the results of any given search. Essentially, these algorithms do their best to figure out what the search engine user wants to see and/or know based on the words that have been typed into the search bar. Then, they provide the user with an ordered and seemingly endless list of corresponding links.
For most of us, that’s a positive thing.
Google’s search engine literally puts the world at our fingertips. It harnesses the internet’s vast power and turns it into content that is readily available for us to do what we’d like with it.
In mere seconds, a quick search may help provide you with the upcoming delivery of a Chipotle burrito with extra guac, the ability to communicate with your British cousin by way of social media, access to information about the creation of Fortnite for that school paper that’s due tomorrow morning, among just about anything else in the world you could want or think of.
For some of us, however, that seemingly endless list of ordered search-related links is actually a nightmare.
That’s because not all content that Google’s search engine helps an internet user access is harmless—especially when it comes to revenge porn and other forms of image-based abuse.
What do we mean? Let us explain.
Search engine results that can ruin lives
In 2016, Carrie Goldberg, a Brooklyn attorney who fights for victims of online harassment, sexual assault, and blackmail, was left pleading with Google’s “Legal Removals” team to remove nonconsensual sex videos of 15 female clients, all aged 18 to 22. The videos had been shared hundreds of thousands of times on popular porn sites, in some cases with the women’s full legal names listed alongside them.
Related: Hundreds Of Women Who Agreed To Model Swimsuits Were Forced To Perform In Porn, Lawsuit Alleges
Ms. Goldberg’s clients had all responded to deceptive bikini-model ads and had been coerced into performing in porn; some of them were raped before or during the shoots.
In a day and age when a quick Google search is commonplace before hiring a new employee, going on a first date, or accepting a new roommate, it doesn’t take much to imagine how Google’s ranking algorithms might be playing a far larger role in these survivors’ lives than they should be.
In other words, searching any of the 15 women’s names on Google would surface numerous nonconsensual porn images and video links. How is this at all helpful?
Now, nonconsensually shared explicit images and videos of any kind are unacceptable and criminal enough. However, when you add the element of filmed sexual assault to the equation, unacceptable becomes horrific.
According to Ms. Goldberg, as these women’s degradation was exposed alongside their real names, the harassment that followed was unthinkable. Many were dumped, kicked out of their social organizations, fired from their jobs, forced to move, and some even had to change their names.
The case of a 20-year-old college junior who flew to San Diego for what was supposed to be a bikini shoot was no exception. This is what the young woman told Ms. Goldberg:
“I had to change my major and career choice. I lost a lot of friends, and my family wouldn’t talk to me for a while. I have had nightmares about it and dreams that it never happened or that it was somehow erased completely. It was one of the worst things I have ever done and not just because it showed up on the Internet. I wish I could erase that memory out of my mind forever. I was spit on while making the video, and I have never felt more disrespected or devalued in my entire life.”
The story of another of Ms. Goldberg’s clients was even more violent: 18-year-old “Anna” was gang-raped by three older men who filmed their exploits. It didn’t take long for the video to populate the first five pages of her search results, and Anna was soon stalked, harassed, and receiving death threats. Even after she moved and changed her name and her family’s names, the video followed her: a viewer found her new name and posted it online.
Ms. Goldberg claims that affidavits—or voluntarily given sworn statements of fact—containing her clients’ heartbreaking stories like the above weren’t enough to sway Google to remove the explicit content.
Why Google didn’t remove the revenge porn results
Google’s current policies permit no-questions-asked content removal only in cases of copyright infringement or child exploitation imagery, also known as child porn. The company may be willing to remove nonconsensual nude or sexually explicit images, but it decides on a case-by-case basis. So, sometimes it will remove the search results, according to this Google Help forum, and sometimes it won’t.
So, what do you do if the company that can determine your future fate decides not to remove the images?
Ms. Goldberg says you can’t do anything; you’re flat out of luck, because the laws protecting revenge porn survivors focus on punishing the individual who nonconsensually shares the explicit imagery, not the search engine that allows the link to remain active.
Related: Tech Companies Reported Over 45 Million Child Porn Images & Videos On Their Platforms Last Year
And, even in the few cases where Google does happen to remove the revenge porn link, porn sites have found an easy workaround: they simply change the URL.
Even when the new URL serves the same content as the original, the same images and porn titles with a client’s name in them, Google declines to suppress the new URL.
Why this matters
These women’s lives—though revenge porn doesn’t exclusively exploit women—have been seriously affected by the explicit images and videos nonconsensually posted of them on the internet for the entertainment of the masses.
Since the data shows that revenge porn survivors suffer trauma similar to that of sexual assault survivors, these women have been through it doubly.
Yet, the hundreds of thousands of views on these nonconsensual online images and videos seem to suggest a lack of awareness of the realities of revenge porn and sexual assault on the part of the consumer.
What it comes down to is this: revenge porn and sexual assault are not acceptable, and yet, these images can be indistinguishable from mainstream porn. Consumers often have no clue they’re contributing to the humiliation and degradation of someone who never chose to be sexually exploited in the first place—or, what’s more, some seek out “real” images they know were uploaded without consent for the extra thrill.
It is for the survivors, like Ms. Goldberg’s clients, that we fight. Will you join us?
If you are a victim of revenge porn, there are steps you can take to reclaim your life and images. Click here for suggestions on where to start.