While the whole world is buzzing about Apple’s release of the iPhone 7 this September 2016 and all of its incredible features, the new operating system rolled out this past week as well. iPhone users who update to iOS 10 can now enjoy many new features, including a redesigned iMessage platform that supports new emoji, GIFs, and reactions to text messages.
One of the most highly anticipated features of iOS 10 is a new GIF keyboard that lets you post GIFs directly into iMessage, pulling them in from a variety of outside sources.
But as with any instance of pulling images from third-party sources, there are bound to be a few inappropriate ones that slip through the cracks, and iOS 10’s GIF search is no different. Somehow, hardcore porn clips started popping up in people’s conversations.
As first reported by Deadspin, typing a certain word into the GIF search led to a sexually suggestive cartoon image. Deadspin says Apple immediately fixed that issue, but the iOS 10 porn problem persisted. Later, users found that searching another, seemingly innocent word in the GIF keyboard returned an extremely NSFW image of a hardcore sex act.
One woman told The Verge that the porn problem led to an embarrassing situation when her eight-year-old daughter was trying to send a message to her dad. The girl was presented with “a very explicit image,” and the mother “grabbed the phone from her immediately.” “I typed in the word, which isn’t sexual in any nature. It’s just a word, not like butt or anything else,” she added.
The mother says her daughter is fine — “she had no idea” — but she’s concerned about the possibility of other kids being accidentally exposed to porn through what’s supposed to be a goofy feature. “My daughter uses it because there’s cartoons and fart jokes, that kind of stuff,” she told The Verge. “That’s hardcore porn. People making out she might see on ABC. That’s something that could potentially be pretty traumatizing for a small child.”
These instances are bad news for Apple, which has long been particularly strict about sexual content. Apple co-founder Steve Jobs said in a 2010 email exchange with a customer that he believed he had a “moral responsibility” to keep pornographic content off Apple products. He famously wrote, “Folks who want porn can buy an Android.”
Jobs also defended his stance against a critique from a magazine writer who objected to an Apple commercial calling the iPad a revolution while the company banned porn in the App Store. “Revolutions are about freedom,” the journalist wrote. Jobs responded that Apple products offer users freedom from porn, and told the writer that he might care more about the issue once he had children.
Apple’s App Store guidelines are very clear about pornography: “Apps containing pornographic material, defined by Webster’s Dictionary as ‘explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings,’ will be rejected.” The guidelines add that apps containing user-generated content that is frequently pornographic (e.g., “Chat Roulette”-style apps) will also be rejected.
Apple has yet to publicly respond to the iOS 10 porn search issue, but it did work quickly to solve the problem, fixing the first search term within 10 hours and the more explicit one soon after.
Normalize not watching porn
In a society where porn has become so normalized and mainstream, it is easy for hardcore porn to pop up when we are least expecting it. This may seem harmless or even funny to those who don’t realize the harmful effects of porn, but the long-term effects of exposure to hardcore porn, especially on children, can be very damaging. The majority of people who message us for help with their struggle with pornography tell us a story of how they were first exposed between the ages of 8 and 12. It is usually by accident, and it is almost always traumatizing.
Shocking images like these usually draw a young mind back for more, and down a road that can turn into a lifelong addiction. While Apple is not entirely at fault for third-party images, this issue points to a much larger problem in our society that we need to speak up about.