
YouTube Plans Sweeping Changes to Kids Videos After $170 Million Fine

The new policy extends to include any content with "sexual themes, violence, obscene, or other mature themes not suitable for young audiences."

September 25, 2019

Let’s rewind to December 7, 2017. In an open letter to kids and family content creators, YouTube’s Global Head of Family and Learning, Malik Ducard, said:

“Content masquerading as family-friendly on YouTube, a small amount of which has appeared in YouTube Kids, is an issue that we have been deeply focused on. Let me be clear in stating that content that deceives or harms families is absolutely unacceptable and to combat this content we needed to take significant action. We have clear policies against these videos and we enforce them aggressively.”

A little over a year and a half later, those enforcement efforts have taken shape in a new, not widely advertised policy YouTube has adopted to protect one of “the biggest drivers” of its business in recent years: kids.


What exactly is in this new policy?

Made public on the YouTube Help Community Forum, where it received only 70 replies, the new policy states that YouTube will remove any content that has “mature or violent themes” and that “explicitly targets younger minors and families” in its title, description, or associated tags. This is a big deal.

Related: “It’s Not As Free As People Think It Is”: YouTuber Chaz Smith On Why He Joined The Porn Kills Love Movement

Of course, porn wasn’t allowed under the existing policy, but the new policy extends to include any content with “sexual themes, violence, obscene, or other mature themes not suitable for young audiences,” especially content designed to mislead.

The policy goes on to list some examples of content that would be removed, such as content listed as “for kids” or with nursery rhymes that address topics like sex, violence, or death.


How is this a change?

Previously, YouTube would simply age-restrict this type of content. Age-restricted content will still exist, but to avoid possible removal, creators must make clear that a video is “meant for adults,” and they are encouraged to have “titles, descriptions, and tags match the audience” they intend to reach, especially for content like adult cartoons that could easily be confused as kid-friendly (aka that “misleading content” idea again).

However, the policy has not yet taken full effect.

Related: YouTube Star Austin Jones Sentenced To 10 Years In Prison For Soliciting Explicit Images From Underage Fans

YouTube has given content creators 30 days to get on board with the update and will not issue strikes against channels for videos created prior to the policy, but it will remove any content, old or new, that doesn’t meet the standards, effective immediately.

They’ve made the full list of new policy guidelines available on their Help Center.


What brought this on?

In the last few years, you may have heard news of YouTube struggling to keep its supposedly kid-focused content “kid-friendly.”

A range of issues has plagued the platform: popular webcamming sites being advertised on children’s videos, bizarre and twisted variations of kid-focused content involving Disney or Marvel characters, and predatory comments on videos featuring minors. The most serious critique, and the one that has landed YouTube on the Federal Trade Commission’s (FTC) case list, is its recommendation algorithm, which reportedly does not account for the nature of content and can lead children to exploitative, violent, and extremist material.

Related: What’s Up With The Bizarre, Explicit Videos Infiltrating “Kid-Friendly” YouTube Channels?

Some believe YouTube has not done enough to protect its underage users. On the one hand, the platform refused to stop recommending videos featuring children despite knowing about countless predatory comments that sexualized the children in them. It’s important to note that videos with kids perform very well on YouTube; the algorithm favors kid-related tags and readily recommends videos featuring kids, even to predatory users.

On the other hand, the video giant has attempted to better the situation by launching YouTube Kids in 2015, specifically intended as a safe zone for kids and family-friendly content. This has not been without issues, though.

Related: #YouTubeWakeUp: What You Need To Know About The Child Exploitation Crisis On YouTube

Less than two months after launch, the Campaign for a Commercial-Free Childhood, a coalition of child advocacy groups, reported YouTube to the FTC for content that was “not only…disturbing for young children to view, but potentially harmful.” There has been pressure from within the government as well for YouTube to “take the necessary steps to protect its youngest users.” Much of this pushback was due to the types of videos mentioned earlier: twisted versions of Disney or Marvel characters engaging in sexual or violent behaviors.

Despite the challenges, the last few months have seen slight improvements leading to last month’s new policy. CEO Susan Wojcicki is reportedly cracking down further on issues related to child safety, and this new policy is just one sign of that.


Will kids be safe on YouTube, now?

It is important to note that YouTube is still undecided on whether it will change the recommendation algorithms that keep leading child predators to content featuring minors. Despite promises to disable comments on certain videos and remove targeted advertising from videos featuring children, YouTube was recently fined $170 million for violating the Children’s Online Privacy Protection Act (COPPA) Rule. Tech news website CNET found that YouTube was not upholding its promise to disable comments on some videos featuring young minors. Not cool.

Related: YouTube Removes Explicit Webcam Site Ads From Kids’ Videos

In 2017, YouTube confirmed it had eliminated 150,000 videos and 270 accounts and disabled comments on well over half a million videos. Advertising was removed from almost two million videos and from 50,000 channels posing as family-friendly content.

YouTube has done some work, though calls continue for the platform to further protect children and provide appropriate content, and hopefully this newest policy is the start of a more effective strategy to keep kids safe. Only time will tell.