YouTube's Creepy Kid Problem Was Worse Than We Thought

YouTube says that it’s removed ads from some 2 million videos and over 50,000 channels that featured disturbing content aimed at kids. Some of that content outright exploited children. And while we’ve long known that YouTube struggles to keep bad stuff off its platform, the fact that tens of thousands of channels were involved in doing bad things to children is chilling.

The public outcry over YouTube’s creepy kid videos started a few weeks ago, when several reports highlighted seemingly kid-friendly videos that depicted Disney characters in bikinis flirting with other popular children’s characters, among other generally inappropriate themes. The issue was compounded by disturbing videos of cartoon characters dealing with themes like torture and suicide popping up in the YouTube Kids app. There was also a rash of videos that showed kids being tied up, apparently hurt, or otherwise placed in exploitative situations.

YouTube quickly addressed the issue by announcing plans to age-restrict and demonetize these kinds of videos. The company went beyond that and told Vice News that it “terminated more than 270 accounts and removed over 150,000 videos” as well as “turned off comments on over 625,000 videos targeted by child predators.” Additionally, YouTube says it “removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content.”

News of YouTube’s action against millions of disturbing videos came on the heels of a separate but related controversy. Over the weekend, a number of people reported that typing “how to have” into the YouTube search bar prompted autofill suggestions that included phrases like “how to have s*x with kids” and “how to have s*x in school.” Those searches led to videos with titles like “Inappropriate Games You Played as a Kid,” featuring a provocative thumbnail of young people kissing, and “School Is Hard and So Is Your Math Teacher,” with an image of a crying girl being touched by an older man. Those videos have 23 million and 117 million views, respectively, so it’s not hard to imagine why they showed up at the top of the search results.

YouTube says it has removed these vile suggested searches, which is good. But the persistence of these kinds of problems raises larger questions about YouTube’s capacity for moderating creepy content. I’m not talking about “spookin in the wrong neighborhood” or other fun-but-weird, slightly dark videos. I’m talking about the stuff that targets kids and appeals to bad people, like pedophiles.

The problem with all of these videos is how borderline they appear to be, even to humans. And yet YouTube primarily depends on algorithms and filters to keep bad content off its platform. Videos and channels are removed by human moderators, but only after users flag them. In the meantime, you have disturbing autofill suggestions like “how to have s*x in school” showing up for everyone, along with the countless questionable videos those searches lead to. Removing all of these suggestions and videos seems like an impossible task, especially since over 400 hours of content are uploaded to YouTube every minute (that works out to roughly 65 years of footage every day).

The thing is, algorithms are inherently imperfect. Computers have a hard time finding the blurry line that separates an innocuous video from one that’s entirely inappropriate. Sure, videos that violate YouTube’s terms of service (copyrighted, gruesome, illegal, sexually explicit, or child-exploiting content) can get flagged and removed. YouTube now also uses algorithmic filters to catch some of these videos before they’re published. The system isn’t perfect, but YouTube seems committed to it.
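To see why filters miss so much, consider a minimal sketch of the kind of exact-match blocklist a naive system might use. This is purely illustrative and in no way YouTube’s actual pipeline; the blocklist, phrases, and function names here are all hypothetical:

```python
# Purely illustrative sketch of a naive exact-match blocklist filter.
# This is NOT YouTube's system; every name here is hypothetical.

BLOCKED_QUERIES = {"how to have sex with kids"}  # hypothetical blocklist entry

def naive_filter(query: str) -> bool:
    """Block a search query only if it exactly matches a blocklisted phrase."""
    return query.strip().lower() in BLOCKED_QUERIES

print(naive_filter("How to have sex with kids"))  # True: the literal phrase is caught
print(naive_filter("how to have s*x with kids"))  # False: one swapped character slips through
```

That one-character gap is the whole game: bad actors iterate on spellings faster than any blocklist grows, which is why platforms layer fuzzy matching, machine-learned classifiers, and human review on top of filters like this, and why even that stack lets borderline content through.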

It’s ultimately hard to point fingers. Is it a tech company’s fault that humans are awful, abusive, and exploitative? Of course not. But YouTube does shoulder a tremendous burden in deciding which videos to let in and which to keep out. Few are surprised when the platform fails to catch every creepy video. Yet the creepy videos are still a problem. Whether that says more about YouTube’s limitations or our own perversions remains to be seen.
