This Is the Disturbing Content That’s Slipping Through YouTube’s Kid Filters and Showing up on Children’s Screens


A recent article in The New York Times has alerted parents everywhere to a problem many didn’t know existed. Inappropriate videos, some deeply disturbing, are slipping through content filters and showing up on kids’ tablets in the YouTube Kids app, the very app parents trust to keep that kind of content out.

Waiting rooms at the pediatrician’s office, long-distance drives, and rainy afternoons are a little easier when we can hand over a tablet full of games and apps to entertain our little ones. The makers of these apps have always said that their use should be under close parental supervision, but let’s be real: if a kid is sitting quietly with a tablet, we can finally do something else for a few minutes. Most of the time, we’re not sitting beside our children, watching inane videos of people opening toys or playing video games.

There’s no question that kids watch and love inane videos. Parents don’t have to understand the appeal to appreciate those rare minutes of quiet, occupied children. Author’s aside: I have heard Stampy’s voice in the background for years, but I’ve never actually seen one of his videos. So, yeah, we’re not sitting beside our children watching this stuff with them. And that’s exactly why there’s a problem.

It turns out that the algorithms for determining what a video actually contains are extremely complicated and far from foolproof, and that lets some disturbing stuff slip through the cracks. Videos that feature beloved children’s characters in disturbing, violent, and even sexual situations are mixed in with the millions of perfectly fine videos on the platform. The disturbing videos range from merely odd to deeply twisted.

In this video, which was recently removed from YouTube, containers of bleach are added to an otherwise normal scene from Peppa Pig (a popular British children’s cartoon). A deep, computerized voice interjects comments as the scene plays out, like, “It was a boring ass day…”

Another video, removed within the last couple of days, features a creepy flasher who introduces Peppa to bacon. She eventually eats her father and is shown at the end peeling her own arm with a vegetable peeler to get more bacon. One could argue that these videos are satire, but for young children they can be extremely disturbing. Michael Rich, a professor of pediatrics at Harvard Medical School and director of the Center on Media and Child Health, explained why videos like these hit children especially hard: “It’s just made that much more upsetting by the fact that characters they thought they knew and trusted are behaving in these ways,” he said.

Complicated algorithms determine whether a video uploaded to YouTube is appropriate for YouTube Kids. According to Malik Ducard, YouTube’s global head of family and learning content, the videos are monitored continually by computers. He describes the ongoing monitoring as “multilayered” and involving “a lot of machine learning.”
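For readers curious what “multilayered” filtering might look like in practice, here is a minimal, purely hypothetical sketch. It is not YouTube’s actual system: the keyword list, the stubbed model score, and the threshold are all invented for illustration, and a real pipeline would be vastly more sophisticated.

```python
# Purely illustrative toy, NOT YouTube's actual system.
# Sketches the idea of a "multilayered" filter that combines a simple
# rule-based check with a (stubbed) machine-learning score.

FLAGGED_KEYWORDS = {"bleach", "blood", "gun", "knife"}  # invented example list


def keyword_layer(title: str, description: str) -> bool:
    """Rule-based layer: flag if any blocklisted word appears in the metadata."""
    text = f"{title} {description}".lower()
    return any(word in text for word in FLAGGED_KEYWORDS)


def model_layer(video_frames) -> float:
    """Stand-in for an ML model scoring visual content from 0 (fine) to 1 (disturbing).

    A real system would run a trained classifier here; this stub just
    returns a fixed borderline score so the example is runnable.
    """
    return 0.4


def is_ok_for_kids_app(title: str, description: str, video_frames=None,
                       model_threshold: float = 0.7) -> bool:
    """Combine the layers: any single layer can block a video."""
    if keyword_layer(title, description):
        return False
    if model_layer(video_frames) >= model_threshold:
        return False
    return True


if __name__ == "__main__":
    print(is_ok_for_kids_app("Peppa Pig visits the dentist", "fun episode"))  # True
    print(is_ok_for_kids_app("Peppa Pig drinks BLEACH", "prank gone wrong"))  # False
```

Even in this toy version, the weakness is easy to see: anything the keyword list and the model both miss sails straight through, which is roughly what the Times article describes happening at scale.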

If a parent happens upon an inappropriate video, like this blood-splattered version of Peppa Pig going to the dentist, and reports it to YouTube, an actual person manually reviews it and removes it if it is deemed offensive. Ducard said that “less than .005 percent” of the videos viewed on YouTube Kids in the last 30 days were removed for being inappropriate.


In its advertising, YouTube explains: “We’ve worked hard to make our app family-friendly, including the addition of new features to help manage your kids’ online experience.” The app provides parental controls, such as the ability to block specific videos or creators and to turn off the search feature, which limits the videos a young user might stumble upon.


The theory is that at least some of these videos weren’t made by people at all, but were randomly churned out by computers mixing together popular characters, random graphics, and music. It’s sort of like the Infinite Monkey Theorem, you know the one: a monkey randomly hitting keys on a typewriter, given an infinite amount of time, would eventually produce the complete works of Shakespeare. But here it’s computers instead of monkeys, it takes much less time, and instead of Shakespeare you get, in one case, animated psycho-clowns torturing Spider-Man, The Hulk, Elsa, and some purple Spider-Girl character. I don’t recommend watching this video, but if you want your eyes and ears to bleed, here it is: seriously, don’t watch this.
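To make the “computers mixing things together” idea concrete, here is a tiny, purely hypothetical sketch. It is not taken from any real video-generation pipeline: the character and scenario lists are invented, and it only spits out throwaway titles rather than actual video. But it shows how a few short lists of popular elements can be recombined into a practically endless stream of “content.”

```python
# Purely hypothetical illustration; not from any real video pipeline.
# Randomly recombines popular characters, situations, and styles into
# video "concepts" to show how easily such content can be mass-produced.
import random

CHARACTERS = ["Peppa Pig", "Spider-Man", "Elsa", "The Hulk"]
SITUATIONS = ["goes to the dentist", "learns colors", "opens surprise eggs", "meets a clown"]
STYLES = ["funny", "scary", "prank", "educational"]


def random_concept(rng: random.Random) -> str:
    """Mash one entry from each list into a clickable-looking video title."""
    return f"{rng.choice(CHARACTERS)} {rng.choice(SITUATIONS)} ({rng.choice(STYLES)})"


if __name__ == "__main__":
    rng = random.Random(0)  # fixed seed so the example output is repeatable
    for _ in range(5):
        print(random_concept(rng))
    # Just these three short lists already allow 4 * 4 * 4 = 64 combinations;
    # longer lists, plus automated animation and narration, scale that up fast.
    print(len(CHARACTERS) * len(SITUATIONS) * len(STYLES), "possible combinations")
```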

YouTube Kids is working on improving its video approval systems, and it encourages parents to report any offensive content. This disturbing content seems to be another symptom of a broader problem with the internet itself: with bad actors employing bots and other nefarious means to disrupt and disturb us, how can we protect ourselves from their influence?

In a long, detailed article on this topic for Medium, James Bridle wrote: “The system is complicit in the abuse. And right now, right here, YouTube and Google are complicit in that system. The architecture they have built to extract the maximum revenue from online video is being hacked by persons unknown to abuse children, perhaps not even deliberately, but at a massive scale. I believe they have an absolute responsibility to deal with this.”