Huge quantities of unsuitable content are available for young children to view on YouTube.
Tech-focused writer James Bridle wrote a Medium post on 6 November documenting how YouTube’s algorithmic recommendation system directs massive numbers of viewers to content engineered precisely to fit the algorithm’s requirements and reap ad revenue – with some unsavoury results.
He explained: “Someone or something, or some combination of people and things, is using YouTube to systematically frighten, traumatise and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level.”
Bridle cited content such as pirated versions of the popular Peppa Pig children’s show. In one such episode, titled Peppa Pig Crying at the Dentist Doctor Pull Teeth, “she is basically tortured, before turning into a series of Iron Man robots and performing the Learn Colours dance”.
YouTube Kids app under scrutiny
A report in The New York Times also examined the issue as it affects YouTube’s official children’s app, YouTube Kids, with a parent describing an incident in which their three-year-old watched a pirated version of the popular television show Paw Patrol.
“Isaac was watching a video featuring crude renderings of the characters from Paw Patrol, a Nickelodeon show that is popular among preschoolers, screaming in a car. The vehicle hurtled into a light pole and burst into flames.”
Bridle explained that the targeting of pre-schoolers with this content would traumatise and disturb them, describing networks such as YouTube as “extremely vulnerable to this kind of abuse”.
He noted that although many of the more disturbing videos were not available on YouTube Kids, some of the major children’s channels are still producing concerning content. One such example is the Toy Freaks channel, which depicts “activities which many, many viewers feel border on abuse and exploitation, if not cross the line entirely, including videos of the children vomiting and in pain”.
Bridle also expressed his concern at the impossibility of determining the degree of automation at work in the creation of these videos.
Algorithms no substitute for human intervention
Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, argued that videos slipping through the cracks of the YouTube Kids app show the dangers of today’s media landscape for young people.
“Algorithms are not a substitute for human intervention and, when it comes to creating a safe environment for children, you need humans.”
In a statement, YouTube said that its main site was explicitly for those aged 13 and up, and that it had updated its advertising rules “to clearly indicate that videos depicting family entertainment characters engaged in inappropriate behaviour are not eligible for advertising on YouTube”. It hopes that demonetising these kinds of videos will reduce the incentive to make more.
Spokespeople for Disney and Nickelodeon both said they were working with YouTube Kids “to more effectively and proactively prevent this type of situation from occurring”.
More oversight needed
Polygon’s Julia Alexander wrote that the problem boils down to the same one faced by Twitter and Facebook: a lack of content moderation and oversight.
“They don’t want to be responsible for the dark forces using their platforms to spread harmful messages or disturbing content. They’re also too big to be able to vet every single tweet, Facebook post or video.”
Bridle warned: “What concerns me is that this is just one aspect of a kind of infrastructural violence being done to all of us, all of the time, and we’re still struggling to find a way to even talk about it, to describe its mechanisms and its actions and its effects.”
All reports point to the work that YouTube and other tech giants must undertake to address the problems that their own systems have helped to create, and continue to foster.