Amid new reports and controversy, YouTube has laid out its plan for tackling child exploitation on the platform.
In recent weeks, YouTube has faced controversy and criticism for inappropriate videos featuring and targeted at children. On 6 November, James Bridle published a blog post in which he described the many “disturbing” videos targeted at children, and explained how they were slipping through the cracks of YouTube’s content filtering system. YouTube then announced plans to restrict these kinds of “creepy” videos and remove them from the YouTube Kids app.
Following this, more issues were exposed. On 22 November, BuzzFeed News published an article (warning: contains disturbing and graphic content) about the many videos, some from verified accounts, that feature children in abusive and predatory situations. Furthermore, volunteer moderators told the BBC that there are between 50,000 and 100,000 predatory accounts leaving inappropriate comments on these videos. In response to concerns that their adverts were appearing alongside abusive content, several major companies reportedly pulled their advertisements from YouTube on the eve of Black Friday.
Johanna Wright, Vice President of Product Management at YouTube, shared the platform’s plans to address these issues on YouTube’s official blog. YouTube has renewed its efforts to restrict adult videos to adults only and remove illegal and predatory content from the site altogether; since it started enforcing its Community Guidelines more strictly, it has terminated more than 270 accounts and removed over 150,000 videos from the platform, according to Vice News.
However, these were not the last issues to emerge. On 27 November, BuzzFeed released an article highlighting the fact that certain searches on the platform were autocompleting with disturbing and exploitative results. About this latest concern, a YouTube spokesperson said: “Earlier today our teams were alerted to this awful autocomplete result and we worked to quickly remove it. We are investigating this matter to determine what was behind the appearance of this autocompletion.”
In her blog post, Wright outlined various ways in which YouTube will tackle exploitative content in the future. To combat predatory comments, YouTube is enforcing its content flagging systems more strictly, turning off comments altogether on videos featuring minors where inappropriate comments appear. YouTube will also remove all advertisements from content that depicts family-friendly characters in non-child-friendly contexts. Ads have reportedly been removed from two million videos and over 50,000 channels that featured disturbing content aimed at younger viewers.
Some creators have expressed concern about how the platform changes will affect other people who get caught by the demonetisation update. Tom ‘TomSka’ Ridgewell spoke about the issue in his Last Week series, saying: “Obviously, it’s good that child predators and all this really seedy shit on YouTube is getting demonetised. The problem, of course, is that it’s getting loads of other people looped in, and loads of other people are getting demonetised for anything from being channels that are talking about sex education or bullying, or living with autism.” He suggested that YouTube should have a “white-listing” system that allows verified channels to be automatically monetised until they violate the rules.
To work proactively to prevent inappropriate content from being created, YouTube will be releasing a guide to making family content for YouTube Kids. It will also be partnering with more experts to consult on these problems, and doubling the number of Trusted Flaggers who report issues.
The NSPCC, however, is concerned that these precautions still aren’t enough, and that government action is needed to prevent further exploitative content appearing on the platform. Andy Burrows, the NSPCC’s Associate Head of Child Safety Online, said: “Reports that YouTube is failing to act on child protection issues are extremely disappointing and demonstrate exactly why we need government to step in.”
He added that “a set of rules enshrined in law” is needed with an “independent regulator to enforce those rules”.
Want more?
Read about YouTube’s work towards restricting inappropriate videos, or creators’ thoughts on how stricter monetisation rules earlier in 2017 affected their channels.
For updates follow @TenEightyUK on Twitter or like TenEighty UK on Facebook.