Following criticism that the platform was being abused by creators uploading inappropriate content, YouTube is set to increase its moderation staff to over 10,000.
This number includes not only video reviewers but also engineers, lawyers, and operations teams. On top of this, YouTube has invested in new machine learning technology to work alongside human moderators, helping to identify and take down videos and comments that violate the platform’s policies.
In June, YouTube deployed this technology to identify and flag violent extremist content for human review. In a statement, YouTube CEO Susan Wojcicki said the company had removed over 150,000 videos for violent extremism. She also claimed the technology has allowed YouTube to take down nearly 70% of violent extremist content within eight hours of upload.
This action comes after the platform was heavily criticised online for failing to protect children from inappropriate content and for allowing extremist content to be shared.
In the statement, YouTube also shared a new approach to advertising on its platform: stricter criteria will be applied when deciding which channels and videos are eligible for advertising. YouTube claims this serves the best interests of both advertisers and creators, giving advertisers peace of mind that their adverts will run alongside appropriate content, and giving creators confidence that their revenue won’t suffer because of other creators abusing the site.