Following criticism over inappropriate content uploaded to the platform, YouTube is set to increase its moderation staff to over 10,000.
This number includes not only video reviewers but also engineers, lawyers, and operations teams. On top of this, YouTube has also invested in new machine learning technology to work alongside human moderators, helping to identify and take down videos and comments that violate the platform's policies.
YouTube deployed this technology in June to identify and flag violent extremist content for human review. In a statement, Susan Wojcicki, CEO of YouTube, said the platform has removed over 150,000 videos for violent extremism, and that the technology has allowed it to take down nearly 70% of violent extremist content within eight hours of upload.
Sharing an important update about how we're expanding our work against abuse of @YouTube: https://t.co/DNbq8L5ueU
— Susan Wojcicki (@SusanWojcicki) December 5, 2017
This statement comes after the platform was heavily criticised online for failing to protect children from inappropriate content and allowing extremist content to be shared.
In the statement, YouTube also shared that it will take a new approach to advertising, applying stricter criteria when deciding which channels and videos are eligible for adverts. YouTube says this is in the best interests of both advertisers and creators: advertisers gain peace of mind that their adverts will run alongside appropriate content, while creators can be confident that their revenue will not suffer as a result of other creators abusing the site.
Want more?
Read about YouTube addressing its child exploitation problem, or the rival Vidme service shutting down.