YouTube is planning to add more human moderators to its content review team and to expand its use of machine learning to cut down on content that violates the video-sharing site's policies. In a blog post published Monday evening, YouTube CEO Susan Wojcicki said that Google will grow the number of content moderators and other employees who review content and train its algorithms to more than 10,000 in 2018.
YouTube has faced increased criticism over how it handles violent extremist content, among other things. Wojcicki said the company is taking the lessons learned over the past year and expanding its team to build a better review process. "Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube," Wojcicki wrote in her blog post.
Advertisers began boycotting the Google-owned video-sharing site after videos featuring children were found to be the target of sexually inappropriate comments, allegedly from child predators. In response, YouTube terminated hundreds of accounts, removed over 150,000 videos from its platform, and turned off comments on more than 625,000 videos.
Aside from adding more members to its content review team, the company will also focus on training its machine-learning algorithms to help human reviewers identify and remove content and comments that violate the site's rules.
Thanks to machine learning, YouTube's team is now removing five times more videos than it used to, Wojcicki said. She also said that YouTube is taking a "new approach" to advertising on the site, reassessing which channels and videos should be eligible for ads. "We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should," she wrote.