Google will enlist 10,000 employees to moderate YouTube videos

The video platform will also have stricter criteria on which channels can earn from ads.

YouTube has had its hands full lately, dealing with disturbing channels and videos masquerading as family-friendly offerings. Now, YouTube chief Susan Wojcicki has explained how the platform plans to keep a closer eye on the videos it hosts going forward by applying the lessons it learned fighting violent extremist content. Wojcicki says the company has begun training its algorithms to improve child safety on the platform and to better detect hate speech. To teach those algorithms which videos need to be removed and which can stay, though, it needs more human help. That's why it aims to bring in as many as 10,000 people across Google to review content that might violate its policies.

YouTube says its machine-learning algorithms help take down 70 percent of violent extremist content within eight hours of upload. By training those algorithms to do the same for other types of videos, such as those questionable uploads that targeted children, the platform will be able to take them down a lot faster than it currently can.

In addition to enlisting the help of 10,000 Google employees, YouTube also plans to draw up stricter criteria for deciding which channels are eligible for advertising. At the moment, creators need at least 10,000 views to be able to earn ad money, but it sounds like the platform will also expand its team of reviewers to vet channels and videos and "ensure ads are only running where they should."

Finally, YouTube promises to be a lot more transparent. In 2018, it'll start publishing reports containing data on the flags it gets, along with the actions it takes to remove videos and comments that violate its policies.