"Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status," it said in a blog post. It also prohibits videos that promote violence or hatred against people or groups based on disability, nationality or immigration status as well as "victims of a major violent event and their kin."
YouTube didn't give any examples of the channels or videos that would be removed. The policy update comes as YouTube is embroiled in yet another controversy about the prevalence of hate speech on its platform. It refused to pull videos posted by a right-wing commentator accused of attacking a journalist with homophobic and racist language. YouTube has declined to comment on the record regarding that specific situation.
In January, YouTube updated its systems in the US to limit "borderline" content and conspiracy videos from appearing in recommendations, including flat-Earth claims and faux miracle cures for serious illnesses. The number of such videos appearing in recommendations has since dropped by 50 percent, YouTube says, and it plans to expand this updated recommendation system worldwide this year.
Content moderation is unquestionably a complex issue for YouTube and other social media platforms. The volume of videos uploaded to YouTube (around 500 hours' worth every minute) means it's tough for the service to hunt down all violators. Spelling out more types of videos that are no longer allowed on the platform might dissuade people from posting them, but YouTube's team will have its work cut out monitoring and removing them as quickly as possible.