
YouTube moderators say the site goes easy on its big stars

It's one rule for us, another for PewDiePie.

YouTube just can't seem to get out of the hole it has dug itself into. According to a Washington Post report, the video giant has let top video creators off more lightly for problematic content than those who bring in fewer views. The publication spoke with 11 current and former moderators for the platform, who have worked on teams that make decisions around this content, and they said that popular accounts "often get special treatment in the form of looser interpretations of YouTube's guidelines prohibiting demeaning speech, bullying and other forms of graphic content."

According to the Post's sources, YouTube "made exceptions for popular creators including Logan Paul, Steven Crowder and PewDiePie." The publication reported that YouTube has denied those claims, "saying it enforces rules equally and tries to draw the line in the right places."

YouTube makes money from ads on high-traffic videos, and shares the revenue with their makers. When a creator breaks a rule, they might have their ads removed from their channels or uploads. Their videos could also get removed entirely. But, according to the Post report, unlike at Facebook and Twitter, YouTube's moderators don't all have the ability to delete content themselves. They have to report to "higher-ups who make the ultimate decision" as to whether a video or channel is safe to run ads on.

The moderators who spoke to the Post said "their recommendations to strip advertising from videos that violate the site's rules were frequently overruled by higher-ups within YouTube when the videos involved higher profile content creators who draw more advertising."


They also added that many of the rules are "ineffective and contradictory" anyway, characterizing them as "ad hoc decisions, constantly shifting policies and a widespread perception of arbitrary standards when it came to offensive content."

In response to the Post, a YouTube spokesperson said the company has two sets of standards for conduct, with stricter rules for "creators who can benefit from advertising on their videos because they are effectively in business with YouTube." Meanwhile, general community guidelines are somewhat looser, and the division of policies for moderators is meant to make their work more specialized and efficient.

YouTube has come under fire lately for its apparent inability to limit problematic content on its platform. The company has tried, with efforts like manually reviewing a million videos for violations of its terrorism policy, banning dangerous challenges and pranks, and tweaking its recommendation algorithms to stop suggesting conspiracy videos and to surface "quality family content." In the third quarter of 2018 alone, YouTube removed 58 million videos that violated its policies.

But it hasn't been enough. Google's own employees petitioned to ban the company from SF Pride earlier this year, believing that YouTube was not doing enough to protect the LGBTQ community from abusive content. YouTubers in Europe also moved to unionize in an effort to make the company "more transparent about its rules and decisions, especially in regard to monetization or demonetization of videos."

YouTube spokesperson Alex Joseph told Engadget that "Over the last several years we've made significant investments in the people, tools, and technology needed to live up to our responsibility." These investments include "machine learning to detect bad content at scale or expanding the teams working to combat violative content to over 10,000 people across Google," Joseph added. YouTube also conducts "a systematic review of our policies to make sure we're drawing the line in the right place."

Joseph stressed that "We apply these policies consistently, regardless of who a creator is."

Whether or not the accounts from the 11 moderators in the Post report paint a fair and accurate picture of the goings-on at YouTube, there is certainly room to question the video network's practices. It's clearly time for the platform to take responsibility and be fully transparent about what it has been doing.

Update (1:54 PM ET, August 9th, 2019): This article was updated to include a statement YouTube provided to Engadget.