The debate surrounding YouTube's responsibilities for the content it hosts has reignited after yet another controversy. The Google-owned video service whiffed when initially asked to discipline a creator clearly violating its policies on acceptable behavior. And, after several clarifications and shifts in policy, there are still questions about how it administers its own rules.
Why are we talking about this?
When aren't we talking about how social media seems to amplify bad actors, and refuses to punish them when called out?
Why are we talking about this right now?
Carlos Maza is the host of Vox's YouTube series Strikethrough, which covers how the media operates in the modern world. Maza has come into conflict with the figures he reports on, including those on the far right. Those individuals have often been given a platform, or enhanced their notoriety, thanks to YouTube's willingness to let anyone build a following.
Maza tweeted that he had been the subject of homophobic abuse and doxxing, which he said stemmed from one YouTuber in particular. Steven Crowder, who hosts a late-night-style talk show on the platform, has often been derogatory about Maza. That included calling the journalist a "lispy queer," "gay Mexican" and "token Vox gay atheist sprite." Maza presented the highlights reel to YouTube, and asked what action it would take.
I have spent two years getting targeted by racist and homophobic abuse from one of @YouTube's star creators.
YouTube's Community Guidelines state that the use of racist or homophobic slurs in this fashion would lead to a warning and the video's removal. YouTube also adds that users should not seek to "dehumanize" others by comparing them to non-human entities. After three warnings, a channel that continues to break YouTube's rules will be "terminated."
Hate speech is not allowed on YouTube. We remove content promoting violence or hatred against individuals or groups based on any of the following attributes:
Victims of a major violent event and their kin
If you see content that violates this policy, please report it. Instructions for reporting violations of our Community Guidelines are available here. If you have found multiple videos, comments, or a user's entire channel that you wish to report, please visit our reporting tool, where you will be able to submit a more detailed complaint.
Don't post content on YouTube if the purpose of that content is to do one or more of the following:
Encourage violence against individuals or groups based on any of the attributes noted above. We don't allow threats on YouTube, and we treat implied calls for violence as real threats. You can learn more about our policies on threats and harassment.
Incite hatred against individuals or groups based on any of the attributes noted above.
Dehumanize individuals or groups by calling them subhuman, comparing them to animals, insects, pests, disease, or any other non-human entity.
Context is key, but it's hard to see how Crowder's language wouldn't explicitly contravene YouTube's rules. And, because Maza has been the target of Crowder's ire for so long, he has received harassment from other people. This includes people publishing his phone number online and the creation of a homophobic t-shirt in his image.
Don't all online figures get rude comments?
Certainly, YouTube is not the only community that has a problem with a toxic, sometimes abusive discourse. The result of Crowder's regular videos on Maza is that the journalist was doxxed and attacked. Whatever your politics, it's clear that acts like this are less about furthering a debate and more about chilling speech. And YouTube has enabled Crowder to build a following, and earn a living, from what he does, with nearly 3.9 million subscribers to boot.
YouTube acted promptly to stop this, right?
Alas, the site has failed to manage the situation to anyone's satisfaction. After Maza posted the insults reel, YouTube posted a four-tweet response. Unnamed officials said that while the company took "allegations of harassment very seriously," the videos, while "clearly hurtful," did not violate YouTube policy. It added that while it understood that Crowder's opinions could be "deeply offensive," the clips would stay up.
Gizmodo then published a more detailed response from YouTube, which argued that, because Crowder never explicitly instructed his viewers to harass Maza, he was blameless. Instead, because the videos were responses to individual episodes of Strikethrough, YouTube classed them as part of a robust online debate, not a campaign of abuse. Plus, because Crowder hadn't published Maza's phone number online, he had no responsibility for the doxxing, either.
Last year, I got doxxed, and it scared the fuck out of me. My phone was bombarded with hundreds of texts at the exact same time. The messages? pic.twitter.com/ls4qBM9k08
Several hours later, and after a raft of negative press, YouTube tweeted an update to Maza, saying that it was amending its decision. Crowder's videos could remain up, it said, but they would now be demonetized, meaning he could no longer receive ad revenue from them. That is a defensive tactic YouTube has used before, one that often angers figures on both ends of the political spectrum. But, because the channel remains up, Crowder can still earn money from his side businesses, including a merch store that sells t-shirts with the slogan "Socialism is for f*gs." (YouTube said that links to these t-shirts are one of the things that need to be addressed to restore monetization.)
Amber Yust, a privacy engineering manager at Google, tweeted that demonetization is "not a sufficient or appropriate response for this situation." Yust added that "the platform is tacitly condoning attacking individuals for their innate characteristics."
This is not a sufficient or appropriate response for this situation. The core problem is not monetization but the fact that the platform is tacitly condoning attacking individuals for their innate characteristics.https://t.co/wltmvozNIJ
Shortly afterward, YouTube also announced that it would alter its hate speech policy to remove more topics from its platform. This includes clips glorifying Nazi ideology, as well as those "denying that well-documented violent events" like the Holocaust or the Sandy Hook shooting took place.
Surely this is an isolated incident?
YouTube's reputation for nurturing a hotbed of extremist views on the far right is hardly a secret at this point. Even Fox News published an editorial condemning the site for its ability to amplify violent crimes. Professor Zeynep Tufekci has called the site "the great radicalizer," since its algorithm pushes people towards more fringe content. More than a few studies have shown that watching perfectly anodyne clips will, after not too long, send you down an extremist rabbit hole.
How can YouTube get away with publishing and profiting from this content?
The key word here is publishing, because under US law, YouTube isn't publishing these videos, it's hosting them. That brings it under the very broad protections of Section 230 of Title 47, United States Code, passed as part of the Communications Decency Act of 1996. It was designed as a general defense: those who host material online can't generally be held accountable for its content.
In the early days of the internet, it would have been mad to hold, say, AOL or GeoCities accountable for everything they hosted. That was generally the rule, until a 1995 court case between Stratton Oakmont and Prodigy Services changed everything. Prodigy hosted a money forum where users could post their thoughts and feelings (so long as they followed the community rules). One user suggested that Stratton Oakmont was engaged in criminal activity (surprise, it was), and the firm sued. The NY Supreme Court found in favor of Stratton Oakmont: because Prodigy had terms of service and moderators, it would be considered a publisher.
A year later, the CDA was passed, removing such liabilities from hosts and ISPs who couldn't possibly monitor everything published under their watch. There were exceptions, for illegal and copyright-infringing works, but the general principle was set: hosts can't be held responsible for the content of other people's work.
Right! So let's do away with Section 230!
Hold your horses, friend, because if you do that, you're risking the very future of the internet as we know it. If S230 is killed off, then very few websites are going to allow you to publish anything. If the host is liable for all the content on its servers, then you can expect a mass purge as these companies desperately try to avoid a hurricane of lawsuits.
Worse, it's likely that we'd go from a free-ish and open-ish internet to one under direct regulation. That's something Senator Ted Cruz has already threatened Facebook with, suggesting that its S230 immunity could be removed if it isn't politically impartial. That's partly because Cruz misunderstands the law, among other things. But the end result might be that websites are censored, similar to how they are in more repressive nations.
And we've already seen what happens when the provisions of S230 are weakened by lawmakers out for blood. FOSTA/SESTA forced a number of sex work sites underground, causing harm to the people who work in the industry.
Okay, bad idea. How else can YouTube be held accountable?
YouTube has cultivated a culture where toxic and harmful conversations can take place on its site and on its dime. The fact that it pays creators (by and large) for their videos means that it has an economic relationship different from that of other social media platforms. Guardian journalist Julia Carrie Wong has said that YouTube's business model makes it more like the Uber for broadcasting than like Facebook.
In which case, the best way to hold the site accountable is if its "stars," those who generate so much revenue, take a stand. And, of course, users can make their displeasure known by not using the site, or even just canceling their YouTube Premium or YouTube TV subscriptions, citing its equivocation on hate speech. After all, without users, YouTube can't make money, and that's the real way to get these companies to do better.