Earlier today, Recode's Kara Swisher released an extensive interview with Facebook CEO Mark Zuckerberg covering the platform's struggles during a long, scandal-ridden year. Nestled inside was an exchange where Swisher pressed the executive on why Facebook allows some conspiracy theorists to post on the platform regardless of the truth of their statements, and he explicitly argued that these users, including Holocaust deniers, deserve a voice. This predictably kicked up a ruckus online, and Zuckerberg emailed a clarification to Recode reaffirming that he finds Holocaust denial "deeply offensive" and didn't intend to defend deniers. But he did restate Facebook's goal: not to stop people from posting fake news, but to prevent it from spreading.
Here's the full email, per Recode:
I enjoyed our conversation yesterday, but there's one thing I want to clear up. I personally find Holocaust denial deeply offensive, and I absolutely didn't intend to defend the intent of people who deny that.
Our goal with fake news is not to prevent anyone from saying something untrue — but to stop fake news and misinformation spreading across our services. If something is spreading and is rated false by fact checkers, it would lose the vast majority of its distribution in News Feed. And of course if a post crossed the line into advocating for violence or hate against a particular group, it would be removed. These issues are very challenging but I believe that often the best way to fight offensive bad speech is with good speech.
I look forward to catching up again soon.
In the original interview, Swisher asked Zuckerberg why Facebook allows users like those who promote Sandy Hook denial to keep spreading their content while it cracks down on fake news in Myanmar and Sri Lanka. "The principles that we have on what we remove from the service are: If it's going to result in real harm, real physical harm, or if you're attacking individuals, then that content shouldn't be on the platform," Zuckerberg said.
While Facebook took action when misinformation that spread on its platform led to violence in Myanmar and Sri Lanka, the social media company isn't taking as aggressive a stance on fake news elsewhere. And the company remains notably opaque about when content deserves to be banned. In a Congressional hearing yesterday, Facebook's head of global policy management Monika Bickert was shown multiple pages that remained on the platform even after their violent posts had been removed, and was asked why the pages themselves weren't taken down. "If they posted sufficient content that violated our threshold, that page would come down. That threshold varies, depending on the severity of different types of violations," she said.