Posts that include racial slurs don't always violate Facebook's guidelines -- for instance, if they were written to condemn the use of those slurs. The same goes for images that feature nudity if they have historical significance or were created to raise awareness. Facebook wants to give you the choice to filter out this "borderline" content, whether to protect yourself from anything that might harm your well-being or simply because you don't want to see it in your feed. "[B]y giving people individual control, we can better balance our principles of free expression and safety for everyone," Zuckerberg wrote.
In addition, the social network plans to create a new system for content takedown appeals next year. It's apparently leaving the decision on whether to grant or deny an appeal to an independent body, to "provide assurance that these decisions are made in the best interests of [the] community and not for commercial reasons." Facebook is still figuring out how it would select members for the panel and exactly how it would work, but it's hoping to start piloting the system in early 2019.
"Over time, I believe this body will play an important role in our overall governance. Just as our board of directors is accountable to our shareholders, this body would be focused only on our community. Both are important, and I believe will help us serve everyone better over the long term."
Facebook COO Sheryl Sandberg has penned a letter in response to the NYT's exposé as well. Like Zuckerberg, she denied knowing that Facebook had hired the PR firm Definers or that the consultants used unscrupulous tactics to deflect criticism of the platform. That's despite Facebook's previous statement that its relationship with Definers "was well known by the media," since the firm had sent out press invites on Facebook's behalf in the past.