Facebook’s ‘privacy-focused’ plan is another diversion

Creating more private echo chambers may not be such a good idea.

When Mark Zuckerberg took the stage at F8 2019, he once again outlined the company's new "privacy-focused" vision. It's a message he's been spreading over the past few months, and it centers on six key principles: encryption, interoperability, ephemerality, safety, secure data storage and private interactions. While Zuckerberg went all in on how Facebook-owned apps will soon work seamlessly together, and how private conversations will play a key role, he seemed unaware that the new plan could create problems of its own.

Zuckerberg told the audience that people want to communicate freely in private because they can be themselves, but he should know that's not always a good thing. After all, it's features like Groups -- which Facebook is now putting front and center -- that have created echo chambers where toxic communities thrive. Perhaps the most concerning part of Facebook's "the future is private" strategy is that it hangs on a "community review process with fairness in mind," which relies on moderators to flag abusive and harmful content in Groups. That, of course, includes misinformation, hate speech, nudity, bullying, harassment and violent posts -- which the company is also trying to combat with artificial intelligence.

Facebook says there are now more than 400 million people who belong to a "meaningful" group on the site, with Zuckerberg noting that the idea is to make such groups "as central as friends." The problem with relying on moderators to police content, however, is that Facebook is essentially offloading flagging responsibilities onto third parties -- in this case, users who may not have the best intentions and who may themselves be looking to create trouble.

While there's no doubt that there are positive communities on Facebook, it's well-documented how Groups have been exploited by those looking to spread propaganda, fake news and harassment. Yes, private conversations can be great for users who want to feel safe in an online group, but they can also be used to create toxic echo chambers -- the same kind that Facebook has to take down regularly for "coordinated inauthentic behavior." With the company's new privacy-focused vision, harmful Groups may be harder to trace -- not just by Facebook, but by governments, law enforcement, legal experts, researchers and the media.

This could set a dangerous precedent for Facebook, to say the least, at a time when it's still trying to clean up its platform and salvage its damaged reputation.

Beyond focusing on more private groups and relying on moderators, Zuckerberg points to WhatsApp as an example of how Facebook will rework its family of apps into private and encrypted services. The problem, however, is that WhatsApp is far from perfect. In 2018, the spread of misinformation on the app was so bad that it was blamed for inciting lynchings in India. And that wasn't the only time WhatsApp was connected to violence in Asia. In Myanmar, pervasive hate speech and hoaxes have led to serious issues across the country, which Facebook is still trying to control.

"I know that we don't exactly have the strongest reputation on privacy right now, to put it lightly," Zuckerberg said jokingly, with a giant smile on his face. "But I'm committed to doing this well." Still, he didn't address how his privacy-focused vision is going to solve the problems that have led to a lack of trust and calls for tougher regulation. "It seems like another Facebook diversion," Nancy Kim, a professor of law and internet studies at the California Western School of Law, said about Facebook's newfound strategy. "It's intended to placate the public and probably their own employees."

Kim said that although Zuckerberg's plan may exacerbate toxic content from private groups, her biggest concern is that Facebook continues to lure people into relying more and more on its services -- helped by the fact that it owns some of the biggest social networks in the world. "[Zuckerberg] is trying to lure more people into disclosing more [private information]," she said. "That's going to make them more vulnerable. When you make yourself more vulnerable, you have more data out there. You're easier to manipulate and you're easier to exploit."

Asha Sharma, head of consumer product at Facebook Messenger, told Engadget that the company needs to be responsible about its approach to ensure that, no matter how private a community may be, it remains safe for everyone on Facebook. "We are working with all of the safety teams across the company, including Messenger, to make sure we do this the right way," she said. "We also know that the community wants to be able to communicate privately. We're going to build products in order to do that responsibly." Sharma pointed to Facebook's detailed efforts to manage problematic content, which are meant to "remove, reduce and inform," noting that the plan is to do a "bunch more to help prevent" these issues.

During Facebook's latest earnings call, Zuckerberg said that his privacy-focused vision is going to be a central focus for the company for "the next five years or longer." He said there are "a lot of open questions and real tradeoffs on important social issues," which is why he's "committed to working openly on this and consulting with experts and governments as we go."

But, if he really wants people to believe him, he's going to have to be clearer about how exactly he plans to tackle the problems Facebook has had to deal with in recent years, particularly around Groups. Because his word simply isn't enough anymore -- especially if he keeps making jokes about users' privacy, as he did this week at his flagship developers conference. Too soon, Zuckerberg. Too soon.