Facebook will ban new political ads in the week before US elections

The move is aimed at preventing voter fraud and misinformation.


Facebook is taking new measures to protect US elections. (Image: Facebook)

Facebook has announced that it will block new political ads a week before US elections, as part of a new campaign to protect voting integrity. “I’m concerned about the challenges people could face when voting,” said CEO Mark Zuckerberg in a Facebook post. “I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country.”

The idea is to stop political groups from spreading misinformation through new ad campaigns so close to elections, when there wouldn’t be enough time to fact-check them. However, candidates and political action committees (PACs) will still be able to buy ads in the final week, as long as the campaigns started before October 27th. “Those ads will already be published transparently in our Ads Library so anyone, including fact-checkers and journalists, can scrutinize them,” said Zuckerberg.

Twitter has been blocking political ads since last November, figuring that it’s too easy for groups to use them to spread misinformation. Google, meanwhile, has limited micro-targeting in ad campaigns. Facebook has continued to sell political ads, but started labeling them and noting who paid for them back in April. Zuckerberg said that the new ad ban might hurt efforts to get out the vote, but Facebook is balancing that against the potential for fraud right before the election.

Facebook is also taking other measures, including putting its Voter Information Center at the top of Facebook and Instagram feeds, showing “accurate, verified information and videos about how to vote.” It will also use the hub to let US users know that the presidential winner might not be declared on election night because of delays in counting mail-in ballots.

Zuckerberg outlined further steps in his post:

  • We’ll remove posts that claim that people will get COVID-19 if they take part in voting, and we’ll attach a link to authoritative information about the coronavirus to posts that might use COVID-19 to discourage voting.

  • We will attach an informational label to content that seeks to delegitimize the outcome of the election or discuss the legitimacy of voting methods, for example, by claiming that lawful methods of voting will lead to fraud. 

  • If any candidate or campaign tries to declare victory before the final results are in, we’ll add a label to their posts directing people to the official results from Reuters and the National Election Pool.

Zuckerberg said that Facebook will also expand its voter suppression policies by removing false information that could cause someone to miss their chance to vote. “We're now expanding this policy to include implicit misrepresentations about voting too, like ‘I hear anybody with a driver's license gets a ballot this year,’ because it might mislead you about what you need to do to get a ballot, even if that wouldn't necessarily invalidate your vote by itself,” he said.

The company will also work with election officials to identify voter suppression-related posts, starting today. Finally, Facebook is temporarily restricting the number of people you can forward links to in Messenger to five per message, a limit that took effect on August 17th. “I believe our democracy is strong enough to withstand this challenge and deliver a free and fair election — even if it takes time for every vote to be counted,” Zuckerberg wrote.
