Facebook warns users who 'interacted' with COVID-19 misinformation
It will direct you to reliable information from the World Health Organization.
As part of its ongoing fight against COVID-19 misinformation, Facebook will begin alerting users who have liked, reacted to or commented on misinformation that has since been removed. The alerts will appear in News Feed and will link to COVID-19 myths debunked by the World Health Organization (WHO). You can expect to see them in the coming weeks.
Facebook has taken similar steps in the past. A couple of years ago, it began notifying users who had liked or followed bogus pages created by the Internet Research Agency -- the Russian troll farm responsible for meddling in the 2016 election. Facebook has also urged users who search for vaccine-related content to visit credible sources, like the WHO and CDC.
Last year, Facebook announced a three-part plan for tackling “problematic” content: remove, reduce, inform. Facebook has already taken steps to remove and reduce COVID-19 misinformation, and these alerts fulfill the “inform” part of the plan. Facebook will also bring its “Get the Facts” section, part of its COVID-19 information center, to Facebook News in the US.
In a blog post, Facebook said that in March it displayed warnings on about 40 million COVID-19-related posts, based on approximately 4,000 articles flagged by its fact-checking partners. According to the company, when users saw those warning labels, 95 percent of the time they did not go on to view the content.