TikTok is introducing a new feature to fight the spread of misinformation on its platform. The app will begin warning users before they share videos with unverified info. The update is meant to address a kind of gray area in the fact-checking process: claims that fact checkers are unable to verify.
With the update, users who try to share a video that’s been flagged as unsubstantiated by the app’s fact checkers will see a pop-up saying “this video has been flagged for unverified content.” They’ll still be able to share it if they wish, but the video won’t appear in other users’ For You pages. TikTok will also notify the person who originally posted the video that it has been flagged.
TikTok is hoping the prompt will encourage “a pause for people to consider their next move before they choose to ‘cancel’ or ‘share anyway.’” The company says early tests of the warnings reduced sharing by 24 percent. The added friction is similar to Twitter’s experiment encouraging users to read articles before sharing them (a test that company says has been successful).
In some ways, TikTok has taken a more aggressive approach to misinformation than other social media platforms. The company works with a number of third-party fact-checking organizations and removes videos they debunk. But some posts are bound to slip through the cracks, and the company has at times been forced to play catch-up, as in the aftermath of the election and the violence in Washington, DC. For its part, TikTok notes the new warnings should help it better address content that crops up “during unfolding events.”