TikTok has removed hundreds of thousands of videos for hate speech

And more than 1,300 users have been banned entirely.


TikTok is the latest social media platform to peel back the curtain on its efforts to fight hate speech. Since the beginning of the year, the company says it’s taken down more than 380,000 videos and 64,000 comments in the US for breaking its hate speech rules. TikTok has also banned more than 1,300 users as part of the policy.

The update is the first time TikTok has shared details of its content takedowns since it first added hate speech to its community guidelines in January. “While these numbers may not reflect a 100% success rate in catching every piece of hateful content or behavior, they reflect both our commitment to action and to building a community that is more positive and welcoming than on other apps,” TikTok’s Head of Safety Eric Han wrote in a blog post.

Han also noted that TikTok has taken steps to block hateful content from appearing in the app’s search results, and the company is “training our moderation teams to better understand more nuanced content like cultural appropriation and slurs.”

TikTok, already facing scrutiny over alleged ties to the Chinese government, has repeatedly come under fire for its moderation policies. The company has been criticized for censoring content critical of the Chinese government, such as when it hid a video criticizing China's treatment of Muslims (TikTok apologized for the incident). The Anti-Defamation League has also said TikTok doesn't do enough to combat white nationalism, extremism and anti-Semitism.

The company has taken a number of steps to try to assuage these concerns. It formed a Content Advisory Council earlier this year and plans to open a "transparency center" that will give outside experts more visibility into its security and moderation policies.