TikTok is adding a feature to its in-app search that will alert users when results may include “distressing content.” The app has shown warnings on individual videos since last year, but the updated alerts will appear in search results for terms that could surface such content.
In a blog post, TikTok uses the example of “scary makeup” as a search term that may prompt such a warning. The company notes that users will be able to click through the warning to view results anyway, and that individual videos deemed “graphic or distressing” are ineligible for the app’s recommendations.
TikTok is also changing search results to provide more resources for searches related to suicide and self-harm, the company said. In addition to surfacing links to helplines like the Crisis Text Line, the app will also point users to “content from our creators where they share their personal experiences with mental well-being, information on where to seek support and advice on how to talk to loved ones about these issues.”
The app has at times struggled to deal with content related to self-harm. Last year, a video of a suicide, originally streamed on Facebook Live, spread on TikTok as the company scrambled to take down new copies. But even as users came up with workarounds to skirt TikTok’s detection, other creators posted viral clips urging users not to engage with the content. That suggests TikTok’s plan to rely on creators to share positive PSAs could be an effective strategy for the company.
In the U.S., the number for the National Suicide Prevention Lifeline is 1-800-273-8255. Crisis Text Line can be reached by texting HOME to 741741 (US), 686868 (Canada), or 85258 (UK). TikTok has published a list of resources for other countries.