In a piece for UK newspaper The Telegraph, the CEO revealed that Instagram is also further investing in "engineers and trained content reviewers" who will make such content harder to find. The platform already blocks images of cutting from appearing in search, hashtags and account recommendations. Mosseri added that Instagram will "better support people who post images indicating they might be struggling with self-harm or suicide."
In its community guidelines, Instagram states that it removes posts that encourage "people to embrace self-injury." But, according to the BBC, Instagram has previously said that it doesn't automatically delete distressing content because it's been advised by experts that allowing users to share their stories with its wider community can help them in the recovery process.
In addition, Instagram's efforts to protect users from abuse have included the ability to disable comments, introduced in 2016. That same year, it rolled out suicide prevention tools that spanned reporting options for self-harm posts and pop-ups containing support info for related hashtags. Its parent company Facebook also began rolling out its AI suicide prevention tools globally in November 2017.
For those in crisis and in need of immediate help, please visit the National Suicide Prevention Lifeline or call 1-800-273-8255. UK users can visit the Samaritans website or call 116 123.