Suicide
Latest
Twitter restores suicide-prevention feature after briefly removing it
Twitter says it’s working on bringing back the #ThereIsHelp banner, a feature that pointed users to suicide prevention hotlines and other safety resources when searching for certain content.
Twitch clarifies its self-harm policy
Twitch has updated its self-harm policy to clarify what streamers are allowed to mention.
Meta and Snap sued by mother over alleged role in her daughter's suicide
A Connecticut mother has brought a lawsuit against Meta and Snap claiming the platforms were designed to cause the sort of addiction her late daughter suffered prior to taking her own life.
TikTok takes more action against hoaxes and dangerous challenges
Almost half of teens want more information to help them understand the risks of online challenges, a survey found.
FCC proposes text support for the National Suicide Prevention Lifeline
An FCC proposal would let people text the National Suicide Prevention Lifeline when calling isn't an option.
TikTok adds warnings to search results for 'distressing content'
TikTok is adding new warnings to its in-app search that will alert users when results may include “distressing content.”
Facebook blocks terminally ill French man from livestreaming his death
Facebook said it would block a terminally ill French man from livestreaming his death, raising questions about its broadcast policies.
FCC makes 988 the 3-digit number for the National Suicide Prevention Lifeline
FCC designates 988 as the 3-digit number for the National Suicide Prevention Lifeline.
Instagram bans drawings and memes linked to self-harm
Instagram is continuing to crack down on graphic images posted on its platform, following an outcry over the death of British teenager Molly Russell in 2017. Russell took her own life after seeing graphic suicide-related images on both Instagram and Pinterest.
Facebook says it's doing more to prevent suicide and self-harm
In recognition of World Suicide Prevention Day, Facebook shared three additional steps it's taking to prevent suicide and self-harm. On top of changes Facebook made in the past year, the company says it's hiring a health and well-being expert to join its safety policy team. Facebook plans to share its social media monitoring tool, CrowdTangle, with select academic researchers who will explore how Facebook and Instagram can further advance suicide prevention. And the company is including Orygen's #chatsafe guidelines in Facebook's Safety Center and in resources on Instagram when someone searches for suicide or self-injury content.
FCC proposes '988' for quick access to national suicide prevention line
The FCC wants accessing a national suicide prevention line to be as simple as dialing 988. In a report sent to Congress today, staff members recommend that the FCC designate 988 as the 3-digit dialing code for a nationwide suicide prevention and mental health crisis hotline.
UK university will study students' social media data to prevent suicide
A university in the UK is planning to use data analytics to help prevent student suicide. Northumbria University, and a handful of partner organizations, will collect data from students' social media accounts to create an "Early Alert Tool." If successful, it will identify students in crisis so the university can provide aid.
Instagram will hide self-harm images behind 'sensitivity screens'
Instagram will hide images that show self-harm behind "sensitivity screens," according to the platform's head, Adam Mosseri. The new feature will blur sensitive material until a user actively chooses to view it. It's all part of the platform's efforts to combat the spread of images depicting suicide or self-harm following the suicide of British teen Molly Russell. Her parents believe that Russell, 14, took her own life after seeing graphic images of self-harm on Instagram and Pinterest. Mosseri, who took over the job after the departure of Instagram co-founders Kevin Systrom and Mike Krieger last September, is also meeting with UK health secretary Matt Hancock this week.
School internet filter maker launches suicide risk detector
A company that makes internet filters and Chromebook management software for schools is launching a product today that detects when K-12 students are at risk of suicide or self-harm. GoGuardian serves about 4,000 school districts in the US, totaling about 5.3 million students, and the new product is meant to act as "an early-warning system to help schools proactively identify at-risk students to quickly get them the assistance they need."
Netflix renews controversial '13 Reasons Why' for a third season
Netflix's controversial teen drama 13 Reasons Why is returning for a third season in 2019, even though many believed the second season was unnecessary. For one thing, the first season was based on a book, and both ended at around the same point, leading some to suggest Netflix was milking the subject matter. The streaming giant confirmed the show's return with a teaser video.
Logan Paul hasn’t learned his lesson
Logan Paul, the YouTube star who came under fire recently after posting a video of a corpse, is at the center of yet another controversy. This time around, Paul is facing backlash for uploading a video in which he's seen shooting two lifeless rats with a Taser. As if that wasn't enough, in a now-deleted tweet, he joined the Tide Pods internet challenge, suggesting he'd eat one of the detergent capsules for every retweet he got. Perhaps that's just his sense of humor, but Paul should have known that everything he does from now on will be heavily scrutinized.
Logan Paul returns to YouTube with suicide prevention video
It has been a little over three weeks since YouTuber Logan Paul posted his now infamous Aokigahara forest video and aside from an apology, Paul has been largely silent on his channel. But today, Paul posted a new video, one that's quite different from his usual content.
Logan Paul may face ‘further consequences’ from YouTube
Earlier this month, YouTube star Logan Paul was hit with a wave of criticism over his decision to post a video showing a suicide victim in Japan's Aokigahara forest. Posted to his YouTube channel, the video showed him and his friends entering the forest -- well known for being a place where many choose to end their lives -- coming across a body and laughing while they made jokes and moved in for closeups. The video was removed, but many have called for YouTube to do more, both with Paul specifically and with how it manages the content that goes up on its site. The company has been fairly quiet since the incident, but today it finally released a statement.
Canada will track suicide risk through social media with AI
The Canadian government is partnering with AI firm Advanced Symbolics to try to predict rises in regional suicide risk by monitoring social media posts. Advanced Symbolics will analyze posts from 160,000 social media accounts and will look for suicide trends. The company aims to be able to predict which areas of Canada might see an increase in suicidal behavior, which according to the contract document includes "ideation (i.e., thoughts), behaviors (i.e., suicide attempts, self-harm, suicide) and communications (i.e., suicidal threats, plans)." With that knowledge, the Canadian government could make sure more mental health resources are in the right places when needed.
YouTube star faces backlash over clip showing a corpse
YouTube star Logan Paul is facing major backlash over a video he recently posted on his YouTube channel. In it, he and a few of his friends, who are traveling through Japan, enter the Aokigahara forest near Mount Fuji, claiming to be documenting the "haunted aspect of the forest," as Paul says in the video. But the forest is well known for being a place where many people go to end their lives. While in the forest, Paul's group comes across a body, and not only does the video show the body (with the face blurred out), but Paul and his friends are also shown laughing and making jokes.