Facebook blocks terminally ill French man from livestreaming his death

It raises questions about Facebook's live broadcast policies.
Alain Cocq, who suffers from a rare ("orphan") disease of the blood, rests on his medical bed on August 12, 2020 in his flat in Dijon, northeastern France. Cocq has appealed to the French president for authorization for doctors to prescribe him a barbiturate. "I am not asking for assisted suicide or euthanasia," he says, "but an ultimate care, because I am just trying to avoid inhuman suffering," something he says the Leonetti law on end-of-life care does not currently allow. Cocq had a telephone appointment on August 25, 2020 with Anne-Marie Armanteras, the presidency's health advisor. (Photo by PHILIPPE DESMAZES/AFP via Getty Images)
Jon Fingas|@jonfingas|September 5, 2020 1:08 PM

Facebook is dealing with a test of its policies barring live streams involving suicide and self-harm. The social media giant told the AFP (via The Verge) it would block terminally ill French man Alain Cocq from livestreaming his death. The French government had denied Cocq’s request for euthanasia, and he hoped to use the stream to rally support for his cause as he ended his life in the days ahead by refusing food and medicine.

A Facebook spokesperson said the company respected Cocq’s desire to “draw attention to this complex question,” but that its rules forbade livestreaming suicide attempts and that it had taken steps to block livestreams after listening to “expert advice.”

Cocq has a rare condition that causes his artery walls to stick together. He had reached out to President Emmanuel Macron to allow for euthanasia, but the leader declined, saying that he "respect[ed]" the effort but couldn't go beyond the law.

Cocq wasn’t deterred by Facebook’s restrictions. He promised a “back-up” solution for the video feed within a day, but didn’t say what service he might use next. YouTube and other video giants also have rules barring the promotion of suicide and self-harm.

It’s not surprising that Facebook would take this step. It has ramped up its suicide prevention measures for years, relying on AI and “sensitivity screens” to either block material or keep it out of sight for people who aren’t intentionally looking for it. The company had high-profile incidents in the past, and might not want to risk videos like this spurring others.

At the same time, this illustrates Facebook's ongoing challenges with policing videos: the circumstances can vary widely, and a measure meant to protect some users might hurt others. The social network's Oversight Board could theoretically weigh in on issues like this, but it isn't expected to be operational before late fall. Until then, Facebook's decisions on sensitive topics are final.