How Facebook Decides When to Censor Videos of Murder and Suicide—and When to Show Them

Imagine that two Facebook users post videos of people "self-immolating," or setting themselves on fire. Both videos contain graphic footage that's hard to look at, some context about what happened, and a caption interpreting it. Maybe one user thinks it's an important act of protest worth documenting, and the other thinks it's funny.

How does Facebook know which one to censor?

Allowing certain types of graphic videos—like those depicting murder and suicide—on the wildly popular platform could be seen as endorsing their content and exploiting the victims in them. Taking them down could be seen as censorship, potentially keeping important but upsetting news from ever being seen.

Protesters demonstrate after the fatal shooting of Philando Castile. Facebook Live video showed the officer pointing a gun at a bleeding Castile after the officer shot him. Should this kind of video be allowed on Facebook? (Robyn Beck/AFP/Getty Images)

There are plenty of examples of violent images shared online becoming important news items. The Facebook Live video of a police officer fatally shooting Philando Castile had national repercussions and continued a longstanding conversation about police violence against black people. Another upsetting image that went viral on social media shows two people passed out from drug abuse while a child sits in the back seat of a car; that image, too, had profound political impact.

On the other hand, Facebook likely does not want to become a destination for those seeking out and celebrating morbid imagery. Consider the 2012 incident in which porn actor Luka Magnotta filmed himself killing a student. Magnotta sent the video to Mark Marek, owner of Bestgore.com, a website devoted to real-life gore. Marek posted the infamous video, which gave Magnotta exactly the attention he sought—but it also helped lead to Magnotta's arrest. In spite of this, Canadian courts convicted Marek for posting the video, finding that the post "corrupted morals."

It's possible that U.S. courts could begin handing down similar convictions. After the passage of the controversial bills SESTA (Stop Enabling Sex Traffickers Act) and FOSTA (Fight Online Sex Trafficking Act), platforms like Facebook have reason to be wary. Congress has increasingly moved to hold internet platforms, where users can usually post whatever they want without prior review by company staff, accountable for the things those users post. Critics argue that these laws are detrimental to a free internet, and they have been linked to the takedown of Backpage.com.

Facebook's community standards explain, in some detail, what is and isn't allowed and what may be posted behind a warning screen. When it comes to graphic violence, users may not post images, including videos, live or not, of real people or animals that show, for example, "enjoyment of suffering" or "erotic response to suffering." However, images of mutilation and dismemberment "in a medical setting," of "self-immolation when that action is a form of political speech or newsworthy," and of some even more graphic scenarios are allowed. In those cases, Facebook places a warning screen that people have to click through if they really want to see the graphic content.
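The tiered logic of those standards can be sketched as a simple decision function. The Python below is only an illustrative sketch: the category labels and the moderation_decision function are hypothetical stand-ins, not Facebook's actual rules engine, but they capture the three outcomes the standards describe (remove, allow behind a warning screen, or allow outright).

```python
# Illustrative sketch only: labels and logic are hypothetical, loosely modeled
# on the tiers described in Facebook's community standards.

# Content categories that are removed outright.
PROHIBITED = {
    "enjoyment_of_suffering",
    "erotic_response_to_suffering",
}

# Graphic content allowed only behind a click-through warning screen.
ALLOWED_WITH_WARNING = {
    "mutilation_in_medical_setting",
    "self_immolation_political_or_newsworthy",
}


def moderation_decision(labels):
    """Return 'remove', 'warn', or 'allow' for a piece of labeled content."""
    if labels & PROHIBITED:
        return "remove"
    if labels & ALLOWED_WITH_WARNING:
        return "warn"  # shown only after the viewer clicks through a warning
    return "allow"


# The two hypothetical self-immolation videos from the opening:
print(moderation_decision({"self_immolation_political_or_newsworthy"}))  # warn
print(moderation_decision({"enjoyment_of_suffering"}))                   # remove
```

The hard part in practice, of course, is assigning those labels to a raw video in the first place, which is where the company's moderators and automated systems come in.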

Currently, Facebook uses human moderators and AI systems to scan for and remove content that it believes violates its standards, Futurism reports. Running that AI on dedicated chips would be more efficient and require less computing power.

Violence lies at the intersection of horror and news. It's up to the law, internet platforms, and individuals to decide which content should be kept in the dark and which should be brought to light.