ISIS, Nudity, Revenge Porn: Facebook Updates its Community Standards

Facebook has changed its user policy to clarify what constitutes nudity and how it will be dealt with on the social media website. Ognen Teofilovski/Reuters

Facebook has updated its community standards, clarifying what it considers unacceptable content.

Although its core policies have not changed, the additional guidance released Sunday on how it applies its standards—relating to such topics as dangerous organizations like the Islamic State (ISIS), nudity and revenge porn—gives users a glimpse into how decisions are made at the company.

With more than 1 billion users of varying ages, locations and backgrounds, the social media giant often finds itself doing a delicate deletion dance, one that attempts to properly balance community safety and sensibilities with free speech. In practice, Facebook's decisions sometimes appear inconsistent to its audience.

Last October, for instance, Facebook apologized to the drag community for disabling accounts that were in violation of the site's real-name policy and promised to ease enforcement. The apology, however, failed to include other groups affected by the policy, such as activists and political dissidents.

Among the most notable clarifications to the rules are those relating to nudity, which the site has been notoriously stringent about.

"We remove photographs of people displaying genitals or focusing in on fully exposed buttocks. We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring," the community standards read. "Explicit images of sexual intercourse are prohibited. Descriptions of sexual acts that go into vivid detail may also be removed."

As Twitter and Reddit have done in recent months, Facebook has detailed its position on revenge porn. "We...remove photographs or videos depicting incidents of sexual violence and images shared in revenge or without permission from the people in the images," the guidelines read.

The company also elaborated on its standards for pages and posts relating to dangerous organizations. While groups like ISIS have long been banned from the platform, the updated guidelines add, "We also remove content that expresses support for groups that are involved in...violent or criminal behavior."

In a post explaining the updated standards, Monika Bickert, Facebook's head of global policy management, and Chris Sonderby, deputy general counsel, added that content is sometimes restricted for reasons other than being in violation of community standards. Sometimes content is removed because it violates the laws of a particular country.

The Global Government Requests Report, also released this weekend, details the number of government requests to restrict content in the second half of 2014. "Overall, we continue to see an increase in government requests for data and content restrictions," Bickert and Sonderby's post read.

"The amount of content restricted for violating local law increased by 11% over the previous half, to 9,707 pieces of content restricted, up from 8,774. We saw a rise in content restriction requests from countries like Turkey and Russia, and declines in places like Pakistan," they wrote.

Bickert and Sonderby's post also emphasized that Facebook does not automatically scan for standards violations, relying instead on users to report them.

"Our reviewers look to the person reporting the content for information about why they think the content violates our standards," the post read.

If users disagree with Facebook's decision to leave reported content on its site, the company reminds them they can "unfollow, block or hide content and people they don't want to see, or reach out to people who post things that they don't like or disagree with."
