Facebook Grapples With Charlottesville: White Supremacy Out, White Nationalism In

In the wake of violence surrounding the Unite the Right rally in Charlottesville, Virginia, last year, Facebook grappled with how to police white nationalism and supremacy online, according to documents obtained by Vice's Motherboard.

The rally, organized to protest the removal of a statue of Confederate General Robert E. Lee and to unite far-right groups of varying racial ideologies, turned deadly when a woman was struck and killed by a car driven by a rally participant. Facebook issued training documents referencing Charlottesville shortly after the rally and offered more guidance on white supremacy and nationalism in January, according to the documents.

"We don't allow praise, support and representation of white supremacy as an ideology," one training slide for moderators reads. "We allow praise, support and representation of white nationalism."

[Photo: Hundreds of white nationalists, neo-Nazis and members of the "alt-right" march down East Market Street toward Emancipation Park during the "Unite the Right" rally on August 12, 2017, in Charlottesville, Virginia. Chip Somodevilla/Getty]

The documents, according to Motherboard, show that Facebook recognized that the distinction between the ideologies can be hard to make.

White supremacists believe "that the white race is inherently superior to other races and that white people should have control over people of other races," according to Merriam-Webster. White nationalism is distinguished by its push to separate white people from other races. The two terms are often used in conjunction.

"Overlaps with white nationalism/separatism, even orgs and individuals define themselves inconsistently," Facebook's training guide pointed out, according to Motherboard.

In an April press release, Facebook shed light on how it decides which posts to take down.

"We believe that people should be able to share their views and discuss controversial ideas on Facebook. That's why our policies allow people to criticize, and even condemn, religious institutions or political parties," said Facebook. "But we draw the line when the focus shifts to individuals or groups of people. We don't allow direct attacks based on what we call protected characteristics: race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease."

Congress last year passed a bill ceremonially denouncing white nationalism, white supremacy and neo-Nazism as "hateful expressions of intolerance that are contradictory to the values that define the people of the United States." The ideologies, long a part of the U.S., have more recently been on prominent display, and that conversation has spilled onto Facebook, which, according to the Pew Research Center, is used by around 68 percent of Americans.

In a statement to Newsweek, Facebook said that it does not allow hate groups or individuals on the site.

"Using a combination of technology and people we work aggressively to root out extremist content and hate organizations from our platform. We evaluate whether an individual or group should be designated as a hate figure or organization based on a number of different signals, such as whether they carried out or have called for violence against people based on race, religion or other protected categories," Facebook said. "Online extremism can only be tackled with strong partnerships which is why we continue to work closely with academics and organizations to further develop and refine this process."