Facebook Knew How to Combat Hate Speech in Middle East, Did Nothing: Whistleblower

Facebook knew its platforms were perpetuating hate speech in the Middle East, according to a whistleblower who exposed systemic problems at the company, including language barriers in content moderation.

Last month, former Facebook product manager Frances Haugen came forward with a large collection of company documents backing up her accusation that Facebook pursues profit over safety.

Haugen accused Facebook of knowing that its platforms harmed young users' mental health. Such systemic flaws are not limited to mental health; she also accused the company of fueling politically driven violence.

In Haugen's files, documents exposed how hate speech and terrorist content are perpetuated in the Middle East because the company does not have enough moderators who speak local languages and understand cultural nuance.

Facebook has not developed artificial intelligence systems capable of monitoring potential threats across the region's many languages. The platform "was never built with the intention it would one day mediate the political speech of everyone in the world," said Eliza Campbell, director of the Middle East Institute's Cyber Program.

"But for the amount of political importance and resources that Facebook has, moderation is a bafflingly under-resourced project," said Campbell.

According to the Associated Press, the company has tried to hire staff who speak local dialects; however, Arabic content moderation "still has a long way to go," the company said.

Facebook's content supervision can also amount to censorship that limits free speech.

One document addressing the company's Dangerous Individuals and Organizations list reads, "We were incorrectly enforcing counterterrorism content in Arabic," noting that the current system "limits users from participating in political speech, impeding their right to freedom of expression."

For more reporting from the Associated Press, see below.

FILE – In this Oct. 15, 2021, photo, mourners chant slogans as they hold a placard with Arabic that reads "Our choice is resistance" during the funeral of three Hezbollah supporters who were killed during clashes in the southern Beirut suburb of Dahiyeh, Lebanon. Internal company documents from former Facebook product manager turned whistleblower Frances Haugen show that in some of the world's most volatile regions, terrorist content and hate speech proliferate because the company remains short on moderators who speak local languages and understand cultural contexts. AP Photo/Bilal Hussein, File

Linguists described Facebook's system as flawed for a region with a vast diversity of colloquial dialects that Arabic speakers transcribe in different ways.

Facebook first developed a massive following in the Middle East during the 2011 Arab Spring uprisings, and users credited the platform with providing a rare opportunity for free expression and a critical source of news in a region where autocratic governments exert tight controls over both. But in recent years, that reputation has changed.

Scores of Palestinian journalists and activists have had their accounts deleted. Archives of the Syrian civil war have disappeared. And a vast vocabulary of everyday words has become off-limits to speakers of Arabic, Facebook's third-most common language, with millions of users worldwide.

"The stereotype that Arabic is one entity is a major problem," said Enam al-Wer, professor of Arabic linguistics at the University of Essex, citing the language's "huge variations" not only between countries but class, gender, religion and ethnicity.

These problems put moderators on the front lines of what makes Facebook a powerful arbiter of political expression in a tumultuous region.

Despite Facebook's public promises and many internal reports on the problems, the rights group Global Witness said the company's recommendation algorithm continued to amplify army propaganda and other content that breached the company's Myanmar policies following the military coup there in February.

In much of the Arab world, the opposite is true — the company over-relies on artificial-intelligence filters that make mistakes, leading to "a lot of false positives and a media backlash," one document reads. Largely unskilled human moderators, in over their heads, tend to passively field takedown requests instead of screening proactively.

Palestinian journalist Hassan Slaieh shows his blocked Facebook page during an interview in Gaza City, Monday, Oct. 18, 2021. A hashtag for one of Islam's holiest sites is banned, scores of Palestinian journalist accounts are blocked, and the archives of the Syrian civil war have disappeared. A vast vocabulary of common words is off-limits to speakers of Arabic, Facebook's third-most popular language with millions of users worldwide. AP Photo/Adel Hana

Arabic poses particular challenges to Facebook's automated systems and human moderators, each of which struggles to understand spoken dialects unique to each country and region, their vocabularies salted with different historical influences and cultural contexts.

Moroccan colloquial Arabic, for instance, includes French and Berber words and is spoken with short vowels. Egyptian Arabic, on the other hand, includes some Turkish from the Ottoman conquest. Other dialects are closer to the "official" version found in the Quran. In some cases, these dialects are not mutually comprehensible, and there is no standard way of transcribing colloquial Arabic.
