Facebook Didn't Fix Systems That Allowed Cartels, Traffickers to Repeat Behavior, Docs Show

Facebook has failed to fix systems that allow human trafficking organizations and drug cartels to repeatedly engage in the same harmful behavior on its platforms, according to reporting from the Wall Street Journal.

The dozens of internal Facebook documents obtained by the outlet showed that employees have repeatedly raised concerns about how the social media giant is being used in countries across the globe, and that Facebook has failed to respond adequately to those concerns.

Some of the documents reportedly showed that Facebook employees raised concerns about human trafficking organizations in the Middle East that used Facebook to attract women. Other documents showed Facebook employees alerting their higher-ups to groups involved in organ selling and pornography.

The news outlet reported that while some of the groups and pages flagged by employees have been taken down, dozens of others remain active on the social media site.

Another document detailed a Facebook employee's investigation into a Mexican drug cartel that was active on the social media site. The employee, who was a former police officer, was able to identify the Jalisco New Generation Cartel's network of accounts on both Facebook and Instagram, which is owned by Facebook.

The employee wrote in the report that his team had found Facebook messages between cartel recruiters and potential recruits "about being seriously beaten or killed by the cartel if they try to leave the training camp."

Dozens of internal Facebook documents show that the social media giant failed to adequately respond to employees' concerns about violent groups and pages. Above, a picture taken on November 20, 2017, shows logos for Facebook. Loic Venance/Getty

The documents reportedly showed that the cartel was open about its criminal activity, with several pages on the social media site showing "gold-plated guns and bloody crime scenes."

The Wall Street Journal reported that even after the employee recommended Facebook increase its enforcement against the groups, documents showed that Facebook didn't completely remove the cartel from its site, saying instead that it had removed content tied to the group. Just nine days after the employee's report, his team found a new Instagram account tied to the cartel, which included several violent posts.

Many of the documents apparently showed employees raising concerns about how the social media giant was being used in developing countries, citing, for example, militant groups in Ethiopia that used Facebook to promote violence against minority groups.

Brian Boland, a former Facebook vice president, told the Wall Street Journal that the social media site sees these issues in developing countries as "simply the cost of doing business."

"There is very rarely a significant, concerted effort to invest in fixing those areas," Boland said.

In a statement sent to Newsweek, a Facebook spokesperson said: "In countries at risk for conflict and violence, we have a comprehensive strategy, including relying on global teams with native speakers covering over 50 languages, educational resources, and partnerships with local experts and third-party fact-checkers to keep people safe."

In a series of tweets on Thursday, September 16, Facebook spokesman Andy Stone wrote, "As the Wall Street Journal itself makes clear, we have a team of experts who help us uncover patterns of harmful behavior so we can disrupt it. We've got arguably more experts and resources dedicated to this work than any other consumer technology company in the world."

"While there is always more we can do, these teams have helped us to find and disrupt gangs and traffickers operating on our platform," Stone wrote in a subsequent tweet. "We use a variety of tools against criminal organizations, including designating them under our dangerous organizations policies, human review, a wide range of AI and network disruptions."


Stone concluded his thread of tweets by writing, "We know we have more work to do, which is exactly why we hire specialists in key fields to help us do research and understand the problems so that we can improve our technology, staffing and policies to address them."


In 2018, Facebook said that it agreed with a report from the non-profit organization Business for Social Responsibility that it wasn't "doing enough to help prevent our platform from being used to foment division and incite offline violence" in Myanmar, following violence against the Rohingya minority.

In a statement issued following that report, Facebook Product Policy Manager Alex Warofka said, "We agree that we can and should do more."