Are Facebook, Google and Twitter the New United Nations? | Opinion

Publishing certain words on Facebook can get you suspended, no matter the context. Nudity? Even your elbows can trigger censorship.

Some songs critical of political or religious topics, originally posted on YouTube, can also cause a suspension if shared on Facebook. The company says it respects democracy and freedom of speech. That may be true, but its rules for what can and cannot be posted are opaque at best.

Facebook is an online platform for billions of people, some of whom do not realize that by using the app they are using the internet. In developing countries, the common practice of zero rating, in which a mobile carrier lets its customers access certain services, such as Facebook's WhatsApp, without consuming their data allowance, further perpetuates the myth that Facebook is the internet itself. People never need to leave Facebook or its subsidiaries. Public discourse ends up gated within Facebook's walls, and the company becomes, by default, a guardian of democracy.

Should Facebook be a gatekeeper of democracy and freedom of speech?

In Myanmar, several accounts linked to the military were restricted for spreading "misinformation" following a coup. That is, to some extent, good ... but should Facebook even have such power?

Maybe not, but Facebook (and other giants like Google and Twitter) have too much power in their hands.

Take Australia for example. Facebook simply cut the country's access to any and all news within its ecosystem. Pages from hospitals, Australian government sites and even local charities and community services went offline.

Facebook used its power to pressure Australia into abandoning legislation that would require the company (among others) to pay for news shared on its platform. The result was a complete blackout of news and information, harming several news outlets and millions of Australians.

Claire Lehmann, founder of the online news website Quillette, noted that Facebook is its third-largest source of traffic; losing that stream would be a blow to its revenue. Quillette is just one of hundreds, possibly thousands, of websites affected by the decision of a virtual monopoly to use its power to force a country to drop legislation. In the end, Facebook succeeded in making Australia back down on several key issues in the bill.

The question is not whether one agrees with the bill ultimately approved on February 24 by the Australian parliament (the bill has all sorts of problems and goes against the principle of an open internet, even though the idea of making big tech pay news outlets is in itself good), but whether Facebook's boycott was acceptable. It was not.

A few years ago Google decided to end Google News in Spain for a similar reason, but it didn't go as far as to block news websites from appearing in search results.

Not being on a social network is increasingly the same as not being part of public discourse and public debate.

Governments, political leaders and journalists are online. They all share content on social media and influence the public (and to some extent, are influenced by the audience).

If whoever controls those platforms can also control which discourse is or isn't accepted, they ultimately have the power to leave entire countries in the dark when it comes to accessing information.

Online platforms should have some control over the messages flowing through their ecosystems, but not over the content itself, except in cases of fake news or content that is illegal in a given country. Even where content is deemed criminal within a country, removal should follow a set of clear, straightforward rules, and the poster should have the right to appeal to independent reviewers.


Facebook is not particularly known for its consistency.

According to researcher Jillian C. York in her upcoming book Silicon Values, "the effect that platforms have in shaping our very identities must not be understated."

She notes that the whole system of content moderation "must be subject to a comprehensive, external audit of both rules and processes, policies and procedures." York pushes for authentic inclusion and representation in online spaces.

Twitter is developing an anti-troll feature, or "safety mode," that will automatically block accounts identified as spamming, insulting others, using strong language or posting hateful remarks. But who is to decide what counts as "strong language"? Will insults among friends be censored? Who gets to decide what is acceptable?

The absence of oversight and accountability over decision-making by tech giants is a major problem that must be addressed urgently. Facebook is right to limit the messages of a coup government, yet we should not have to depend on tech giants to make such decisions. Where are the international and multilateral organizations?

Facebook, Twitter and Google have become a U.N. of sorts and this is far from an ideal scenario. Are these companies qualified to make life-changing and even democracy-changing decisions?

The short answer is no. But, nevertheless, that's what they often do.

In 2019, Facebook sided with Turkey and blocked a Kurdish militia's page so that its business in the country wouldn't suffer. Turkey had illegally invaded the Kurdish-majority region of northern Syria, but Facebook decided it didn't care.

In a recent Foreign Affairs article, the idea of saving democracy from technology was explored. Doing so included incorporating regulation, breaking up companies, data portability, highlighting privacy law and implementing middleware, a "software that rides on top of an existing platform and can modify the presentation of underlying data."

The article's arguments are supported by experts consulted by IFEX, a global network promoting freedom of speech. They highlight how platforms have serious issues when it comes to content moderation, especially moderating posts made by unaccountable individuals. According to experts, more transparency and accountability is needed when it comes to understanding how algorithms behave.

Researcher and activist Hossein Derakhshan noted the irony that Google, Facebook and other tech giants "are covered in the news as if they are states. They really are like rogue unaccountable states."

The solutions presented by social networks are often insufficient, or worse, decided behind closed doors, without the participation of experts and precisely those most interested and invested—the users.

These solutions generally disregard basic matters such as freedom of expression, opting instead to restrict discourse and chase the fashions of the moment, amplifying bubbles and creating division—which is ultimately what generates the most revenue.

Raphael Tsavkko Garcia is a Brazilian journalist based in Belgium. He holds a PhD in human rights from the University of Deusto (Spain).

The views expressed in this article are the writer's own.
