Facebook Shouldn't Be Deciding Who to Ban. Its Users Should Choose | Opinion
Last week, Facebook announced that President Trump would remain suspended from its platform for at least the next two years because of comments he made about the civil unrest at the Capitol on January 6. The decision was made behind closed doors by staff whose names and backgrounds we do not know.
Is this the right approach for determining what is acceptable speech on a social media platform used by over a billion people on a typical day? I don't think so.
I mean no disrespect to the subject matter experts and advisors on Facebook's payroll: I am sure that, if we could review their résumés, we'd see they are well qualified to guide the tech giant. The trouble is that we don't know who they are, what their advice is, or even whether Facebook executives follow it. That failure is on Facebook alone, for not being more democratic, transparent, and, frankly, imaginative about how it makes important decisions.
There's a better way. Rather than respond in an ad hoc manner to its ostensibly independent Oversight Board of hand-picked, well-remunerated experts, or try to silence speech-policing reporters in a fruitless attempt to avoid bad PR, Facebook should institute a democratically elected People's Parliament. And it should be voted in by, and accountable to, the users of Facebook.
For it is the people who actually use Facebook to connect with their friends, family, and the causes they care about who are best placed to determine Facebook's Community Standards.
With Facebook users spanning 200 countries, it's a stretch to say that staff in Menlo Park can understand and relate to the average user in Texas or Florida, let alone in Myanmar, Sudan, or Thailand. Facebook itself has said it does not want to "make so many important decisions about free expression and safety on our own." And because of the cross-border nature of the Internet and Facebook's sprawling operations, government regulation is unlikely to be effective at safeguarding lawful expression online.
Let's instead put a demographically balanced sample of Facebook users in a room and get them to write the platform's rulebook.
It might sound fanciful to propose a People's Parliament, but the idea has been tried successfully before. The Internet Corporation for Assigned Names and Numbers, or ICANN, is the California corporation that licenses the rights to use a top-level domain name such as .COM or .ORG. Its policies are developed in a multi-stakeholder fashion by a mix of actors who must reach consensus before a policy can be adopted.
Among these stakeholders is the At-Large Advisory Committee, made up of democratically elected "individual Internet users" from every region of the world. They develop formal advice in a bottom-up fashion, consulting their local communities and working through real-time interpretation to hash out issues, ensuring that the perspectives of individuals, not just governments or businesses, are heard and considered.
Domain names, after all, are the places on the Internet where we share our expression. Our websites and emails carry our most important thoughts and ideas. So it's fitting that no one can remove a domain name from the Internet without due process, and that commercial interests cannot override the interests of noncommercial speech.
It wasn't always this way, but over time ICANN has found a way to become independent of government control and market interests. It has done so by demonstrating procedural fairness and accountability, and by promoting diversity among those making its rules.
Facebook could take many lessons from how ICANN shares its power between governments, businesses, civil society, and individual Internet users. While ICANN's governance model is relatively new, having been practiced only since 1999, it is a natural extension of Enlightenment and Jeffersonian democratic principles: The exercise of political power is legitimate only when it is carried out with the consent of the governed.
In an era of cancel culture, when journalists increasingly report on events to win accolades from their colleagues rather than to serve any real public interest, I think it's more important than ever that Facebook set its business interests aside for a moment and defer to the real people who actually use its platform to decide whether the content they see is harmful.
Doing so might not have changed the outcome in the case of President Trump's account. But at least the decision would have been made in a more considered and accountable fashion, by people we can vote out, and not by a small group of elites imposing their values on everyone else.
Ayden Férdeline is a public interest technologist who previously represented European civil society organizations on the Council of ICANN's Generic Names Supporting Organization, the body that coordinates policy development for generic top-level domain names like .COM and .ORG. He now hosts the Internet governance history podcast POWER PLAYS.
The views in this article are the writer's own.