Section 230 Reform is a Hammer. Not Every Problem is a Nail | Opinion

In the aftermath of Twitter banning President Donald Trump—a move criticized by leaders from around the world—the American Principles Project (APP) proposed a litmus test for Section 230 reform proposals: "Will the reform require, or at least strongly incentivize, Facebook and Twitter to un-ban President Donald Trump?" The problem is, it's unlikely that any Section 230 reform, even a full repeal of the statute, would have that effect on its own. For many problems in tech policy—not just censorship—Section 230 reform is no panacea.

Section 230 of the 1996 Communications Decency Act shields digital platforms, such as Facebook and Twitter, from liability for third-party content. This shield also extends to the platforms' content moderation decisions, including Twitter's decision to ban Trump. But even if you remove the liability shield of Section 230, you will still need another tool—a sword—that lets you hold digital platforms liable for their conduct in the first place.

In the case of Backpage, a website for classified ads, that sword was easy to identify. Sex trafficking was rampant on Backpage, and some of the site's conduct ran afoul of federal sex trafficking laws. Using those laws as the sword, three victims of sex trafficking on Backpage filed a lawsuit. In the First Circuit case Doe v. Backpage, though, Backpage used Section 230 to shield itself from liability.

The solution there was straightforward: carve federal sex trafficking laws out of Section 230, so that 230's shield cannot be used there. A law called FOSTA did exactly that (among other things). This change has proven to be effective. Victims have since used it in lawsuits against Twitter, Facebook and Pornhub.

Another bill, called EARN IT, takes a similar approach by carving federal and state child pornography laws out of Section 230; Twitter has successfully used Section 230 as a shield against federal child pornography laws. When a specific law is carved out of Section 230, it becomes clear that the law can be used as the sword.

The issue of censorship is more difficult. In a world without Section 230, if Twitter bans President Trump, and Trump sues Twitter, which law can be used as a sword against Twitter? The truth is that an effective sword for this scenario likely does not exist, even without Section 230. The Left has also proposed 230 reform as an antidote to misinformation and hate speech, but those proposals suffer from the same problem: the lack of an effective sword.

It's a common misconception that "if Section 230 does not protect X, then Big Tech will be held accountable for X." That statement is not always true. When trying to address any Big Tech problem, we need to ask which law can be used as a sword against that problem. That is, which law will hold Big Tech accountable for "X"? In many cases, those questions do not have a solid answer. (The other major problem is that Section 230 also shields smaller tech companies, not just the Big Tech companies.)

This misconception also applies to some criticisms of 230 reform bills. EARN IT would remove Section 230's shield when the sword is a federal or state child porn law. Critics claim that EARN IT is a dangerous threat to encryption—but which child porn law takes a sword to encryption? What specific language in those laws attacks encryption? The more you press for specifics, the weaker these claims become.

Putting carve-outs in Big Tech's shield is one thing; adding a sword to deal with problems directly is another. If you are worried about the negative effects of personalized recommendation algorithms, you could change Section 230 for those algorithms. Or you could write a more targeted consumer protection law for personalized recommendation algorithms (and then carve that law out of Section 230).

With respect to Chinese apps like TikTok—which is a legitimate national security threat, according to former CIA official Klon Kitchen—you could strip their 230 immunity. Or, you could ban TikTok.

One bill that APP did endorse, CASE-IT, is different from other 230 bills. It has two parts. The first changes Section 230's shield. The second adds a sword—a provision that lets users sue market-dominant platforms for their acts of censorship. I'll let others decide whether CASE-IT's sword is good policy, but the bill at least recognizes that Section 230 reform alone cannot solve the problem.

In the world of tech policy, as a debate grows longer, inevitably someone will blame Section 230. As lawmakers discuss the issues arising from Big Tech, however, they should remember that Section 230 reform is a hammer, and not every problem is a nail.

Mike Wacker is a software engineer and technologist who has previously served as tech fellow in Congress.

The views expressed in this article are the writer's own.