Section 230 Reform Can Force Big Tech To Focus on Real Problems | Opinion

Some conservatives claim that without Section 230—a federal law that shields tech companies from liability for third-party content they host—tech companies will moderate social media posts even more. But the problem isn't that sites are moderating too much; it's that they're moderating the wrong things. If Backpage, a now-defunct classified ads site, had spent more time preventing sex trafficking—instead of turning a blind eye to it—it might still exist today. Targeted, bipartisan reforms to Section 230 can lead to more moderation of real problems and less censorship of fake problems.

Too often, the debate around Section 230 and content moderation conflates two different issues: real problems (like sex trafficking, child porn, and ISIS propaganda) where we want more moderation, and fake problems ("misinformation" and "hate speech") where moderation is a clever euphemism for censorship.

Consider the effect that Section 230 had on the real problem of sex trafficking. Backpage frequently ran afoul of many laws, including federal sex trafficking laws. But as far back as 2011, attempts by both victims and states to hold the site accountable went nowhere; Section 230 gave Backpage legal amnesty.

These frustrations reached their peak in 2016, when a circuit court dismissed a lawsuit filed by three child victims of sex trafficking on Backpage, one of whom had been raped more than 1,000 times. Even though the victims had made a "persuasive case" that "Backpage has tailored its website to make sex trafficking easier," the court concluded that Section 230 left it no other choice.

Congress then passed the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which carved federal sex trafficking laws out of Section 230. Victims have since used FOSTA to hold Twitter, Facebook, and Pornhub accountable for skirting their legal obligations to prevent sex trafficking.

As for Twitter, the censorship uncovered by the Twitter Files was damning enough. Even more damning, Twitter turned a blind eye to child porn, all while Section 230 provided it legal amnesty when victims sued for violations of federal child porn laws.

When you let woke employees set the priorities for content moderation, you get less moderation of real problems and more censorship of fake problems. When Twitter censored the Hunter Biden laptop story, it used tools that were previously reserved for extreme cases such as child porn. If only it had used those same tools to actually stop child porn.

Reforms to Section 230 cannot directly address the problem of censorship. What they can do, however, is fix the perverse incentive structure.

Without Section 230, tech companies would moderate more—but only where they have liability under the law. Laws exist for real problems like child porn and sex trafficking, but laws do not exist for fake problems like misinformation and hate speech (such laws would be unconstitutional).

While Big Tech companies have a habit of evading legal accountability, even they cannot escape the laws of economics. If Congress carved child porn laws out of Section 230 by passing the EARN IT Act, tech companies would have to direct more resources to fighting child porn. As a corollary, they would also have fewer resources to tackle fake problems (unless they wanted to increase their budget for content moderation).

More broadly, Republicans and Democrats should figure out where else Section 230 has effectively provided legal amnesty, and then end that legal amnesty. Other federal laws—and in some cases, state laws—should probably be carved out of Section 230 as well.

Congress could target specific laws, such as the Fair Housing Act or the Anti-Terrorism Act. It could even go so far as to end Section 230's legal amnesty for all federal laws.

When proposals are made to carve laws out of Section 230, critics often resort to fearmongering. Instead of addressing real-world scenarios that actual victims have faced, they invent hypothetical ones where the sky will fall without Section 230. These hypotheticals don't accurately reflect the actual obligations that companies have (or don't have) under these laws.

Every other industry must obey laws governing child pornography and other problems, including the "imperfect" laws; only the tech industry receives legal amnesty. Moreover, even "imperfect" federal laws were still created by a democratically accountable institution where everybody has representation. The same can't be said of the rules created by woke tech employees who only represent a far-left subset of Silicon Valley.

When tech companies don't have to obey the rules created by Congress (rules that all other companies must obey), they will spend their time creating their own arbitrary rules instead.

Mike Wacker is a software engineer and technologist who has previously served as tech fellow in Congress.

The views expressed in this article are the writer's own.