Social Media Platforms Must be Held Accountable for Illicit Content | Opinion

The need to amend Section 230 is most evident and urgent when it comes to the exploitation of children. A recent lawsuit filed against Twitter for failing to remove sexually explicit content depicting a minor clearly shows that Twitter and other social media companies must be held accountable for illicit content posted on their platforms. The initial physical abuse is harmful enough, but additional harm is done every time that abuse is broadcast and shared again. It is harmful to the victim and harmful to those who consume the perverse images. And those harms spill over into the users' broader communities.

The Department of Justice under Attorney General William Barr put forward a legislative proposal in September 2020 to amend Section 230 to realign its incentives with the goals of the original statutory text—to encourage innovation balanced with appropriate incentives for platforms to remove harmful content. DOJ's proposal aims to incentivize online platforms like Twitter to address illicit content, rather than allow them to hide behind the immunity claims of Section 230.

Illicit online content continues to proliferate and has become increasingly violent and harmful to our nation's youth. Access to this content via the internet and smart devices is ubiquitous. We urgently need platforms to step up and identify and remove such illicit content to protect our children.

Section 230 was originally meant to be not only a shield for internet service providers but also a sword against illicit content, allowing platforms to take down content like pornography without being held liable for doing so. It was meant to protect children in particular from lewd, violent and lascivious online content. Instead, platforms like Twitter have been moderating as they please, removing content they disagree with politically but leaving obviously criminal content alone. The Justice Department's proposal to amend Section 230 seeks to correct this behavior by incentivizing providers to remove illicit content in three main ways.

First, by denying Section 230 immunity to truly bad actors. The title of Section 230's immunity provision—"Protection for 'Good Samaritan' Blocking and Screening of Offensive Material"—makes clear that Section 230 immunity is meant to incentivize and protect responsible online platforms, online "Good Samaritans." Platforms that purposely solicit and facilitate harmful criminal activity—online "Bad Samaritans"—should not receive the benefit of this immunity. This amendment means that bad actors, like Backpage or PornHub, would no longer have immunity under Section 230.

Second, by exempting from immunity specific categories of claims that address particularly egregious content, including child exploitation and sexual abuse, terrorism and cyber-stalking. These specific carve-outs would halt the over-expansion of Section 230 immunity and enable victims to seek civil redress in causes of action far afield from the original purpose of the statute. This is precisely the type of claim at issue in the recent suit against Twitter, where Twitter refused to remove child exploitation content from its platform. With this legislative fix it would be clear that Twitter would not have any immunity under Section 230 for allowing child sexual exploitation media on its site.

Third, by making clear that Section 230 immunity does not apply where a platform had actual knowledge of third-party content that violated federal criminal law or where the platform was provided with a court judgment that the content is unlawful in any respect. A platform should not receive blanket immunity for continuing to host known criminal content on its services, despite repeated pleas from victims, such as in the recent case against Twitter. In that suit, Twitter had actual knowledge from the victim and his family of the illicit, abusive content being posted and shared repeatedly on its platform and yet took no action to take it down.

As Attorney General William Barr said, "For too long Section 230 has provided a shield for online platforms to operate with impunity. Ensuring that the internet is a safe, but also vibrant, open and competitive environment is vitally important to America."

In the last Congress, several pieces of legislation on Section 230 reform were introduced. One bipartisan bill sponsored by Senator Lindsey Graham (the EARN IT Act) even made it to the floor, but no further action was taken. The rest remained in committee. And while many of these bills represented good-faith efforts to reform Section 230, none proposed all of the robust and necessary amendments that the Justice Department recommended. Congress needs to make these reforms to Section 230 to hold online platforms accountable when they unlawfully censor speech, and especially when they knowingly facilitate criminal activity online.

President Joe Biden himself has made statements about the need to revoke Section 230. His administration and Congress need to pay attention to the Justice Department's comprehensive proposal and make a concerted, joint effort to pass legislation amending Section 230. President Biden said he wants to unify the country, but has so far opted to impose highly partisan policies via executive order. Reform of Section 230, especially to protect vulnerable Americans and in particular children, is both urgently necessary and deserving of strong bipartisan consensus.

Clare Morell is an independent researcher specializing in effective criminal justice policies and most recently worked as an advisor to Attorney General Bill Barr at the Department of Justice, where she also served as an editor for the Presidential Commission on Law Enforcement. She lives with her husband and son in Washington, D.C.

The views expressed in this article are the writer's own.