Congress Should Apply Public Accommodation Laws to Big Tech | Opinion

Large tech platforms have used the protections of Section 230 of the Communications Decency Act to control public discourse through their content moderation practices, with impunity. Section 230, in part, shields internet platforms from civil liability when they engage in "good faith" blocking of content that either they or their users consider "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable."

Last week's Florida federal district court ruling in NetChoice v. Moody adds another level of complexity for state legislatures seeking to regulate such Big Tech practices. Namely, the court may have extended the scope of the First Amendment's "compelled speech" doctrine—protection usually reserved for traditional publishers, like newspapers—to online platforms when they block content. Unless the U.S. Court of Appeals for the 11th Circuit rules differently, Section 230 and the First Amendment now protect these platforms from state legislators passing laws similar to the one Florida just enacted.

Crucially, however, this does not stop Congress itself from appending public accommodation requirements to an internet platform's Section 230 protections. Put simply, if large internet platforms want these legal immunity protections, then they cannot discriminate against a user based on social status or political beliefs. There must be a non-discrimination "stick" to match the legal immunity "carrot."

The overarching problem is that Section 230 is woefully outdated. Worse, court decisions have further insulated platforms from civil liability when they make politically biased determinations. This is why Twitter can get away with blocking the New York Post's legitimate reports on Hunter Biden, or why Facebook can label articles from conservative outlets "false information" but will not do so for liberal-leaning outlets reporting on the same event. These court interpretations have thus allowed Big Tech companies—like Twitter, Facebook and Google—to control and, in many cases, shut down conversations with almost no accountability.

Keep in mind, when Congress enacted Section 230, the internet was nascent. In 1996, internet services—such as web browsers or instant messaging—acted more like traditional common carriers. Moreover, these early tech pioneers did not exert the near-total editorial control over their platforms that today's companies do. Instead, those earlier companies, such as AOL and Netscape, were more interested in simply transmitting messages irrespective of their content. Hence, Congress placed Section 230 in Title II of the Communications Act—that is, the "Common Carrier Regulation" section—because Congress intended to apply the legal immunity only to internet companies that merely passed along user-generated messages.

The U.S. Capitol Building is closed to the public this year during Independence Day celebrations on July 4, 2021 in Washington, D.C. Samuel Corum/Getty Images

The provision's goal was simple: Tech companies should not be held liable for content they neither publish nor control. Congress provided these protections to give internet-based companies the tools to protect users—mainly children—from harmful content and conduct encountered when using their sites. The law also sought to encourage and promote online discourse, more generally. Congress certainly did not intend these companies to use Section 230 protections to advance their political agendas.

It follows that Big Tech curation practices intended to amplify or discourage various viewpoints may fall outside the scope of Section 230's original intended purpose. For example, former Representative Chris Cox (R-CA) and Senator Ron Wyden (D-OR)—generally credited as the authors of Section 230—wrote: "Section 230 is not the source of legal protection for platforms that wish to express a point of view." They go on to say that "[w]hen a website expresses its own opinion, it is, with respect to that expression, a content creator and, under Section 230, not protected against liability for that content."

To foster the law's intended purpose and combat bad case law, Congress should now apply our public accommodation laws—chiefly, the Civil Rights Act of 1964—to these platforms if they wish to continue availing themselves of Section 230's legal immunity protections. Public accommodation laws prohibit a covered entity, such as a restaurant or in this case a social media platform, from discriminating against customers when offering its service. Public accommodation requirements, as a statutory creation of Congress, can protect consumers who are not otherwise members of a constitutionally protected class. Specifically, Congress can write a new public accommodation law to prevent internet platforms from discriminating against users who express a certain political viewpoint.

Applying this type of public accommodation law to common carrier-like private entities is not a novel concept. For instance, airlines cannot refuse service to a paying customer based on race, gender or even political affiliation. This is also true for parcel services, such as FedEx or UPS, where federal law prohibits denying service or discriminating against a parcel sender or recipient based on political belief.

Congress needs to take the reins away from courts and Big Tech and establish true internet freedom. One avenue it can take is applying public accommodation laws that bar platforms from discriminating against users because of their political views. Such measures are not only in line with the animating spirit of Section 230, but they affirmatively advance it.

Joel Thayer focuses his law practice on telecommunications, regulatory and transactional matters, as well as privacy and cybersecurity issues. He has represented clients before myriad legal and regulatory fora, including the Federal Communications Commission, Federal Trade Commission and federal administrative agencies. He has also represented amicus curiae before the United States Supreme Court and advised technology companies on the European Union's General Data Protection Regulation.

The views expressed in this article are the writer's own.