Conservatives Must Tackle the Problems of the Digital Revolution | Opinion

Of all the policy debates taking place on the Right, perhaps none tests old assumptions more than the debate over the Digital Revolution. From gig work to the all-powerful algorithms of Big Tech, Silicon Valley innovations have vastly improved elements of our lives—but they have exacted a cost on our social order, our ways of engagement and even our general understanding of liberty that is, in many ways, not yet fully understood.

Over at American Compass, these questions are beginning to take substantive shape—first, by defining the issues, and then, by considering the relevant benefits and trade-offs. One of the more intimidating features of the policy debate around so-called "Big Tech" and the Digital Revolution is its sheer scope. What is Big Tech? And how wide and deep is the issue? So it is helpful, as Compass' Executive Director Oren Cass notes, to divide the challenge into constituent parts that, for the present examination, include gig work, the attention economy and black box algorithms.

The problem of the digitally mediated world is framed as a "Super Market"—one in which market forces work almost on overdrive, with downstream consequences that ripple outside of economics and into our social and political structures. For gig work, the tension between innovations around frictionless exchange and the offline consequences of changing the nature of employment is on display. In the attention economy, there is a debate over whether the subtle manipulation of otherwise-independent market behavior poses a problem at all. And as far as algorithms are concerned, a discussion centers on the value and ubiquity of individual data, and on whether there should be a more tangible nexus between such data and property rights.

It is the debates around the attention economy and algorithms in American Compass' symposium, however, that go straight to the heart of what the Digital Revolution has less tangibly—but perhaps most consequentially—wrought: changing the nature of what it means to be free, both as an individual and as one acting in the marketplace.

To be online, observes Wells King, is to be constantly surveilled. It is also to have your personal choices mined and commodified, repackaged and sold without your knowledge and with very little meaningful consent. While this makes our advertising more precise and useful, it also changes the meaning of privacy—and, as the government gets in on purchasing location data on the secondary market, the nature of the right an individual has to that privacy.

When it comes to algorithms, the issue is even thornier. Our free markets derive their character from the independent, non-coerced transactions of millions of participants, which in turn give the market information about what individuals value and thus how to direct future transactions. The rational and freely chosen actions of homo economicus drive our market forward. But algorithms change the equation, constantly nudging individuals across interfaces and at levels of interference so small as to barely register.

The implicit power of suggestion in highly personalized algorithmic ranking, at the ubiquitous level of the internet, now pervades our minds and markets in a manner we have never before experienced. That this is changing the nature of our free markets seems obvious. "Classical economics recognizes external coercion," writes University of Virginia's Matthew B. Crawford, "but has no ground on which to distinguish freedom from internal compulsion."


But libertarians who deride the "libertarian paternalism" of public policies built to nudge behavior confusingly take no issue when private algorithms, operating at massive scale, are designed to exploit the exact same impulses—despite the fact that the effect, particularly in markets dominated by a single actor, may be entirely the same.

Moreover, there are enough flashing red lights as to how these same algorithms induce behavioral change across broad swaths of society, particularly among children, to raise concerns. Reason magazine's Peter Suderman takes pains to dismiss such distress as the unjustified hand-wringing of nanny-state moralizers like Reason's favorite antagonist, Sen. Josh Hawley (R-MO). His flippant dismissal, however, ignores a whole host of empirical evidence regarding how viral algorithms affect kids and even young adults, which suggests that a modicum of curiosity—and concern—is indeed warranted.

That something might be wrong here is a suspicion that extends beyond Sen. Hawley. At a House Energy and Commerce Committee hearing in March, Ranking Member Cathy McMorris Rodgers (R-WA) called Big Tech's "power to manipulate and harm our children" her "biggest fear as a parent." Following her lead, member after member pointed to the links between social media and teenage suicide, depression and hospitalization.

It is true that the particulars of individual usage—what to watch, how much time to spend on a smartphone—cannot be dictated by public policy. The government cannot substitute for a parent simply taking the phone away.

But public policy does have a role in ensuring that parents have the ability to make informed choices, to understand the details of how these algorithms work and to control access to content—particularly when thousands of public schools, after inking deals with Google, are the ones putting Chromebooks and YouTube accounts into the hands of children.

Moreover, our public policy should be clear about what exactly it is endorsing. While child privacy laws exist, Big Tech companies remain serial violators, slapped with fines in amounts that are meaningless to billion-dollar conglomerates. And it should be a question for lawmakers whether specific algorithmic amplifications—developed by the companies themselves—should be given Section 230 immunity, which was designed to protect these companies from liability for user-generated content, not the content they themselves create.

The digital age, as with many technological epochs before it, has been transformative for how we live, speak, engage and transact. And as in those prior ages, public policy is again chasing innovation to ensure that the changes wrought are in line with the principles and values enshrined in our self-government and constitutional order. It is, as Russell Kirk so aptly stated, a task for conservatives "to reconcile...personal freedom with the claims of modern technology, and to try to humanize an age in which Things are in the saddle."

Rachel Bovard is senior director of policy at the Conservative Partnership Institute.

The views expressed in this article are the writer's own.