Book Review: The Computer That Ate the World

In a watershed 1909 story, published when science fiction was still known as "scientific romance," E. M. Forster reimagined the Victorian dream of an empire of universal knowledge as a future tyranny gorged on endless information. Citizens of his brave new world lived, contentedly, in bunkerlike quarters serviced by a small, personal terminal for a global, Weblike system that brought the user food, music, visual entertainments, books, articles, and correspondence both anonymous and intimate—and that did not simply satisfy the desire for those things, but intuited and perhaps even created it. The network was called "the Machine," and the story was "The Machine Stops."

Today, the futurist Jaron Lanier warns in his persuasive new manifesto, You Are Not a Gadget, the danger is less that our network of machine intelligence will fail than that it will endure—that Web culture, and its chiliastic faith in the superior wisdom of computers, will triumph. Lanier, who pioneered the immersive computer environments known as virtual reality in the early 1980s and who has helped to shape the very contours of that cocksure culture, is the first great apostate of the Internet era. The colleagues, mentors, and students he calls both "digital Maoists" and "fellow travelers" are forging an Internet future that offers the promise of radical freedom, "but that freedom," he warns, "is more for machines than people."

The Web began, Lanier recalls, as a noncommercial, utopian experiment—"in vast numbers, people did something cooperatively, solely because it was a good idea, and it was beautiful." (On early, ugly home pages, he wistfully reminds us, one would find bits of a professor's ongoing academic work alongside sketches of his pet newt.) But today, those elements have been scrubbed away to reveal a singular and anti-individualistic apparatus of cloud computing and reasoning by "hive mind." Why are we so enamored of Wikipedia, the signal achievement of the Web 2.0 era, when it has channeled so much intellectual energy into a reference project that is, at best, only as good as the book it replaces? Do we value knowledge so little that it excites us only when it is free? For Lanier, this is a design problem, propagated by software developers in the grip of a near-religious fervor. The great ecumenical promise of the early Web, he writes, "has been superseded by a different faith in the centrality of imaginary entities epitomized by the idea that the internet as a whole is coming alive and turning into a superhuman creature."

This faith, known in its most florid form as the Singularity and enormously popular in the digital world, hails the imminent rise of superpowerful artificial intelligence and is built on the idea "that the world can be understood as a computational process, with people as subprocesses." As our machines get smarter, the thinking goes, consciousness will be revealed as a simple outgrowth of computational magnitude, and computers will attain something like personhood. Soon after, the creed holds, they will grow from human to superhuman.

To technologists, this is thrilling, a vision of deliverance out of the human realm and into what Lanier calls, incisively, a "lifeless world of pure information." For Lanier, the Singularity is a cult of self-abdication, embraced only by those who aspire to inconsequentiality. In defining as progress the inability to distinguish between people and computers, are we flattering our technology, he asks, or demeaning our own intelligence?