Twilight of the PC Era?

Nicholas Carr seems an unlikely candidate for the technology world's Public Enemy No. 1. A mild-mannered 44-year-old magazine editor and freelance writer, he's spent five years laboring for the Harvard Business Review, not exactly a hotbed of bomb-throwers. But now he finds himself branded a wild-eyed heretic and a threat to the underpinnings of the entire economy. His offense? Penning a 12-page article about the state of information-technology (IT) investment in the corporate world. Why has it jacked up the aggregate blood pressure in Armonk, N.Y., Silicon Valley, Calif., and Redmond, Wash.? Consider the title: "IT Doesn't Matter."

Doesn't matter? Tech consultants have been burned at the stake, even banned from the golf course, for less. Ever since 1979, when Dan Bricklin and Bob Frankston invented the electronic spreadsheet--and changed the way people in the business world worked--the unshakable wisdom in the corridors of commerce has been that nothing could possibly matter more than IT.

As personal computers landed on every desk, the Internet connected everything and an army of mobile devices made every shard of data accessible at any time, there seemed no reason to question the equation that a buck spent on technology would result in a bankroll soon thereafter. And with Moore's Law (which holds that computer power doubles roughly every 18 months at no extra cost) still going strong, the reigning assumption is that such alchemy will only continue.
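To put that doubling in concrete terms (a rough, illustrative calculation that treats the 18-month doubling as exact, though the law is only an approximation), a decade works out to

2^{120/18} \approx 2^{6.7} \approx 100,

or roughly a hundredfold increase in computing power for the same money every ten years.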

Carr begs to differ, claiming, in essence, that the innovations of the last couple of decades have succeeded too well--at least from the point of view of those peddling software. The very ubiquity of computer power makes it unremarkable, he says, and no longer offers a strategic advantage to companies employing it. The big innovations are over, the low-hanging fruit has been picked and "the IT buildout is much closer to its end than its beginning," he writes. More and more, technology that once seemed unique has now been commoditized, and can be bargained for and bought in bulk like office furniture and paper clips. And in a suggestion that chills the soul of an industry based on first-movers and constant upgrades, he advises companies to spend less. "Follow, don't lead," he cautions.

When Carr's article appeared in May, "it was greeted with horror here [in California]," says economist W. Brian Arthur. "It was like saying that Beethoven can't play piano." And the outrage continues. Shane Robison, chief strategist of HP, tells of a meeting with corporate information officers on its advisory board last month. "They were wrapped around the axle by that article," he says. Peter Godfrey, the CTO of 3M, has a typical response: "It's utter nonsense--so far from the truth that it's laughable."

During Microsoft's analyst meeting last summer, Bill Gates was only the first of a parade of executives who, before PowerPointing their plans for an innovation-studded future, felt compelled to issue a Soviet-style repudiation of Carr-think. ("Hogwash!" cried CEO Steve Ballmer.) But Gates and the rest know that it isn't just the word processor (on an iMac!) of a lone bespectacled observer that they have to worry about. Carr's complaint is only one sign that a dangerous idea is afoot in the land, a philosophy of "good enough" when it comes to high tech. In a number of ways, the perpetual Saturday-night blowout in the tech world suddenly looks like Sunday Morning Coming Down. Here are some dispiriting signs:

Spending nose dives. The bleak economy has battered budgets everywhere, and tech buyers are getting by with less. In the post-bubble era, "there's a 'we won't get fooled again' attitude," says Gary Beach, publisher of CIO magazine. A report by a Forrester Research analyst says the dismal spending trend isn't expected to improve through 2004. And the question that really terrifies tech vendors was asked by Bill Joy, then chief scientist at Sun, at last winter's World Economic Forum in Davos: "What if the reality is that people have already bought most of the stuff they want to own?"

Trouble in PC-land. While consumers have found some reasons to buy new PCs, the corporate world has less incentive. "There's never been such a gap between the IT world and the consumer," says Ray Ozzie, CEO of Groove Networks. "In the corporate world, the bosses want to lock down the desktop so you can't install or change anything. But at home the same users can hook up cameras and music devices, and find new uses for their PCs." Meanwhile, PC makers are increasingly hedging their bets by selling more profitable consumer-electronics devices like TVs, cameras and digital jukeboxes.

High-tech dark side. No one disputes the benefits of technology. But people have learned that all too often tech comes with a downside. The biggest problem is security and disaster recovery, which that same Forrester report listed as the No. 1 priority for IT departments. It's an expensive, labor-intensive pursuit that does nothing for productivity, but does keep the systems going. In fact, our reliance on virus-prone computers is itself a scary proposition: what would be the consequence of an Internet blackout? Another dark-side plague is spam. The time spent deleting all the come-ons makes you question the value of e-mail itself (see Soaking in Spam).

Is that all there is? We've had it drilled into us that we should love the increased productivity of high tech. But technology has enabled companies to eliminate jobs or smoothly outsource them to cheap labor in distant lands. And high-tech connectivity makes us available to our employers at any time of day, at any location. "For many people, the productivity is not apparent," says Edward Tenner, author of "Why Things Bite Back." "Despite technology, they're not working shorter hours for more pay. They ask, 'What does productivity mean for me?' Certainly there's been no increase in self-reported happiness."

But while the "end of the PC era" thinking seems to have hit a nerve (and launched a healthy re-examination of where we are in relation to our digital tools), there's another, less dour way of looking at things. Every wave of innovation--the microchip, the PC explosion, the Internet boom--has built on those that came before. And every step of the way, technology touches more people, more deeply. Given that, it's a little ridiculous to insist that the big breakthroughs are behind us--if anything, it's now easier for an act of genius to change everything. For example, the widespread penetration of the Internet, along with more powerful computers with lots of space on their disk drives, set the stage for 19-year-old college freshman Shawn Fanning to shock the world with his peer-to-peer file-sharing program, Napster. In turn, the unfettered party that followed helped spread broadband even more widely, sold more computers, kick-started the digital music-player trade and, oh, almost shut down the entertainment industry. Piracy aside, the tech world is only beginning to exploit the legal uses of P2P--which in turn will create an environment for more innovations.

That's why the more contemplative people inside the industry view this moment as an opportunity to take stock, but certainly not a fadeout for dizzying technological change. "I've been hearing about the end of innovation since the 386 chip [nearly 20 years ago]," says Pat Gelsinger, Intel's chief technology officer. "But we're not about to go backwards." Nick Donofrio, senior VP of IBM, concurs. "Our point of view is that we'll see six magnitudes of improvement in the next 35 years," he says.
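A back-of-the-envelope check of Donofrio's figure (assuming "six magnitudes" means six orders of magnitude, that is, a factor of one million):

10^6 \approx 2^{20}, and 35 years (420 months) / 20 doublings \approx one doubling every 21 months,

which is close to the Moore's Law pace the industry has sustained so far.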

Even Mitch Kapor, whose Open Source Applications Foundation is built on the premise that today's high-priced software applications will one day be cheap or free, considers it absurd to imagine the end of big innovations. "Is our software so great now that it can't be radically improved?" he asks.

What are the emerging innovations? Some of them don't really sound earth-shattering, but they get CIOs excited: Web services, for instance, promise to speed the flow of information through a company and eliminate delays in the supply chain. (We'll leave the details to CIO magazine.) One new technology promises to send shock waves through corporate America and eventually alter the lives of consumers: radio-frequency identification, or RFID. The ability to put very cheap radio tags on products and track them from manufacture to the consumer--and eventually to tag all items so people can keep track of their stuff--will cause a lot of changes. (Privacy advocates are already concerned about the ability of snoopers to look inside your shopping bags.) Wal-Mart, a company that's grown to monster size by embracing technology, is demanding that its suppliers adopt RFID. Developments like these confound Carr's "Follow, don't lead" advice. As Microsoft VP Jeff Raikes says, "Who would you rather be--Wal-Mart or Sears?"

Another compelling development is search technology--the success of Google shows that a business can be built on the ability to instantly locate information. As more and more data are warehoused in cheap storage devices, software to mine them will change not only the way businesses work, but the way we learn, archive and remember.

Microsoft itself has, as you might imagine, its own master plan to keep the good times rolling. This month Bill Gates and his top tech gurus will present a new "core vision" for the company based on what he calls "seamless computing"--a holistic means of using technology that delivers "rich interfaces and new experiences" no matter where you are or what device you're using. "It's all about the power of using advanced software to bring computers into your world, rather than forcing you into theirs," says Gates. The flagship for the seamless-computing effort is the next operating system, code-named Longhorn, due to arrive in 2006.

Carr and other proponents of the twilight-era view have performed a service in puncturing some of the starry-eyed and self-serving cant of industry insiders. But the smart people who buy technology know that sooner or later, something will come along that compels them to bust their budget. Chances are that at this very moment there's some unknown geek making a breakthrough that corporations everywhere will have to understand and utilize--or else choke in the dust of discarded motherboards. And then we'll know, beyond a doubt, how much IT matters.
